Hello Laravel web developers! In this article, we'll see how to import large CSV files in Laravel 11. I'm going to walk you through the process of handling large CSV files with ease, using the spatie/simple-excel composer package.
This package allows you to easily read and write simple Excel and CSV files. Behind the scenes, generators are used to keep memory usage low, even when working with large files.
Laravel 11 Import Large CSV Files
In this step, we'll install laravel 11 application using the following command.
laravel new my-laravel-project
Install the spatie/simple-excel Package
Next, let's install the spatie/simple-excel package via Composer. In your terminal, navigate to your project directory and run:
composer require spatie/simple-excel
We'll use SimpleExcelReader to load the file. By default, it returns a LazyCollection – think of it as a more considerate way to handle your data without exhausting your server's memory. This means you can process the file bit by bit.
Now, we'll create a job using the following command.
php artisan make:job ImportCSV
app/Jobs/ImportCSV.php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Spatie\SimpleExcel\SimpleExcelReader;

class ImportCSV implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /**
     * Create a new job instance.
     */
    public function __construct()
    {
        //
    }

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        SimpleExcelReader::create(storage_path('app/public/products.csv'))
            ->useDelimiter(',')
            ->useHeaders(['ID', 'title', 'description'])
            ->getRows()
            ->chunk(5000)
            ->each(function ($chunk) {
                // Here we have a chunk of 5000 rows; we'll dispatch a
                // job for each chunk in the next step.
            });
    }
}
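With the job in place, you can kick off the import by dispatching it – for example, from a route. This is a minimal sketch; the route path is an assumption, not part of the original tutorial:

```php
// routes/web.php — hypothetical route that queues the import
use App\Jobs\ImportCSV;
use Illuminate\Support\Facades\Route;

Route::get('/import-products', function () {
    ImportCSV::dispatch();

    return 'Product import queued!';
});
```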
Here are the steps we'll follow to import large CSV files:
Chunking the CSV: We'll split the big file into smaller parts, giving us manageable chunks to work with.
Job Dispatching: We'll send each of these smaller chunks as separate tasks. This helps us manage the workload better and is gentler on your server.
Database Insertion: Then, we'll smoothly add each chunk to the database. It's a straightforward process that ensures everything goes in neatly.
With our LazyCollection setup, we'll break down the CSV into smaller parts. It's similar to slicing a huge sandwich into smaller, manageable pieces.
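The chunk jobs will write to a products table through a Product model. For reference, a hypothetical migration matching the columns used in this tutorial (product_id, title, description) might look like this – adjust the types to your own data:

```php
// database/migrations/xxxx_create_products_table.php (assumed schema)
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::create('products', function (Blueprint $table) {
            $table->id();
            $table->unsignedBigInteger('product_id')->unique();
            $table->string('title');
            $table->text('description');
            $table->timestamps();
        });
    }

    public function down(): void
    {
        Schema::dropIfExists('products');
    }
};
```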
php artisan make:job ImportProductChunk
For each part of the CSV file, we'll create and start a job. These jobs act like dedicated workers, each handling a portion of the data and inserting it into your database with care.
<?php

namespace App\Jobs;

use App\Models\Product;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Database\Eloquent\Model;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ImportProductChunk implements ShouldBeUnique, ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $uniqueFor = 3600;

    /**
     * Create a new job instance.
     */
    public function __construct(
        public $chunk
    ) {
        //
    }

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        $this->chunk->each(function (array $row) {
            // Match on product_id and update (or create) the record.
            // Timestamps are disabled to speed up the bulk insert.
            Model::withoutTimestamps(fn () => Product::updateOrCreate(
                ['product_id' => $row['ID']],
                [
                    'title' => $row['title'],
                    'description' => $row['description'],
                ]
            ));
        });
    }

    public function uniqueId(): string
    {
        // Derive the ID from the chunk's contents so that dispatching the
        // same chunk twice within $uniqueFor is deduplicated. A random
        // UUID here would make every job unique and defeat ShouldBeUnique.
        return md5($this->chunk->map(fn (array $row) => $row['ID'])->implode('-'));
    }
}
Ensuring Uniqueness
One important thing to remember is to use $uniqueFor and uniqueId in your jobs. It's similar to giving each worker a special ID badge: it ensures that no two workers end up doing the same job, which would waste resources.
In our ImportCSV job, we'll dispatch a job for every chunk using the each method.
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Spatie\SimpleExcel\SimpleExcelReader;

class ImportCSV implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /**
     * Create a new job instance.
     */
    public function __construct()
    {
        //
    }

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        SimpleExcelReader::create(storage_path('app/public/products.csv'))
            ->useDelimiter(',')
            ->useHeaders(['ID', 'title', 'description'])
            ->getRows()
            ->chunk(5000)
            ->each(
                fn ($chunk) => ImportProductChunk::dispatch($chunk)
            );
    }
}
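Since both jobs implement ShouldQueue, make sure your QUEUE_CONNECTION is set to a real backend (such as database or redis, not sync) and that a queue worker is running to process the dispatched chunk jobs:

```shell
# Start a worker to process the queued chunk jobs
php artisan queue:work
```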