The 60-second timeout error is expected when you export or process large amounts of data using all() or get(). To avoid it, Laravel provides the chunk() method, which processes large datasets in smaller batches.
Consider the following example:
Flight::chunk(1000, function ($flights) {
    foreach ($flights as $flight) {
        // process each record here
    }
});
Official docs reference: https://laravel.com/docs/11.x/eloquent#chunking-results
You can also export the whole database in one go from the command line with mysqldump -u your_username -p your_database_name > backup.sql, and import it back with mysql -u your_username -p your_database_name < backup.sql.
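One related option from the same chunking section of the docs: if you modify rows while iterating, chunkById() paginates on the primary key and avoids skipping records. A minimal sketch, reusing the Flight example (the departed column is purely illustrative):

use App\Models\Flight;

// Chunk on the primary key; safer when the loop updates the rows it filters on
Flight::where('departed', true)->chunkById(1000, function ($flights) {
    foreach ($flights as $flight) {
        $flight->update(['departed' => false]);
    }
});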
Handling 8 lakh (800,000) products efficiently in Laravel requires optimizing both import/export logic and server configuration. Exporting such a large dataset in one go can lead to issues like timeouts or memory exhaustion. Exporting in chunks is the best solution.
Here’s how you can handle this:
Exporting data in chunks avoids loading all 8 lakh records into memory at once. You can use Laravel's chunking feature, a queue, or both.
a. Use Laravel's chunk() method
The chunk() method processes the data in chunks, avoiding memory overload.
public function exportProducts()
{
    // Make sure the storage/exports directory exists before writing to it
    $filePath = storage_path('exports/products.csv');
    $handle = fopen($filePath, 'w');

    // Add CSV header
    fputcsv($handle, ['ID', 'Name', 'Price', 'Stock', 'Description']);

    // Process products in chunks
    Product::chunk(1000, function ($products) use ($handle) {
        foreach ($products as $product) {
            fputcsv($handle, [
                $product->id,
                $product->name,
                $product->price,
                $product->stock,
                $product->description,
            ]);
        }
    });

    fclose($handle);

    return response()->download($filePath);
}
b. Use a Queue for Background Processing
For large exports, use Laravel queues to run the export process in the background.
Create an export job:
php artisan make:job ExportProductsJob
Handle the export logic in the job:
public function handle()
{
    $filePath = storage_path('exports/products.csv');
    $handle = fopen($filePath, 'w');

    fputcsv($handle, ['ID', 'Name', 'Price', 'Stock', 'Description']);

    Product::chunk(1000, function ($products) use ($handle) {
        foreach ($products as $product) {
            fputcsv($handle, [
                $product->id,
                $product->name,
                $product->price,
                $product->stock,
                $product->description,
            ]);
        }
    });

    fclose($handle);

    // Notify the user or send an email with the download link
}
Dispatch the job:
ExportProductsJob::dispatch();
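Note that the dispatched job only executes once a queue worker is running, e.g.:
php artisan queue:work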
c. Use the Laravel Excel Package
Laravel Excel is a robust package for importing and exporting large datasets efficiently. Install it:
composer require maatwebsite/excel
Create an export class:
php artisan make:export ProductsExport --model=Product
Define the export logic:
namespace App\Exports;

use App\Models\Product;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\Exportable;

class ProductsExport implements FromQuery
{
    use Exportable;

    public function query()
    {
        return Product::query();
    }
}
Use the export class:
public function exportProducts()
{
    return (new ProductsExport)->download('products.xlsx');
}
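For reference, wiring this up in routes/web.php could look like the sketch below (ProductController is a placeholder name; use whichever controller actually holds exportProducts()):

use App\Http\Controllers\ProductController;

// Hitting this route streams the generated spreadsheet as a download
Route::get('/products/export', [ProductController::class, 'exportProducts']);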
If you're facing timeouts even after chunking, you might need to adjust server configurations:
Increase Execution Time:
Update your php.ini file:
max_execution_time = 300
memory_limit = 512M
Use CLI Instead of HTTP Requests
Run exports via Artisan commands to avoid web server timeouts:
php artisan export:products
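The export:products command above is not something Laravel ships with; you would define it yourself (for example with php artisan make:command ExportProducts). A minimal sketch, reusing the chunked CSV logic from earlier:

namespace App\Console\Commands;

use App\Models\Product;
use Illuminate\Console\Command;

class ExportProducts extends Command
{
    // Called as: php artisan export:products
    protected $signature = 'export:products';

    protected $description = 'Export all products to a CSV file in chunks';

    public function handle()
    {
        $handle = fopen(storage_path('exports/products.csv'), 'w');
        fputcsv($handle, ['ID', 'Name', 'Price', 'Stock', 'Description']);

        Product::chunk(1000, function ($products) use ($handle) {
            foreach ($products as $product) {
                fputcsv($handle, [
                    $product->id,
                    $product->name,
                    $product->price,
                    $product->stock,
                    $product->description,
                ]);
            }
        });

        fclose($handle);
        $this->info('Export complete.');
    }
}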
For importing, process the file in chunks to avoid memory overload:
Create an import class:
php artisan make:import ProductsImport --model=Product
Define the import logic:
namespace App\Imports;

use App\Models\Product;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class ProductsImport implements ToModel, WithChunkReading
{
    public function model(array $row)
    {
        return new Product([
            'name' => $row[0],
            'price' => $row[1],
            'stock' => $row[2],
            'description' => $row[3],
        ]);
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
Use the import class:
public function importProducts(Request $request)
{
    Excel::import(new ProductsImport, $request->file('file'));

    return response()->json(['message' => 'Import successful']);
}
Let me know if you'd like further assistance!
You're encountering a 60-second timeout error because PHP scripts typically have a default time limit (max_execution_time) of 30 or 60 seconds, which is not sufficient for exporting 8 lakh (800,000) products in one go.
Exporting such a large dataset in one go is not recommended. It consumes too much memory and processing time, which leads to timeouts or crashes. The optimal solution is to export the data in chunks, which allows PHP to process manageable portions without hitting time or memory limits.
If you're using a Laravel-compatible export package like Laravel Excel, you can use FromQuery with WithChunkReading:
namespace App\Exports;

use App\Models\Product;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class ProductsExport implements FromQuery, WithChunkReading
{
    public function query()
    {
        return Product::query();
    }

    public function chunkSize(): int
    {
        return 1000; // Adjust based on performance
    }
}
Then export:
return Excel::download(new ProductsExport, 'products.xlsx');
Queueing offloads the heavy task and runs it in the background. Implement the ShouldQueue contract on the export class:

use Illuminate\Contracts\Queue\ShouldQueue;

class ProductsExport implements FromQuery, WithChunkReading, ShouldQueue

Make sure your queue system is configured (database, redis, etc.).
If you really want to try a one-go export (not recommended):
ini_set('max_execution_time', 0); // unlimited
ini_set('memory_limit', '-1'); // unlimited memory
⚠️ This is risky on localhost or shared hosting; use it only for testing.
Exporting 8 lakh products should be done in chunks, ideally using Laravel Excel with chunk reading and queuing. This ensures stability and better performance without hitting limits.
Let me know if you'd like help setting it up step-by-step.