The 60-second timeout is an expected error when exporting or processing large amounts of data with all() or get(), because those methods load every record into memory in one go. To avoid it, Laravel provides a chunk() method that processes large datasets in smaller pieces.

Consider the following example:

    Flight::chunk(1000, function ($flights) {
        foreach ($flights as $flight) {
            // process each record
        }
    });

Official docs reference: https://laravel.com/docs/11.x/eloquent#chunking-results
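
As a side note (not covered in the reply above): if you update the rows you are iterating over inside the chunk callback, the same docs section recommends chunkById(), which paginates on the primary key so updates don't shift the result set. A minimal sketch, using an illustrative departed column:

    Flight::where('departed', false)
        ->chunkById(1000, function ($flights) {
            foreach ($flights as $flight) {
                // Mark each processed record; chunkById keeps pagination stable while rows change
                $flight->update(['departed' => true]);
            }
        });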

You can export the database from the command line with mysqldump -u your_username -p your_database_name > backup.sql, and import it the same way in one go with mysql -u your_username -p your_database_name < backup.sql.

Handling 8 lakh (800,000) products efficiently in Laravel requires optimizing both import/export logic and server configuration. Exporting such a large dataset in one go can lead to issues like timeouts or memory exhaustion. Exporting in chunks is the best solution.

Here’s how you can handle this:


1. Export Data in Chunks

Exporting data in chunks avoids loading all 8 lakh records into memory at once. You can use Laravel's chunking feature, a queue, or both.

Steps to Export in Chunks:

a. Use Laravel's chunk() method. The chunk() method processes the data in fixed-size batches, avoiding memory overload.

public function exportProducts()
{
    $filePath = storage_path('exports/products.csv');

    // Make sure the exports directory exists before opening the file
    if (! is_dir(dirname($filePath))) {
        mkdir(dirname($filePath), 0755, true);
    }

    $handle = fopen($filePath, 'w');

    // Add CSV header
    fputcsv($handle, ['ID', 'Name', 'Price', 'Stock', 'Description']);

    // Process products in chunks of 1,000 rows
    Product::chunk(1000, function ($products) use ($handle) {
        foreach ($products as $product) {
            fputcsv($handle, [
                $product->id,
                $product->name,
                $product->price,
                $product->stock,
                $product->description,
            ]);
        }
    });

    fclose($handle);

    return response()->download($filePath);
}

b. Use a Queue for Background Processing. For large exports, use Laravel queues to run the export in the background; a sketch of the full job class follows the steps below.

  1. Create an export job:

    php artisan make:job ExportProductsJob
    
  2. Handle the export logic in the job:

    public function handle()
    {
        $filePath = storage_path('exports/products.csv');
        $handle = fopen($filePath, 'w');
    
        fputcsv($handle, ['ID', 'Name', 'Price', 'Stock', 'Description']);
    
        Product::chunk(1000, function ($products) use ($handle) {
            foreach ($products as $product) {
                fputcsv($handle, [
                    $product->id,
                    $product->name,
                    $product->price,
                    $product->stock,
                    $product->description,
                ]);
            }
        });
    
        fclose($handle);
    
        // Notify the user or send an email with the download link
    }
    
  3. Dispatch the job:

    ExportProductsJob::dispatch();
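
Note that the job generated by make:job only runs in the background if it implements ShouldQueue, which the default stub already does. For completeness, a minimal sketch of how the full class might look (the handle() body is the same chunked CSV logic shown in step 2):

namespace App\Jobs;

use App\Models\Product;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ExportProductsJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle(): void
    {
        // Same chunked CSV logic as step 2: ensure storage/exports exists,
        // open the file, write the header row, Product::chunk(1000, ...), then fclose().
    }
}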
    

2. Use Laravel Excel Package

Laravel Excel is a robust package for importing and exporting large datasets efficiently.

Installation

composer require maatwebsite/excel

Export Example

  1. Create an export class:

    php artisan make:export ProductsExport --model=Product
    
  2. Define the export logic:

    namespace App\Exports;
    
    use App\Models\Product;
    use Maatwebsite\Excel\Concerns\FromQuery;
    use Maatwebsite\Excel\Concerns\Exportable;
    
    class ProductsExport implements FromQuery
    {
        use Exportable;
    
        public function query()
        {
            return Product::query();
        }
    }
    
  3. Use the export class:

    public function exportProducts()
    {
        return (new ProductsExport)->download('products.xlsx');
    }
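
One detail the example above leaves out: FromQuery on its own exports raw rows without a header row. If you also want column headings (as in the CSV example earlier), Laravel Excel provides a WithHeadings concern; a sketch, assuming the same five columns:

namespace App\Exports;

use App\Models\Product;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithHeadings;

class ProductsExport implements FromQuery, WithHeadings
{
    use Exportable;

    public function query()
    {
        // Full product query; the package iterates it while building the file
        return Product::query();
    }

    public function headings(): array
    {
        // Heading row written before the data rows
        return ['ID', 'Name', 'Price', 'Stock', 'Description'];
    }
}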
    

3. Optimize Server Configuration

If you're facing timeouts even after chunking, you might need to adjust server configurations:

  1. Increase Execution Time: Update your php.ini file:

    max_execution_time = 300
    memory_limit = 512M
    
  2. Use the CLI Instead of HTTP Requests. Run exports via Artisan commands to avoid web server timeouts (a sketch of such a command follows this list):

    php artisan export:products
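
The export:products command shown above is not a built-in command; you would create it yourself with php artisan make:command ExportProducts. A rough sketch, reusing the chunked CSV logic from section 1 (the command name and columns are illustrative):

namespace App\Console\Commands;

use App\Models\Product;
use Illuminate\Console\Command;

class ExportProducts extends Command
{
    protected $signature = 'export:products';

    protected $description = 'Export all products to a CSV file in chunks';

    public function handle(): int
    {
        $filePath = storage_path('exports/products.csv');

        // Ensure the target directory exists before writing
        if (! is_dir(dirname($filePath))) {
            mkdir(dirname($filePath), 0755, true);
        }

        $handle = fopen($filePath, 'w');
        fputcsv($handle, ['ID', 'Name', 'Price', 'Stock', 'Description']);

        // Stream the table in chunks so the CLI process stays within memory limits
        Product::chunk(1000, function ($products) use ($handle) {
            foreach ($products as $product) {
                fputcsv($handle, [
                    $product->id,
                    $product->name,
                    $product->price,
                    $product->stock,
                    $product->description,
                ]);
            }
        });

        fclose($handle);
        $this->info("Exported to {$filePath}");

        return self::SUCCESS;
    }
}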
    

4. Importing Products Efficiently

For importing, process the file in chunks to avoid memory overload:

Example: Using Laravel Excel for Import

  1. Create an import class:

    php artisan make:import ProductsImport --model=Product
    
  2. Define the import logic:

    namespace App\Imports;
    
    use App\Models\Product;
    use Maatwebsite\Excel\Concerns\ToModel;
    use Maatwebsite\Excel\Concerns\WithChunkReading;
    
    class ProductsImport implements ToModel, WithChunkReading
    {
        public function model(array $row)
        {
            return new Product([
                'name' => $row[0],
                'price' => $row[1],
                'stock' => $row[2],
                'description' => $row[3],
            ]);
        }
    
        public function chunkSize(): int
        {
            return 1000;
        }
    }
    
  3. Use the import class:

    public function importProducts(Request $request)
    {
        Excel::import(new ProductsImport, $request->file('file'));
    
        return response()->json(['message' => 'Import successful']);
    }
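
One further optimization worth noting (not in the reply above): with ToModel alone, every row becomes its own INSERT query, which is slow at 800,000 rows. Laravel Excel's WithBatchInserts concern groups the inserts; a sketch combining it with the chunk reading shown earlier:

namespace App\Imports;

use App\Models\Product;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class ProductsImport implements ToModel, WithChunkReading, WithBatchInserts
{
    public function model(array $row)
    {
        return new Product([
            'name' => $row[0],
            'price' => $row[1],
            'stock' => $row[2],
            'description' => $row[3],
        ]);
    }

    public function chunkSize(): int
    {
        return 1000; // rows read from the file per chunk
    }

    public function batchSize(): int
    {
        return 1000; // models inserted per query
    }
}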
    

Key Takeaways

  • Always process large datasets in chunks to reduce memory usage.
  • Use Laravel Excel for handling large-scale imports and exports efficiently.
  • Optimize server settings for large file processing.
  • For long-running tasks, leverage queues to prevent timeouts.

Let me know if you'd like further assistance!

You're encountering a 60-second timeout error because PHP scripts typically have a default time limit (max_execution_time) of 30 or 60 seconds, which is not sufficient for exporting 8 lakh (800,000) products in one go.

✅ Best Practice: Export in Chunks

Exporting such a large dataset in one go is not recommended. It consumes too much memory and processing time, which leads to timeouts or crashes. The optimal solution is to export the data in chunks, which allows PHP to process manageable portions without hitting time or memory limits.

🛠️ Solution Options:

1. Use Chunked Reading with Laravel Excel

If you're using a Laravel-compatible export package like Laravel Excel, you can use FromQuery with WithChunkReading:

namespace App\Exports;

use App\Models\Product;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class ProductsExport implements FromQuery, WithChunkReading
{
    public function query()
    {
        return Product::query();
    }

    public function chunkSize(): int
    {
        return 1000; // Adjust based on performance
    }
}

Then export:

return Excel::download(new ProductsExport, 'products.xlsx');

2. Queue the Export Job

Queueing offloads the heavy task and runs it in the background:

  • Run the export as a queued job (implement ShouldQueue)
  • Show a download link once the file is ready

use Illuminate\Contracts\Queue\ShouldQueue;

class ProductsExport implements FromQuery, WithChunkReading, ShouldQueue

Make sure your queue system is configured (database, redis, etc.).
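
For reference, a rough sketch of how the queued export might be kicked off; the controller method name and file path are illustrative, and the file lands on your default filesystem disk rather than being streamed to the browser:

use App\Exports\ProductsExport;
use Maatwebsite\Excel\Facades\Excel;

public function queueProductsExport()
{
    // Runs the export in background jobs; notify the user with a download link when it finishes
    Excel::queue(new ProductsExport, 'exports/products.xlsx');

    return response()->json(['message' => 'Export queued']);
}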

3. Increase Time Limits (Temporary Workaround)

If you really want to try a one-go export (not recommended):

ini_set('max_execution_time', 0); // unlimited
ini_set('memory_limit', '-1');    // unlimited memory

⚠️ Risky, especially on shared hosting; use it only for local testing.


🔚 Conclusion:

Exporting 8 lakh products should be done in chunks, ideally using Laravel Excel with chunk reading and queuing. This ensures stability and better performance without hitting limits.

Let me know if you'd like help setting it up step-by-step.
