
This worked!

DB::connection()->disableQueryLog();

But what do you think about my code for importing CSV files that way?


hello?


Not on topic, but if you are the woman in the picture, you are really beautiful.


eriktisme said:

Not on topic, but if you are the woman in the picture, you are really beautiful.

Smooth man. Really smooth. Help her or him (it's irrelevant, this is a coding forum, not pof) with their issue and win her or his love!


merkabaphi said:

But what do you think about my code for importing CSV files that way?

I prefer the query builder or raw queries over Eloquent when working with large amounts of data.
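
For example, something like this with the query builder (a rough sketch; I'm guessing the table and column names from the other replies in this thread):

foreach ($products as $key => $product)
{
    // One plain SELECT per row, then an insert or update through the query builder,
    // skipping all of the Eloquent model overhead.
    $row = DB::table('products')->where('sku', $product['sku'])->first();

    if ($row === null)
    {
        DB::table('products')->insert([
            'sku'       => $product['sku'],
            'price'     => $product['productPrice'],
            'old_price' => $product['productOldPrice'],
        ]);
    }
    else
    {
        DB::table('products')
            ->where('sku', $product['sku'])
            ->update([
                'price'     => $product['productPrice'],
                'old_price' => $product['productOldPrice'],
            ]);
    }
}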


If it's a one-off thing, you can use Sequel Pro / phpMyAdmin (or another interface, depending on which database you are using) to import it into the database directly.

But if not, I would go with what mgsmus said and just query the database directly.

Also, I think you are reassigning the $product variable on each loop iteration (array -> Eloquent model)? And you are doing an unnecessary Product::find in the else branch, because the product should already have been found if the 'if condition' failed.
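
Something like this would avoid both problems (just a sketch, guessing at your field names):

foreach ($products as $key => $row)
{
    // Keep the CSV row and the Eloquent model in separate variables,
    // and only look the product up once.
    $product = Product::where('sku', $row['sku'])->first();

    if ($product === null)
    {
        $product = new Product;
        $product->sku = $row['sku'];
    }

    // No second Product::find needed here - we already have the model.
    $product->price     = $row['productPrice'];
    $product->old_price = $row['productOldPrice'];
    $product->save();
}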


eriktisme said:

Not on topic, but if you are the woman in the picture, you are really beautiful.

Keep it professional, please.


Try goodby/csv.
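
Quick sketch of how it's used (the file name and what you do with each row are placeholders):

use Goodby\CSV\Import\Standard\Lexer;
use Goodby\CSV\Import\Standard\Interpreter;
use Goodby\CSV\Import\Standard\LexerConfig;

$lexer = new Lexer(new LexerConfig());

$interpreter = new Interpreter();
$interpreter->addObserver(function (array $row) {
    // Called once per CSV line, so the whole file never sits in memory.
    // Insert or update one product here.
});

$lexer->parse('products.csv', $interpreter);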


bencorlett said:

Try goodby/csv.

Thanks for this!


Why don't you run the import as a queued job? And if the import only runs locally, you can write your own Artisan command and trigger it from the console instead of hitting the web server (see the sketch below the links).

http://laravel.com/docs/queues

http://laravel.com/docs/commands
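
A bare-bones command could look roughly like this (a sketch only, Laravel 4 style; the class name, command name and CSV handling are made up):

use Illuminate\Console\Command;

class ImportProducts extends Command
{
    // Name used on the command line: php artisan products:import
    protected $name = 'products:import';

    protected $description = 'Import products from a CSV file.';

    public function fire()
    {
        // Read the CSV row by row (e.g. with fgetcsv()) and insert/update each product here.

        $this->info('Import finished.');
    }
}

Register it (for example with Artisan::add(new ImportProducts);) and you can run it from the console, or push the same logic onto a queue.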


2 things that will help:

  1. Save memory by loading the CSV file one row at a time instead of loading the whole thing into one big array in memory. goodby/csv would work; personally I like ddeboer/data-import.

  2. Speed up the database by using prepared statements. Laravel will not help you here, but it's easy enough to use PDO directly for something this simple (unless your Product model is doing something complicated).

// Prepare the statements once, outside the loop, so MySQL only has to parse them once.
$select_stmt = DB::getPdo()->prepare('SELECT id FROM products WHERE sku = ?');
$insert_stmt = DB::getPdo()->prepare('INSERT INTO products(sku, price, old_price) VALUES(?, ?, ?)');
$update_stmt = DB::getPdo()->prepare('UPDATE products SET price = ?, old_price = ? WHERE id = ?');

foreach ($products as $key => $product)
{
    // Look the product up by SKU; fetchColumn() returns false when no row matches.
    $select_stmt->execute([ $product['sku'] ]);
    $id = $select_stmt->fetchColumn();

    if ($id === false)
    {
        // New product: insert it (store the SKU too, otherwise later lookups will never find it).
        $insert_stmt->execute([ $product['sku'], $product['productPrice'], $product['productOldPrice'] ]);
    }
    else
    {
        // Existing product: just update its prices.
        $update_stmt->execute([ $product['productPrice'], $product['productOldPrice'], $id ]);
    }
}

Around 100 times faster is to use MySQL's "LOAD DATA INFILE" :)
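
Roughly like this (a sketch only; it assumes the MySQL user is allowed to use LOCAL INFILE, PDO has local infile enabled, and the column order matches the file):

$sql = "LOAD DATA LOCAL INFILE '/path/to/products.csv'
        INTO TABLE products
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
        (sku, price, old_price)";

// Run it straight through the underlying PDO connection.
DB::connection()->getPdo()->exec($sql);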


This should help. - http://csv.thephpleague.com/
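
A small sketch with league/csv (this is the v9 API; method names differ a bit in older releases):

use League\Csv\Reader;

$csv = Reader::createFromPath('products.csv', 'r');
$csv->setHeaderOffset(0); // treat the first line as column headers

foreach ($csv->getRecords() as $record) {
    // $record is an associative array for a single row; save it here.
}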


merkabaphi said:

It takes too long and the memory of the server gets full. Does anyone know how to solve this? Thanks!

I believe this could be a culprit contributing to your out-of-memory issue:

$products = csvToArray('products.csv');

It appears this method parses the whole CSV file into an array. That means the entire file is read and loaded at once, which, if it is a large file, could consume too much memory and cause your out-of-memory issue. It would be better to process each row of the file as it is being read. See the PHP docs to get an idea of how you might do that (a minimal sketch is at the end of this post): http://php.net/fgetcsv

Good luck!

John Madson

Edit:

Looks like https://github.com/goodby/csv will do this for you, which was already recommended by somebody else. I second that recommendation, as it will execute a callback on each line of the CSV file, preventing the issue I described above.
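
For reference, the fgetcsv() version I had in mind is roughly this (file name and column order are assumed):

$handle = fopen('products.csv', 'r');

while (($row = fgetcsv($handle)) !== false)
{
    list($sku, $price, $oldPrice) = $row;

    // Insert or update this one product, then move on to the next row,
    // so the whole file is never loaded into memory at once.
}

fclose($handle);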


Thanks for this! This works!!


Thank you all for the info! This was a great help.


Another important thing is to use database transactions if the MySQL engine is InnoDB. This will speed up the process a lot!

DB::transaction(function () use($products) {
     foreach($products as $key => $product)
     {
        // save here.
     }
});
