My 2 cents,
Option 1 - Split your inserts into batches of, let's say, 10,000 rows at a time.
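Roughly like this with PDO (just a sketch; the connection details, the products table and its sku/name/price columns are placeholders, and $rows is assumed to already hold your parsed data):
<?php
// placeholder credentials and schema, adjust to your own setup
$pdo = new PDO('mysql:host=localhost;dbname=DATA-BASE-NAME;charset=utf8', 'username', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$batchSize = 10000;
foreach (array_chunk($rows, $batchSize) as $chunk) {
    // build one multi-row INSERT per chunk: INSERT INTO ... VALUES (?,?,?),(?,?,?),...
    $placeholders = implode(',', array_fill(0, count($chunk), '(?, ?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO products (sku, name, price) VALUES $placeholders");

    $params = array();
    foreach ($chunk as $row) {
        $params[] = $row['sku'];
        $params[] = $row['name'];
        $params[] = $row['price'];
    }
    $stmt->execute($params);
}
One INSERT with many VALUES tuples avoids most of the per-statement overhead; just keep an eye on max_allowed_packet if the batches get big.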
Option 2 - I sometimes loop through and echo out the insert statements for something like less than 1000 rows at a time and run them directly in MySQL (I don't think this is a good idea for 75k rows tho ...) lol
insert into table (...) values (...);
insert into table (...) values (...);
insert into table (...) values (...);
Option 3 - If you are importing from a .sql file you should really try the following; this way is much faster than using PHP:
mysql -u username -p -h localhost DATA-BASE-NAME < data.sql
Option 4 - Looks like you have an XML document. I never tried this tho, but you could take a look at http://dev.mysql.com/doc/refman/5.0/en/string-functions.html#function_load-file
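Since it is XML: MySQL 5.5+ also has a LOAD XML statement that reads an XML file straight into a table. I have not tried that one either, but roughly (products and <product> are placeholder names):
LOAD XML LOCAL INFILE '/path/to/products.xml'
    INTO TABLE products
    ROWS IDENTIFIED BY '<product>';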
But I mean, ultimately for importing big data, PHP is not a very good choice. :-(
Thank you awsp. I will try to increase the number of values inserted at once. My last try was with 25 values per insert, because I read an article saying that size gives good performance. Option 3 is not an option; you are correct, I'm importing data from XML. Option 4 also doesn't seem very helpful.
Any other ideas?
I tested it now with 1000 rows per insert, but I get the same slowdown. When I stop the import and start the script again, the first few thousand inserts go fast and then it slows down again, no matter how many products are already in the table before starting the script. So indexes should not be the problem.
So what could I do? One idea is to stop after 10,000 inserts and then run the script again. But I want to set this up as a cronjob, and I only have a webspace without SSH where the shell_exec function is disabled, so I can't use that to start and stop scripts in a loop.
Any other idea?
Looks like you are doing the inserts directly on your web host(?)
Maybe you should first import the XML file on your local machine, export it to a .sql file, and then use option 3 I mentioned before to insert it into your web host's database.
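For the export step, mysqldump on your local machine gives you the .sql file (database and table names are placeholders):
mysqldump -u username -p LOCAL-DB-NAME products > data.sql
Then load data.sql on the web host with the mysql command from option 3.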
Like I wrote, I want to set up a cronjob that runs every day to import the products from a webservice. So I will at least need my own webserver with SSH, because it does not end with 75,000 products... it could easily be over 1,000,000 products.
I just worked on an importer which had to import data from 8 MSSQL tables with 180k-450k rows each. At the moment we can do the full import in about 20 minutes.
What we have learned:
We used Zend but it shouldn't make a difference.
@Kompas: Thanks for your advice. :) Can you tell me which server (dedicated or virtual, how much RAM, number of cores, etc.) you are using for your application?
And don't forget to disable the query log (\DB::disableQueryLog();).
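Roughly like this in Laravel 4 (a sketch, assuming the parsed product rows are already in a $rows array of column => value arrays):
// keep the in-memory query log from growing during a long import
DB::disableQueryLog();

DB::transaction(function () use ($rows) {
    foreach (array_chunk($rows, 1000) as $chunk) {
        // one multi-row INSERT per chunk instead of one query per row
        DB::table('products')->insert($chunk);
    }
});
Wrapping the whole batch in one transaction also saves a commit per statement.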
You should see if switching to the mysqli driver speeds things up. It is supposed to be faster, but I have never benchmarked it.