This is not really a Laravel-specific issue IMO. You're most likely just running out of memory.
I'd like to ask: are you aware of Queues? If not, please take a look: http://laravel.com/docs/queues
If you are and you're still having this issue, what exactly is the problem?
Yeah, you are right. Not exactly a Laravel specific question. I'm really just curious if there are others who have experience dealing with these types of limitations and scaling a Laravel 4 application in a production environment.
I would love to get some time in with Queues. Really, I keep coming across more and more situations where they look to be a viable solution. The trick is implementing this "crazy new technology" into an existing stack/culture ;)
It seems like Queues are my next logical step. I'll be sure to look more into them.
I don't use Laravel at work, but we use a queue system a lot. We use queues for sending emails, sitemap re-caching, or anything that could block or slow down the user's perceived performance.
For heavy queries like yours, we run those tasks against our MySQL replication server to make sure we don't hurt performance on the production server.
Yes, I think queues can do the job.
I'm not sure how queues would directly solve the issue though? At the end of the day, whether you use queues or something else, you'd have to start by handling smaller sets of the 750k results. So perhaps (pseudo code, obviously):

$count = total number of records
while ($page * 10000 < $count) {
    load 10,000 records from the db
    iterate over the records and add a queue entry for each, for further processing sometime in the future
    $page++
}
Or what's the idea with the queue here?
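For what it's worth, here's a minimal sketch of that batching idea in Laravel 4 terms. The `ProcessBatchJob` class name, the `my_models` table, and the payload shape are my assumptions for illustration, not anything from the thread:

```php
<?php
// Sketch: page through the table in fixed-size batches and push each
// batch onto a queue for later processing, instead of doing all the
// work in one execution. Assumes Laravel 4's DB and Queue facades;
// "ProcessBatchJob" is a hypothetical queue job class.
$batchSize = 10000;
$count = DB::table('my_models')->count();

for ($page = 0; $page * $batchSize < $count; $page++) {
    $ids = DB::table('my_models')
        ->orderBy('id')
        ->skip($page * $batchSize)
        ->take($batchSize)
        ->lists('id'); // pushing just the ids keeps the queue payload small

    Queue::push('ProcessBatchJob', array('ids' => $ids));
}
```

Each queue worker then only ever loads one batch of rows, so no single process has to hold 750k records.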
@develpr // Yes, but do you really want to do that in a single execution? I recommended a queue thinking that @erikthedeveloper was running the query on the consumer-facing site. It looks like @erikthedeveloper is running the query in the backend, but it would be nice to split the executions nonetheless. Not to mention whoever runs the query has to wait for the script to finish.
P.S.: Oh... I misread your comment. Yes, using a queue will not solve it if you still run the query in a single execution. Like you mentioned, the query should be separated into multiple executions.
Check out the chunk method:

MyModel::chunk(500, function($results) {
    foreach ($results as $result) {
        // process a single model here
    }
});

It should simplify what you have and give you the same outcome.
If you're storing all your results in an array or building an array of that size, you're most likely running out of memory.
Consider looking into the PHP Iterators - http://us2.php.net/manual/en/spl.iterators.php
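To make the iterator point concrete, here's a plain-PHP sketch (no framework; generators require PHP 5.5+) of why yielding rows one at a time beats building one huge array. The row shape is made up:

```php
<?php
// Building a huge array keeps every row in memory at the same time:
function allRows() {
    $rows = array();
    for ($i = 0; $i < 750000; $i++) {
        $rows[] = array('id' => $i);
    }
    return $rows; // all 750k entries held simultaneously
}

// A generator yields one row at a time, so memory stays flat no matter
// how many rows you walk through:
function eachRow() {
    for ($i = 0; $i < 750000; $i++) {
        yield array('id' => $i);
    }
}

$n = 0;
foreach (eachRow() as $row) {
    // only one $row exists per iteration
    $n++;
}
echo $n; // 750000
```

The SPL iterators in that link give you the same one-at-a-time behavior on older PHP versions, just with more boilerplate.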
Thanks for the input/insight. @moon0326 & @develpr You guys are right about both:
a.) Splitting up the "job" (i.e. 750,000) into smaller chunks
b.) Utilizing queues to offload the job of processing the then manageable chunks
Really, mostly a theoretical situation, just trying to visualize how one might handle it using PHP/Laravel.
Thanks for the mention about iterators @jakefolio. I'll be sure to check that out. I was considering something along these lines yesterday...
I'm also trying to envision the relationship/difference between doing a (super rough example...)
$result = mysql_query("SELECT id, name FROM my_models");
while ($row = mysql_fetch_array($result, MYSQL_BOTH)) {
# ...
}
and
while ( MyModel::select('id', 'name')->get() ) {
# ...
}
Let me clarify, I am not necessarily doing either of these now. I am just trying to see the relationship between iterating through Eloquent/Fluent results and a traditional MySQL resource.
At the point of calling `$query_from_builder->get()`, it is my understanding that the query is executed and the results are stored as a Collection (which is basically (loosely!) a hyped-up array). If I understand correctly, that is going to require a lot more memory to store/iterate through than a traditional MySQL resource.
Forgive me if I am totally off base here. I'm really just thinking out loud and trying to better understand:
a.) What is really happening under the hood with the Eloquent -> Fluent -> Builder -> Execute Query -> Iterate Over Results/Resource.
b.) Possibly some best practices for approaching and handling large scale challenges with MySQL and/or PHP and/or Laravel.
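Regarding (a), here's a rough picture of the difference using plain PDO. This is a sketch of the general mechanics, not Laravel's actual internals, and the connection details and table are made up:

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

// ->get() behaves roughly like fetchAll(): the entire result set is
// hydrated into PHP at once, then (in Eloquent) wrapped in model objects
// and a Collection. Memory grows with the row count.
$all = $pdo->query('SELECT id, name FROM my_models')->fetchAll();

// Iterating the statement instead pulls rows into PHP one at a time.
// Note: by default the MySQL driver still buffers the full result set
// on the client side; setting MYSQL_ATTR_USE_BUFFERED_QUERY to false
// gives true streaming (with its own caveats, e.g. you can't run a
// second query on the connection until the result set is closed).
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
$stmt = $pdo->query('SELECT id, name FROM my_models');
foreach ($stmt as $row) {
    // only one $row is held here per iteration
}
```

So the old `mysql_fetch_array()` loop and the Collection approach differ mainly in where the full result set lives and how many PHP objects get built around it.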
If you're worried about the memory usage of creating Eloquent models for that many rows, I would think about why you want Eloquent objects in the first place.
I mean, if you're querying 750,000 rows, I assume it's for reporting or backup purposes? If that's the case, I would not bother with Eloquent; I would just use raw queries. For reporting and backup purposes you don't really need Eloquent models.
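A sketch of that raw-query approach (table and column names are made up). Note that `DB::select()` still loads every matching row at once, so for 750k rows you'd still want to combine it with paging, e.g. keyset pagination on the id:

```php
<?php
// Plain rows via raw SQL: each row comes back as a lightweight stdClass
// instead of a full Eloquent model. Paging by "id > last seen id" avoids
// both huge result sets and increasingly slow OFFSET scans.
$lastId = 0;
do {
    $rows = DB::select(
        'SELECT id, name FROM my_models WHERE id > ? ORDER BY id LIMIT 10000',
        array($lastId)
    );

    foreach ($rows as $row) {
        // $row->id, $row->name -- no model hydration overhead
        $lastId = $row->id;
    }
} while (count($rows) > 0);
```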