I tried with DB::table() instead of the Model. I tried adding a ->take(20000), but it doesn't seem to be taken into account. I echoed the query with ->toSql() and everything looks fine (the LIMIT clause is added when I add the ->take() call).
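For reference, the attempt looked roughly like this (table and column names are placeholders, not the actual schema):

```php
// Rough sketch of what was attempted. toSql() shows the LIMIT clause,
// yet chunk() appears to ignore it and keeps paging past 20000 rows.
DB::table('messages')
    ->where('is_sent', 0)
    ->take(20000)
    ->chunk(500, function ($messages) {
        foreach ($messages as $message) {
            // process $message...
        }
    });
```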
I don't think this is because of Laravel. Is it possible that you are hitting the PHP memory limit?
It's unlikely: I don't have any error/notice/warning (with full debug turned on), and memory_get_peak_usage() reports about 50 MB (and my PHP can use up to 1 GB of RAM).
I'm currently implementing it via DB::connection()->getPdo() to avoid using chunk(), but I'd really like to understand the problem.
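A minimal sketch of that raw-PDO route, assuming a plain row-by-row read is the goal (the table and column names are illustrative):

```php
// Hedged sketch: iterate rows through the underlying PDO handle,
// bypassing chunk() entirely. Table/column names are illustrative.
$pdo = DB::connection()->getPdo();

$statement = $pdo->prepare('SELECT * FROM messages WHERE is_sent = 0');
$statement->execute();

while ($row = $statement->fetch(PDO::FETCH_OBJ)) {
    // process $row...
}
```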
Is your closure modifying the records you're selecting? I imagine chunk() uses offset and limit under the hood, so if your closure modifies records such that they no longer match the selection criteria, the offset will skip the results you'd expect in your next chunk.
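To make that failure mode concrete, here is a hedged sketch (the `processed` flag is hypothetical):

```php
// Sketch of the pitfall: the closure updates rows so they drop out of
// the WHERE clause. chunk() then asks for "page 2" with OFFSET 100, but
// the remaining matches have shifted into page 1, so rows get skipped.
DB::table('messages')
    ->where('processed', 0) // hypothetical flag
    ->chunk(100, function ($messages) {
        foreach ($messages as $message) {
            DB::table('messages')
                ->where('id', $message->id)
                ->update(['processed' => 1]); // row no longer matches the WHERE
        }
    });
```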
I have had the same issue. It seems that when your DB closes the connection, the next call to chunk() believes there are no more records and returns. Try using DB::reconnect() after each chunk.
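Something along these lines, if a dropped connection is indeed the cause (table name is a placeholder):

```php
// Sketch of the reconnect suggestion: re-establish the connection at the
// end of each page so a dropped connection doesn't end the loop early.
DB::table('messages')->where('is_sent', 0)->chunk(500, function ($messages) {
    foreach ($messages as $message) {
        // process $message...
    }

    DB::reconnect(); // guard against the server closing the connection
});
```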
Update: reconnecting didn't work. I used a "SELECT ... WHERE is_sent = 0" and after each chunk I updated the records to is_sent = 1. Because I modified the records in the loop, the chunk method didn't find any more results and returned early. Surrounding the chunk method with a while loop worked: http://stackoverflow.com/a/33798719
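For anyone landing here, a hedged sketch of that while-loop workaround, following the Stack Overflow answer linked above (table/column names are placeholders):

```php
// Sketch of the workaround: re-run chunk() until no unsent rows remain,
// so rows skipped by the shifting offset get picked up on a later pass.
while (DB::table('messages')->where('is_sent', 0)->count() > 0) {
    DB::table('messages')->where('is_sent', 0)->chunk(500, function ($messages) {
        foreach ($messages as $message) {
            // send the message, then flag it so it drops out of the query
            DB::table('messages')
                ->where('id', $message->id)
                ->update(['is_sent' => 1]);
        }
    });
}
```

On newer Laravel versions, chunkById() sidesteps this class of problem entirely by paging on the primary key instead of an offset.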