Tags: Database, Eloquent, Laravel

Hi Laravel community,

I’m currently working on a Laravel project where I need to fetch around 20,000 rows from a MySQL database. The data will later be processed for a report generation feature. I want to make sure I implement this as efficiently as possible to avoid performance issues.

Here are a few details:

The table has around 100,000+ rows, and the query includes some joins and conditions. The fetched data will be used to generate a CSV report. I’ve come across a few approaches, such as chunk(), cursor(), and paginate(), but I’m unsure which is the best fit in this scenario. I’d appreciate advice on:

- The best way to handle this volume of data in Laravel.
- Whether I should use Eloquent, the Query Builder, or raw SQL for this purpose.
- Any tips to optimize the query or the fetching process.

Thank you in advance! Looking forward to your suggestions.

moderator

Hello @abduruntime,

For around 20,000 rows, chunk() or cursor() are both good options; which one fits best depends on your query.
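Roughly, the difference looks like this (a minimal sketch assuming a hypothetical Order model and columns — adjust the conditions and fields to your own schema):

```php
<?php

use App\Models\Order; // hypothetical model, just for illustration

// chunkById(): several queries, one batch of models in memory at a time —
// a good fit for writing a CSV file row by row.
$out = fopen(storage_path('app/report.csv'), 'w');
fputcsv($out, ['id', 'customer', 'total']);

Order::where('created_at', '>=', now()->subYear())
    ->chunkById(1000, function ($orders) use ($out) {
        foreach ($orders as $order) {
            fputcsv($out, [$order->id, $order->customer_name, $order->total]);
        }
    });

fclose($out);

// cursor() / lazy(): a single query, models hydrated one by one via a
// generator — lowest PHP memory, but note that the MySQL PDO driver
// buffers the full result set client-side unless you enable unbuffered
// queries, and the connection stays open for the whole iteration.
foreach (Order::where('created_at', '>=', now()->subYear())->cursor() as $order) {
    // process $order
}
```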

To be honest, a join is often quite inefficient for this kind of workload, and it can be worth checking whether you can rewrite it into a query that doesn't use joins.
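For example, if the join is only there to filter on the related table, you can often replace it with a whereIn subquery or an eager load (again just a sketch, with hypothetical orders/users tables and a user relationship):

```php
<?php

use App\Models\Order;                 // hypothetical model
use Illuminate\Support\Facades\DB;

// Join version: filters orders on a column of the related users table.
$rows = DB::table('orders')
    ->join('users', 'users.id', '=', 'orders.user_id')
    ->where('users.country', 'NL')
    ->select('orders.*')
    ->get();

// Without the join: filter through a subquery...
$rows = Order::whereIn('user_id', function ($query) {
    $query->select('id')->from('users')->where('country', 'NL');
})->get();

// ...and if you need the related data, eager load it in a second query
// instead of widening every row with joined columns.
$orders = Order::with('user:id,name,country')->get();
```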

As for Eloquent, Query Builder, or raw SQL: that is also the order I'd advise. :)

If it works without problems, use Eloquent. Otherwise, try the Query Builder. If neither is an option, fall back to a raw query.

What I see in my own projects is usually a mix of Eloquent and the Query Builder.
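A common pattern for report queries is to build the query with Eloquent (so you keep your scopes and relationships) and then drop down to the Query Builder with toBase(), so the rows come back as plain objects instead of 20,000 hydrated models. Again, only a sketch with a hypothetical Order model:

```php
<?php

use App\Models\Order; // hypothetical model

$rows = Order::query()
    ->where('created_at', '>=', now()->subMonth())
    ->select(['id', 'customer_name', 'total'])
    ->toBase()            // hand the built query to the Query Builder:
    ->orderBy('id')       // results come back as plain stdClass rows,
    ->cursor();           // which is cheaper than hydrating full models

foreach ($rows as $row) {
    echo $row->customer_name . ': ' . $row->total . PHP_EOL;
}
```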

