posted 10 years ago
Last updated 2 years ago.

The way we handle a similar flow (importing anywhere from one day to a few years of data) is with one master command that simply queues the smaller processes so they can be handled one by one. The actual flow is:

  1. Either run a cron every night that runs `php artisan import`, or run the command manually
  2. The ImportCommand loops through all accounts and covers either the previous day or a date range, if one is specified
  3. For each day that needs to be imported, we queue a separate job
  4. Our queue listener pulls from that queue and processes days individually
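The master command from steps 1–3 might look roughly like this. This is a hypothetical sketch, not the poster's actual code: the class names (`ImportCommand`, `ImportDay`, `Account`) and the `--from`/`--to` options are illustrative assumptions.

```php
<?php

namespace App\Console\Commands;

use App\Jobs\ImportDay;       // hypothetical per-day job
use App\Models\Account;       // hypothetical account model
use Carbon\CarbonPeriod;
use Illuminate\Console\Command;

class ImportCommand extends Command
{
    protected $signature = 'import {--from=} {--to=}';
    protected $description = 'Queue one import job per account per day';

    public function handle(): void
    {
        // Default to the previous day when no range is given.
        $from = $this->option('from') ?? now()->subDay()->toDateString();
        $to   = $this->option('to') ?? $from;

        foreach (Account::all() as $account) {
            foreach (CarbonPeriod::create($from, $to) as $day) {
                // Each account/day pair becomes its own queued job,
                // so one failure doesn't block the rest.
                ImportDay::dispatch($account, $day->toDateString());
            }
        }
    }
}
```

The key design choice is that the command does no importing itself; it only fans the work out into small, independently retryable jobs.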

That way, if one job fails, you don't lose the jobs behind it. You can also add concurrency by running more workers.
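For the scheduling and worker side, something along these lines would fit; the paths, queue name, and flags here are assumptions you'd adapt to your own setup.

```shell
# Nightly cron entry that kicks off the import at 01:00
# (crontab line, shown as a comment):
# 0 1 * * * cd /var/www/app && php artisan import >> /dev/null 2>&1

# Start several workers against the same queue to add concurrency;
# each worker pulls and processes one day-job at a time.
php artisan queue:work --queue=imports --tries=3 &
php artisan queue:work --queue=imports --tries=3 &
```

With `--tries=3`, a job that throws is retried a couple of times before being marked failed, and the jobs queued behind it keep flowing.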



rmasters (joined 9 Dec 2013)

