Laravel Queues
Last updated 3 months ago.

Hey!

You could use a cache or database table combined with the failed() method in your jobs for this. You'd write the URL of each API endpoint into the cache or database table together with the id of the job that's currently using it. When the job is done, remove the job id from that record; if the job fails, remove it in the failed() method as well. You'd also want to configure retries and a backoff for the jobs so they can wait and retry when they fail.

If no API endpoint is available (they all have a job id set), let the job fail without claiming anything. The retry/backoff will then kick in and the job will be retried after the specified delay, for the specified number of attempts.
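As a minimal sketch of the retry/backoff part, a job can declare a $tries property and a backoff() method (the job name here is hypothetical):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

// Hypothetical job name, for illustration only.
class CallApiJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    // Attempt the job up to 5 times before it is marked as failed.
    public $tries = 5;

    // Wait 10, 30, then 60 seconds between the retries.
    public function backoff(): array
    {
        return [10, 30, 60];
    }
}
```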

To summarize:

  • Create a new database table or cache item that maps each API URL to a job id.
  • In the job's handle() method, query for an available API URL (one with no job id set), claim it, and store it in a class property (you'll need it again in failed()).
  • In the job's failed() method, remove the job id from the API URL, releasing the endpoint so another job can use it.
  • Add a $tries property and a backoff() method (or one of the other ways of making the job retry itself).
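The steps above could be sketched roughly like this. Everything here is an assumption for illustration: the job name, and an `api_endpoints` table with `url` and `job_id` columns.

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\DB;

// Hypothetical job; assumes an `api_endpoints` table
// with `url` and `job_id` columns.
class CallApiJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public $tries = 5;

    // URL of the endpoint this job has reserved, if any.
    protected ?string $apiUrl = null;

    public function handle(): void
    {
        // Find an endpoint that no other job has claimed.
        $endpoint = DB::table('api_endpoints')->whereNull('job_id')->first();

        if ($endpoint === null) {
            // No endpoint free: fail so the retry/backoff kicks in.
            throw new \RuntimeException('No free API endpoint.');
        }

        // Claim the endpoint by writing this job's id next to it.
        DB::table('api_endpoints')
            ->where('id', $endpoint->id)
            ->update(['job_id' => $this->job->getJobId()]);

        $this->apiUrl = $endpoint->url;

        try {
            // ... call the API at $this->apiUrl ...
        } finally {
            // Release the endpoint when the work is done.
            DB::table('api_endpoints')
                ->where('url', $this->apiUrl)
                ->update(['job_id' => null]);
        }
    }

    public function failed(\Throwable $e): void
    {
        // Release the endpoint if the job failed while holding it.
        if ($this->apiUrl !== null) {
            DB::table('api_endpoints')
                ->where('url', $this->apiUrl)
                ->update(['job_id' => null]);
        }
    }

    public function backoff(): array
    {
        return [10, 30, 60];
    }
}
```

Note that the read-then-update in handle() is not atomic on its own; with several workers running, two jobs can still read the same free row at the same moment.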

I hope this is somewhat understandable. Maybe someone else has an easier way of doing this, or more experience with queues than I have.

Solution selected by @driesvints

This is a good idea. Now I have something to work with; I really had no idea how to approach it.


Hello.

I implemented this idea and it works, but there are some problems with it.

The main problem is that when I start four queue:work processes and they begin processing jobs, the first four jobs start at the same time, query the database at the same time, and all pick the same row.

This happens because no job has time to update the row and reserve the endpoint for itself before the others read it.

Do you have any idea how to fix this? I need to prevent jobs from getting the same row when they query the database at the same time.


Hey!

You could have a look at database locks: https://laravel.com/docs/master/queries#pessimistic-locking Pessimistic locking keeps the row locked against other readers while the job is claiming it. You'd take the lock, write the job's id into the row inside the same transaction, and later remove the job id when the job is done. Then the row is free and lockable again.
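A minimal sketch of claiming a free endpoint with a pessimistic lock, again assuming the hypothetical `api_endpoints` table with `url` and `job_id` columns, and a `$jobId` variable holding the current job's id:

```php
<?php

use Illuminate\Support\Facades\DB;

// Atomically claim a free endpoint, or get null if none is free.
$endpoint = DB::transaction(function () use ($jobId) {
    // lockForUpdate() takes a row-level lock, so two workers that
    // query at the same moment cannot both read the same free row:
    // the second one blocks until the first transaction commits.
    $row = DB::table('api_endpoints')
        ->whereNull('job_id')
        ->lockForUpdate()
        ->first();

    if ($row !== null) {
        // Claim it inside the same transaction, while the lock is held.
        DB::table('api_endpoints')
            ->where('id', $row->id)
            ->update(['job_id' => $jobId]);
    }

    return $row;
});
```

Because the select and the update happen inside one transaction with the lock held, the second worker will see the job_id already set when its own query finally runs, and will move on to the next free row (or fail and retry).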

Does that make sense?

