Hello.
I have a problem that I don't know how to solve. I have a job that needs to make calls to an api.
I have 4 different urls of the same api.
Example:
http://myapi1.com/proccess
http://myapi2.com/proccess
http://myapi3.com/proccess
http://myapi4.com/proccess
So I use Supervisor to run the queue worker in 4 processes.
How can I make each process use a different api url for the jobs?
I want some way that when a job starts, it checks which URL is not in use and picks that one for the job.
That way, when I queue many jobs, they are divided between the four processes, but each process uses a different API URL.
So there will never be two or more jobs running at the same time against the same API URL.
Hey!
You could use a cache or a database column combined with the failed() method in your jobs for this. You'd write the URL of the current job into the cache or a database table together with the id of the job that's currently using it. When the job is done, you remove the job id from that column. If the job fails, you remove the job id in failed(). You'd also set $tries and $backoff on the job so it can wait and retry if it fails.
If no API endpoint is available (all of them have a job id set), let the job fail and make sure its job id is cleared from the endpoint column. The retry/backoff will kick in and the job will be retried after the specified delay, for the specified number of attempts.
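A minimal sketch of that idea, assuming a hypothetical `api_endpoints` table with `url` and `job_id` columns — the table and column names, the payload, and the `reservationKey` property are all illustrative, not something from this thread:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Str;

class CallApiJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public $tries = 5;    // retry up to 5 times
    public $backoff = 30; // wait 30 seconds between attempts

    public string $reservationKey;

    public function __construct()
    {
        // Serialized with the job payload, so the same key is
        // available again in failed().
        $this->reservationKey = (string) Str::uuid();
    }

    public function handle(): void
    {
        // Try to find an endpoint no other job has reserved.
        $endpoint = DB::table('api_endpoints')->whereNull('job_id')->first();

        if ($endpoint === null) {
            // All URLs are busy: put the job back on the queue
            // and try again after the backoff delay.
            $this->release($this->backoff);
            return;
        }

        // Reserve the endpoint for this job.
        DB::table('api_endpoints')
            ->where('url', $endpoint->url)
            ->update(['job_id' => $this->reservationKey]);

        try {
            Http::post($endpoint->url, [/* payload */]);
        } finally {
            // Free the endpoint whether the call succeeded or not.
            DB::table('api_endpoints')
                ->where('job_id', $this->reservationKey)
                ->update(['job_id' => null]);
        }
    }

    public function failed(\Throwable $exception): void
    {
        // Safety net: clear any reservation left behind by this job.
        DB::table('api_endpoints')
            ->where('job_id', $this->reservationKey)
            ->update(['job_id' => null]);
    }
}
```

Note that the select-then-update in handle() is not atomic on its own, so this is only a starting point, not a finished solution.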
To summarize:
- Keep a table (or cache entry) per API URL with the id of the job currently using it.
- When a job starts, pick a URL with no job id and write its own id there.
- When the job finishes or fails, clear the job id so the URL is free again.
- If no URL is free, release/fail the job and let retry/backoff pick it up later.
I hope this is somehow understandable. Maybe someone else has an easier way of doing this or more experience with queues than I have.
This is a good idea. With this I have something to work on. I really had no clue how to do it.
Hello.
I implemented this idea and it worked, but there are some problems with it.
The main problem is that when I start the four queue:work processes and they begin processing, the first four jobs start at the same time, hit the DB at the same time, and all choose the same row.
This happens because no job has time to update the DB and reserve the item for itself before the others read it.
Do you have any idea how to fix this? I need to prevent jobs from getting the same item from the DB when they access it at the same time.
Hey!
You could have a look at database locks: https://laravel.com/docs/master/queries#pessimistic-locking — a pessimistic lock keeps the row locked against other readers/writers while the job is claiming it, so two workers can't select the same row. Once the job has written its id into the row and the transaction commits, the lock is gone; when the job finishes, it removes its id again and the row is free and claimable by the next job.
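A minimal sketch of claiming an endpoint under a pessimistic lock, again assuming an illustrative `api_endpoints` table with `url` and `job_id` columns. lockForUpdate() must run inside a transaction; the matched rows stay locked until the transaction commits, so two workers cannot pick the same URL:

```php
<?php

use Illuminate\Support\Facades\DB;

/**
 * Atomically reserve a free API URL for the given job key.
 * Returns the reserved URL, or null if all URLs are busy.
 */
function reserveEndpoint(string $jobKey): ?string
{
    return DB::transaction(function () use ($jobKey) {
        // SELECT ... FOR UPDATE: other workers block on these rows
        // until this transaction finishes.
        $endpoint = DB::table('api_endpoints')
            ->whereNull('job_id')
            ->lockForUpdate()
            ->first();

        if ($endpoint === null) {
            return null; // all four URLs are in use
        }

        // Write the claim while the row is still locked.
        DB::table('api_endpoints')
            ->where('url', $endpoint->url)
            ->update(['job_id' => $jobKey]);

        return $endpoint->url;
    });
}
```

In the job's handle() you'd call this with some unique key for the job run (for example a UUID generated in the job's constructor), and release() the job with a delay when it returns null.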
Is this somehow understandable?