If you're running it as a foreground task in the command line, you can use Ctrl+C or Ctrl+Z (I can't remember which; Ctrl+C seems to work for me most of the time). If that doesn't work, try killing the PHP process from the command line. If you're running your app on Linux, you can do
ps aux | grep php
then
kill -9 process-pid
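A word of caution on that last step: `kill -9` sends SIGKILL, which can't be caught, so the process dies instantly, mid-job or not. A plain `kill` sends SIGTERM first, which a well-behaved worker can handle and shut down cleanly. A small runnable sketch of the safer order of operations (using `sleep 60` as a stand-in for the actual listener process):

```shell
#!/bin/sh
# Demo of stopping a background worker gracefully. `sleep 60` stands in
# for the real `php artisan queue:listen` process here.
sleep 60 &
WORKER_PID=$!

# Plain kill sends SIGTERM, which gives a well-behaved worker the
# chance to finish its current job and exit cleanly.
kill "$WORKER_PID"
wait "$WORKER_PID" 2>/dev/null || true

# kill -0 probes whether the PID is still alive without signalling it;
# only fall back to SIGKILL (-9) if SIGTERM was ignored.
if kill -0 "$WORKER_PID" 2>/dev/null; then
    kill -9 "$WORKER_PID"
fi
echo "worker stopped"
```

In practice you'd get the PID from `ps aux | grep php` (or `pgrep -f "queue:listen"`) instead of spawning it yourself.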
@ndy40 Yes, I think that's the solution; for me it's Ctrl+C. The other issue is that I had Supervisor running in the background to keep the listener open. However, it seems that Supervisor (or something) was caching the original config parameters, which doesn't make sense, as Supervisor directly calls php artisan queue:listen.
With that said, I shut down the Supervisor script, and then running queue:listen and hitting Ctrl+C a couple of times seemed to flush the issue out of the system. It's a bit of a mystery, and I'm looking for a better way to deal with this in the future.
Thank you for the response.
Is it safe to simply kill the process? What if it's in the middle of processing a job?
"
Is it safe to simply kill the process? What if it's in the middle of processing a job?
"
It should always be safe to kill a job; jobs should be coded with that in mind. Always assume it will happen.
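For what it's worth, the same idea applies at the worker level: a graceful worker traps the termination signal, finishes the job it's on, and only then exits. A minimal sketch of that pattern in plain shell (job1/job2/job3 are placeholders, not real jobs):

```shell
#!/bin/sh
# Graceful-stop sketch: trap TERM so a stop request only sets a flag;
# the job currently in progress always runs to completion, and the
# loop exits between jobs rather than mid-job.
STOP=0
trap 'STOP=1' TERM

for JOB in job1 job2 job3; do
    if [ "$STOP" -eq 1 ]; then
        break  # stop requested: quit before starting the next job
    fi
    # ... real work for $JOB would happen here ...
    echo "finished $JOB"
done
```

Pairing that worker-side pattern with jobs that are idempotent (safe to re-run if the worker dies anyway) covers both halves of the problem.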
@etan-nitram -- sorry to hijack this thread. I saw you posted about the exact issue I am having: SQS giving you an error when using multiple queues in priority order. The issue now 404s on GitHub: https://github.com/laravel/laravel/issues/2920 . I'm wondering if you solved it. Please PM me if you wouldn't mind?
"It should always be safe to kill a job, jobs should be coded like that, always assume it will happen."
Really? What if a very complicated task is being done by a queue job and it can't simply be killed and picked back up? There has to be a safe way to do it, somehow.