The second option just won't work. The same controller in two different requests doesn't share state.
Session vars are probably not flushed to disk until the end of the request, which would explain why the first case isn't working. Maybe the database driver for sessions would behave differently and update session values right away.
A quick-and-dirty way would be to write v into a file periodically in your P method, and read it in your R method. But in case you can have multiple searches running simultaneously, you'll need to come up with a file-naming scheme so that different searches don't write to the same file. Say, if a user can only run a single query at a time, derive the file name from the user id.
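A minimal sketch of that quick-and-dirty approach in plain PHP, assuming one search per user; the helper names (progressPath, writeProgress, readProgress) are made up for illustration:

```php
<?php
// Derive the file name from the user id so concurrent users don't collide.
function progressPath(int $userId): string {
    return sys_get_temp_dir() . "/search_progress_{$userId}.txt";
}

// Called periodically from the long-running search method.
function writeProgress(int $userId, float $v): void {
    // LOCK_EX so a reader doesn't see a half-written value.
    file_put_contents(progressPath($userId), (string) $v, LOCK_EX);
}

// Called from the AJAX endpoint that reports progress.
function readProgress(int $userId): float {
    $path = progressPath($userId);
    return is_file($path) ? (float) file_get_contents($path) : 0.0;
}

writeProgress(42, 0.75);
echo readProgress(42); // 0.75
```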
But a proper way would be to use queues/jobs to run your search functions. Set up some sort of SearchCommand class that will handle the search logic, and a SearchParams model which will contain the search query and other parameters, plus a field for the progress value v. So when a user submits a search request, you create a new SearchParams from the search query, push a SearchCommand job with the SearchParams as data onto the queue, and return the model id back to the front-end. You can then use that model id to query for progress data.
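A rough sketch of that controller action, assuming Laravel 5.x; the class and field names (SearchParams, SearchCommand, progress) follow the suggestion above, but the method name and fields are made up, not from the thread:

```php
// Hypothetical controller action: create the model, queue the job,
// and hand the model id back so the front-end can poll with it.
public function startSearch(Request $request)
{
    $params = SearchParams::create([
        'query'    => $request->input('q'),
        'progress' => 0,
    ]);

    Queue::push(new SearchCommand($params));

    // The front-end stores this id and sends it with later AJAX polls.
    return response()->json(['pid' => $params->id]);
}
```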
Hello Xum, thanks for your answer.
I set up a job for my search process and managed to make the basic search work (as it was without the Command object).
What do you mean by "return the model id back to the front-end"? What should return the model id, and who should use/store it, and where?
What I did this far is:
$results = $search_command->handle($search_params);
I now seem to be facing the same problem as when I first came here: how do I store and access my SearchParams id from other AJAX calls?
If I use the queue as a 'real' queue (background processing), I may be able to return the SearchParams id back to my view immediately (so further AJAX calls can use it), but how can I then get the search results back to my controller/view when the job is done?
Why not put $results back into the SearchParams when the search is done? And add a method to your controller that, given the id of a SearchParams object, returns the results from it. That way you'll have AJAX calls to one method that returns progress info (and some sort of is_complete status), and an AJAX call to another that returns the JSON/view with the results.
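Those two endpoints could look something like this; a hypothetical Laravel-style sketch, where the field names (progress, is_complete, results) are assumptions, not from the thread:

```php
// Polled repeatedly while the job runs.
public function progress($id)
{
    $params = SearchParams::findOrFail($id);

    return response()->json([
        'progress'    => $params->progress,
        'is_complete' => $params->is_complete,
    ]);
}

// Called once is_complete comes back true.
public function results($id)
{
    $params = SearchParams::findOrFail($id);

    abort_unless($params->is_complete, 409); // search not finished yet

    return response()->json($params->results);
}
```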
Yes, that's what I've begun to do, but I'm having trouble setting up my queue (I've never had to use one before).
Default queue driver was 'sync' and it executed the job right away so I couldn't send the SearchParams id back to my view before the job was done.
I tried to set the driver to 'database' and created the 'jobs' and 'failed_jobs' tables, but it still seems to execute the job right after I queue it. The code is:
Queue::push(new SearchCommand($search_params)); // I also tried Queue::later with 1s from now
return response()->json([ 'pid' => $search_params->id ]);
The 'jobs' and 'failed_jobs' tables remain empty. Did I miss some settings, or am I simply not understanding the way queues work?
Again, thank you for your time Xum!
The sync driver executes jobs synchronously, thus the name. But the database driver should be asynchronous, and it needs a listener run via the artisan utility to execute the queued jobs.
Just to make sure the database driver is working, start php artisan tinker and execute Queue::push() with no arguments. It will throw an error, something like
PHP warning: Missing argument 1 for Illuminate\Queue\SyncQueue::push() in ...
Check that you have DatabaseQueue in that message, and not SyncQueue.
Otherwise, I don't have much experience with Laravel's queues. Have you read the documentation?
I have SyncQueue in the error even though I have 'default' => env('QUEUE_DRIVER', 'database'), in my /config/queue.php. Weird.
Edit: my .env file isn't updating properly; the driver was still set to 'sync' in it. I changed it manually and I'm now trying to make my job work!
Everything seems to be working now! Nonetheless, I have one last question that I just thought of: a queue being a queue, and my search process being time-consuming, if another user wants to run a search, will he have to wait for the first user's request to be done?
If the response is yes, which I think it may be, then that's gonna be a problem.
The maximum number of simultaneous searches is defined by the number of simultaneously running workers/listeners. So, starting several of them should solve your problem.
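For example, assuming Laravel 5.x and the database driver discussed above, you could start a few workers by hand (in production you'd normally keep them alive with something like Supervisor instead):

```shell
# Each worker processes one job at a time, so three workers
# means up to three searches running in parallel.
php artisan queue:work --daemon &
php artisan queue:work --daemon &
php artisan queue:work --daemon &
```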