Essentially I have an artisan command that processes a file into a database. While there are usually only a few files to process, sometimes there can be hundreds. Processing a few hundred is fine, but around the 400 mark I start running into out-of-memory errors (as expected, since PHP isn't designed for long-running processes).
To get around this I structured my command as below. I was half-expecting Command::call() to run the command in another process, but that isn't the case.
```php
public function fire()
{
    if ($this->option('all')) {
        foreach ($this->getFiles() as $file) {
            $this->call('command:name', ['--path' => $file]);
        }
    } elseif ($this->option('path')) {
        $this->process($this->option('path'));
    }
}
```
Does anyone know of a way to call an Artisan command in another process? Ideally I'd like to avoid something like shell_exec("artisan command:name --path=$path"), as it seems a bit hacky and depends on the current working directory being the project root - but that might be the way? I suppose I could use a queue, but that seems a bit over-the-top.
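One way to avoid raw shell_exec is Symfony's Process component, which ships with Laravel. You build the command as an array and point it at base_path('artisan'), so it doesn't depend on the working directory. A minimal sketch, reusing the command:name and getFiles() from the snippet above:

```php
use Symfony\Component\Process\Process;

public function fire()
{
    if ($this->option('all')) {
        foreach ($this->getFiles() as $file) {
            // Each file is handled in a fresh PHP process, so its
            // memory is released back to the OS when the child exits.
            $process = new Process(
                ['php', base_path('artisan'), 'command:name', '--path=' . $file]
            );
            $process->setTimeout(null); // large files may take a while
            $process->run();

            if (! $process->isSuccessful()) {
                $this->error("Failed to process {$file}: " . $process->getErrorOutput());
            }
        }
    } elseif ($this->option('path')) {
        $this->process($this->option('path'));
    }
}
```

Because the out-of-memory errors come from one long-lived process accumulating state, isolating each file in its own child process sidesteps the problem without touching the import logic.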
We handle a similar flow (importing anywhere from one day to a few years of data) with one master command that simply queues the smaller processes, so they can be worked through one by one.
That way if one job fails, you don't lose the jobs behind it. You can also add more concurrency by adding more workers.
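Laravel can queue Artisan command calls directly via the Artisan facade, so the master command stays tiny. A sketch of that approach, assuming the same option names as the question:

```php
use Illuminate\Support\Facades\Artisan;

public function fire()
{
    foreach ($this->getFiles() as $file) {
        // Each call becomes its own queued job: if one file fails,
        // the jobs behind it still run, and adding workers adds
        // concurrency.
        Artisan::queue('command:name', ['--path' => $file]);
    }
}
```

Failed jobs land on the failed-jobs table (if configured), so a bad file can be retried later without re-importing everything.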