I have a strange thing happening where I am occasionally getting duplicate key errors - most of the time it is in a database session call, but occasionally it is elsewhere. I am guessing this is because I have some pages that make rapid AJAX calls. I also have a query pair that updates like:
\App\model::where('user_id', $user_id)->delete();
\App\model::insert([
    [
        'user_id' => $user_id,
        'value' => 1,
    ],
    // ...
]);
where there is a unique constraint on user_id and value - but every so often I get a duplicate key error on this as well, again probably a race condition, maybe brought on by the database taking longer to perform the insert. I'm guessing I have two threads interleaving like:
Thread 1: delete → Thread 2: delete → Thread 1: insert → Thread 2: insert → error, key already exists
Normally I would use REPLACE instead of INSERT (I'm doing the delete/insert to try to mimic the REPLACE behaviour). I don't want to use raw queries because the list of inserts can be fairly long.
In both of these cases, is there a best-practice solution that I'm not thinking of?
I've successfully resolved this by wrapping the pair of queries in a transaction. Not as optimal as I had hoped, but at least it has appeared to stop the race condition. It does appear to have increased my server load a bit, but I'm still investigating whether it was this or something else.
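Roughly what the wrapped version looks like - a minimal sketch, assuming the same model as above and a $rows variable holding the already-built insert array (names are illustrative):

use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($user_id, $rows) {
    // Delete and re-insert atomically so another request can't interleave.
    \App\model::where('user_id', $user_id)->delete();
    \App\model::insert($rows);
});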
Just as a follow-up, using this method I am now getting deadlocks, so it did not solve the issue.
I have created a package that provides a trait you can apply to your models to get REPLACE and INSERT IGNORE queries. It's really just a macro, but it saves you from having to build a raw query and its bindings every time you need one.
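The core idea is roughly this - a simplified sketch rather than the package's actual code, assuming MySQL's REPLACE INTO syntax and a hypothetical Replaceable trait:

use Illuminate\Support\Facades\DB;

trait Replaceable
{
    // Build and run a REPLACE INTO statement for an array of rows.
    // Assumes every row has the same keys.
    public static function replace(array $rows): bool
    {
        $table = (new static)->getTable();
        $columns = array_keys(reset($rows));

        // One "(?, ?, ...)" placeholder group per row.
        $placeholders = implode(', ', array_fill(
            0,
            count($rows),
            '(' . implode(', ', array_fill(0, count($columns), '?')) . ')'
        ));

        $bindings = [];
        foreach ($rows as $row) {
            foreach ($columns as $column) {
                $bindings[] = $row[$column];
            }
        }

        return DB::statement(sprintf(
            'REPLACE INTO %s (%s) VALUES %s',
            $table,
            implode(', ', $columns),
            $placeholders
        ), $bindings);
    }
}

With the trait applied, usage is just \App\model::replace($rows); and the placeholders and bindings are handled for you.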