How to deal with race condition in laravel queue with multiple workers?
There is a Job that receives an Eloquent model in its handler and does some useful work.
When the job finishes, the model is deleted.
Supervisor starts four worker processes (artisan queue:work), all consuming the same queue.
Problem: the workers race each other and pick up the same job for execution. This shows up as errors in the log saying that the model passed to the handler does not exist.
I tried both the database and Redis queue drivers; the problem is the same. With the db driver, MySQL deadlocks also appear in the log, which likewise indicates that two workers grabbed the same job.
What the hell?)))
Googled it: lots of people complain about this, and no one has found a solution.
The only crutch of a solution is to create a separate queue for each worker and implement some kind of balancer in your code to distribute same-type jobs across those queues. Something like ->onQueue('my_queue_name'.rand(1,4))
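A minimal sketch of that workaround, assuming the Supervisor setup above runs one worker per queue; the job class name ProcessModelJob is hypothetical:

```php
// Crutch workaround: spread same-type jobs over four dedicated queues,
// with exactly one worker consuming each queue, so no two workers can
// ever compete for the same job.
ProcessModelJob::dispatch($model)
    ->onQueue('my_queue_name' . rand(1, 4));
```

Each worker is then pinned to a single queue in the Supervisor config, e.g. `command=php artisan queue:work --queue=my_queue_name1` for the first worker, and so on for the other three.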
Isn't there a normal solution?
If artisan queue:work is such a crutch that it hands the same job from the queue (from the queue!!! that is its whole purpose!) to multiple workers, can it be written off as professionally unfit?
If you can't throw it out, you can add a lock at the Redis level: lock the Job at the start of processing and release it at the end. https://redis.io/topics/distlock
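A sketch of that locking idea using Laravel's atomic locks (Cache::lock, available since Laravel 5.8 on the redis, memcached, and database cache drivers); the lock key, the 60-second TTL, and the $this->model property are assumptions for illustration:

```php
use Illuminate\Support\Facades\Cache;

// Inside the job's handle(): take a short-lived lock keyed by the model
// ID. If another worker already holds it, skip this duplicate run
// instead of working on (and then deleting) the same model twice.
public function handle()
{
    $lock = Cache::lock('process-model-' . $this->model->id, 60);

    if (! $lock->get()) {
        return; // another worker is already processing this model
    }

    try {
        // ... useful work with $this->model ...
        $this->model->delete();
    } finally {
        $lock->release();
    }
}
```

This mitigates the duplicate execution, but note it does not stop the second worker from pulling the job off the queue in the first place; it only makes the duplicate run a harmless no-op.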
I use a great solution from a colleague. It uses optimistic locking: https://github.com/ph4r05/laravel-queue-database-ph4
Tested on multiple projects.