Django + Celery + RabbitMQ in a cluster: duplicate periodic_task entries in the queue
Two copies of a Django (1.3) application are running on two nodes.
Both have Celery (2.4) installed with a RabbitMQ (2.7.1) backend.
The RabbitMQ instances replicate to each other.
The application has periodic tasks. Say, every hour it needs to download and parse a document.
So at the same moment, each of the two applications adds a task to the queue for the same periodic task, and as a result the document is downloaded and parsed twice.
How can this be avoided?
There doesn't seem to be an out-of-the-box solution.
You will have to roll your own along these lines:
celery.readthedocs.org/en/latest/userguide/periodic-tasks.html
"You can also start celerybeat with celeryd by using the -B option, this is convenient if you only intend to use one worker node."
python manage.py celeryd --help
"Also run the celerybeat periodic task scheduler. NOTE:
Only one instance of celerybeat must be running at any
one time."