Can Celery write the result to a database on another host?
The situation is this. There is a server (server1) with Django, the MySQL DBMS, and Celery (+django-celery). Several workers also run on this server.
One task (task0) starts on a schedule, makes a query via the Django ORM, receives data, splits it into separate parts, and sends them for processing (task2) to other workers through the RabbitMQ broker. Those workers take the data from the queue, perform certain manipulations on it, and save the resulting objects to the database. Everything works great on one server.
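A minimal sketch of that pipeline, for context. Only task0, task2, and the RabbitMQ broker come from the description above; the models (Record, Result), the schedule, the chunk size, and the "manipulation" are all hypothetical:

# sketch only: Record/Result models, schedule, and chunk size are assumed
from celery.task import task, periodic_task
from celery.schedules import crontab
from myapp.models import Record, Result  # hypothetical models

@periodic_task(run_every=crontab(minute=0))  # assumed schedule: hourly
def task0():
    # query via the Django ORM and split the result into chunks
    ids = list(Record.objects.values_list('id', flat=True))
    for i in range(0, len(ids), 100):
        task2.delay(ids[i:i + 100])  # dispatched through the RabbitMQ broker

@task
def task2(id_chunk):
    # runs on any worker: process each record and save a result object
    for record in Record.objects.filter(id__in=id_chunk):
        Result.objects.create(source=record, value=record.value * 2)  # placeholder manipulation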
The question is the following: if I need to run the workers on another machine (server2), they will essentially need access to the database that lives on server1. Do I understand correctly that I would have to clone the entire project onto server2, change the database connection IP in settings.py, and start the workers there? That doesn't look very appealing; maybe there is some other way?
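For what it's worth, the per-machine change is usually confined to settings.py: point the database (and the broker) at server1. A minimal sketch, with all addresses, names, and passwords hypothetical:

# settings.py on server2 -- addresses, names, and passwords are hypothetical
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'project_db',
        'USER': 'django_app',
        'PASSWORD': 'secret',
        'HOST': '192.168.0.10',  # server1, where MySQL runs
        'PORT': '3306',
    }
}
BROKER_URL = 'amqp://guest:guest@192.168.0.10:5672//'  # RabbitMQ, also on server1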
Read this:
If you want to store task results in the Django database then you still need to install the django-celery library for that (alternatively you can use the SQLAlchemy result backend).
# in settings.py: store task results via the Django ORM,
# and keep the beat schedule in the database as well
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
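With this database backend, results are written through the Django ORM, so they end up in whatever database DATABASES points to, even one on another host. A minimal sketch of the surrounding django-celery wiring these two settings assume (details vary by version):

# settings.py -- the django-celery wiring around the settings above
import djcelery
djcelery.setup_loader()  # let Celery pick up its config from Django settings

INSTALLED_APPS += ('djcelery',)  # registers the result and schedule models
# run `python manage.py syncdb` afterwards so their tables exist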
I recently deployed Celery myself. So far everything runs on one server for me too.
Everything will work fine. Ideally, put the database on a separate machine as well, and connect to it from both servers.
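One caveat worth checking first: MySQL often listens only on localhost by default, so remote access has to be enabled on the database machine. A quick connectivity check to run from either app server; the host and credentials are hypothetical, and this assumes the MySQLdb driver that Django's MySQL backend uses:

# quick connectivity check; host, user, password, and db are hypothetical
import MySQLdb  # the driver behind django.db.backends.mysql

conn = MySQLdb.connect(
    host='192.168.0.20',  # hypothetical dedicated database machine
    user='django_app',
    passwd='secret',
    db='project_db',
)  # raises OperationalError if the host refuses remote connections
cursor = conn.cursor()
cursor.execute('SELECT 1')
conn.close()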