What is the magic of Celery using Docker?
A project with two services (Django and Flask) has reached the point where some requests require running heavy asynchronous tasks. Previously, I used Celery + Django with a Redis broker in a small project to run scheduled tasks. But...
I can't figure out how it works inside Docker containers
Let's say there are 3 containers:
- django_web
- redis
- celery_worker
django_web and celery_worker use redis as a broker. Tasks are defined in the django_web container.
Question: how does the Celery worker in the celery_worker container execute the tasks? The worker container has no access to the functions and operations. Could it be that the entire contents of the task are somehow transmitted through the broker, so it doesn't matter to the worker where the code lives? Or is only the name + signature passed?
I can't find any best practices on how to properly use Celery in containers.
Wow wow, I'll be migrating Celery to Docker myself one of these days.
But your question is a bit off topic: it's more about understanding how Celery actually works.
Only a description of which task to execute, and with what input parameters, is transmitted. No code is transmitted.
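A minimal sketch of the idea (simplified, not Celery's exact wire protocol; the task name "myapp.tasks.add" is a made-up example): the message the broker carries is just the task's name plus serialized arguments.

```python
import json

# Simplified sketch of what a broker message conceptually contains:
# the task's registered name and its arguments - never the code itself.
def make_message(task_name, args=(), kwargs=None):
    return json.dumps({
        "task": task_name,       # name the worker looks up in its own code
        "args": list(args),
        "kwargs": kwargs or {},
    })

message = make_message("myapp.tasks.add", args=(2, 3))
print(message)
```

The worker receives this message and resolves "myapp.tasks.add" against the task code it has locally.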
Take Django + Celery.
A typical usage pattern, described in the documentation and used by Celery itself to integrate with Django, is this: you create a tasks.py file in your Django apps, define functions in it, and decorate them with @task. The logic of the jobs lives in those functions.
In this case, the web application and the workers share the same code, so the django_web and celery_worker containers need to be packed with the same thing. It just runs under different scenarios: django_web processes HTTP requests, celery_worker processes incoming tasks.
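A hedged sketch of what "same image, different command" can look like in docker-compose (the project name `myproject` and port choices are placeholders, not from the question):

```yaml
# docker-compose.yml sketch - both app containers are built from the
# same Dockerfile (same code), only the command differs.
services:
  redis:
    image: redis:7

  django_web:
    build: .                  # same image as the worker
    command: gunicorn myproject.wsgi:application --bind 0.0.0.0:8000
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis

  celery_worker:
    build: .                  # identical image, different entrypoint
    command: celery -A myproject worker --loglevel=info
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis
```

Because both containers carry the same code, the worker can resolve any task name that django_web publishes.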
I recently did tasks on RQ - everything is very simple and comfortable; it doesn't have much of what Celery has, but I don't need that - python-rq.org
I recommend it; if you have questions - ask.
And yes, what's the problem with Docker? The method name and parameters are passed; since all workers have access to the same code as the publisher, the worker can easily call the desired method with the necessary arguments. Just take care of external resources, such as the database connection, before the worker instance runs the task.
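The mechanism above can be sketched in plain Python (a toy registry for illustration only, not Celery's actual implementation): the publisher sends a name + arguments, and the worker, which imports the same module, looks the name up and calls the function.

```python
import json

# Toy illustration of publisher/worker dispatch (not Celery's real code).
# Both sides import this same module, so a task name in the message is
# enough for the worker to find and call the right function.
TASK_REGISTRY = {}

def task(func):
    """Register a function under its name, like @task does conceptually."""
    TASK_REGISTRY[func.__name__] = func
    return func

@task
def add(x, y):
    return x + y

def publish(task_name, *args):
    """Publisher side: put name + args on the 'broker' (here, a JSON string)."""
    return json.dumps({"task": task_name, "args": args})

def consume(message):
    """Worker side: look the name up in the shared code and execute it."""
    payload = json.loads(message)
    return TASK_REGISTRY[payload["task"]](*payload["args"])

result = consume(publish("add", 2, 3))
print(result)  # → 5
```

Replace the JSON string with Redis and this is, roughly, why it doesn't matter which container the worker runs in, as long as it ships the same task code.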