Django
Leo, 2015-04-20 23:13:22

How to get rid of memory leak in Django+Celery bundle?

Django 1.7.7, Celery 3.1.17, supervisor, rabbitmq.
The problem is that the workers perform tasks, but the RAM is not freed afterwards.
I tried:

CELERYD_TASK_TIME_LIMIT = 600
CELERY_TASK_RESULT_EXPIRES = 360

but the situation does not change.
user    1066  0.0  7.9 214576 41716 ?        S    16:09   0:10 project/manage.py celery beat -s project/celerybeat-schedule
user    1067  0.0  8.2 138876 43032 ?        S    16:09   0:17 project/manage.py celeryd --concurrency=1
user    1068  0.0  7.0 258700 37012 ?        Sl   16:09   0:11 project/manage.py celerycam --frequency=10.0
user    1090  0.0 13.0 250740 68220 ?        S    16:09   0:13 project/manage.py celeryd --concurrency=1

Over time they consume more and more RAM. The tasks are very simple: sending mail and working with the database.
How can this problem be solved?
Thank you!


2 answer(s)
Yuri Shikanov, 2015-04-20
@dizballanze

You can set the parameter CELERYD_MAX_TASKS_PER_CHILD so that each worker process is recreated after a certain number of completed tasks; its memory is then released. But you clearly have a memory leak somewhere in your code, otherwise the workers would not eat up so much memory over time.
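A minimal settings sketch of this approach (the setting name is from Celery 3.x; the threshold of 100 is an illustrative value, not a recommendation):

```python
# settings.py
# Recycle each worker process after it has completed 100 tasks;
# the replacement child starts fresh, releasing any leaked memory.
CELERYD_MAX_TASKS_PER_CHILD = 100
```

The same limit can also be passed on the worker command line, e.g. `manage.py celeryd --concurrency=1 --maxtasksperchild=100`.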

sim3x, 2015-04-21
@sim3x

www.stackoverflow.com/a/22844509/1346222
or
stackoverflow.com/a/17561747/1346222
https://github.com/celery/celery/issues/1427
Or you can just kill the workers from cron.
The real solution is to set memory limits via LXC:
https://blog.codeship.com/lxc-memory-limit/
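A rough sketch of both workarounds (the program name "celeryd", the container path, and the 256M cap are illustrative assumptions, not values from the question):

```shell
# Crude workaround: restart the workers nightly from cron, via supervisor
# (assumes the supervisor program group is named "celeryd"):
#   0 4 * * *  supervisorctl restart celeryd

# Hard cap instead: in the LXC container config
# (/var/lib/lxc/<container>/config, cgroup v1 key name):
#   lxc.cgroup.memory.limit_in_bytes = 256M
```

With the cgroup limit in place, a leaking worker is killed by the OOM killer once it hits the cap, and supervisor restarts it, instead of the leak slowly starving the whole host.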
