Celery does not release memory

Backend · Open · 6 answers · 1895 views
小鲜肉 2020-12-31 07:25

It looks like Celery does not release memory after a task finishes. Every finished task leaks roughly 5-10 MB, so after thousands of tasks the worker will soon use up all available memory.

6 answers
  • 2020-12-31 07:56

    You might be hitting this known memory-leak issue in librabbitmq. Check whether Celery is using librabbitmq>=1.0.1.

    A simple fix to try (the version specifier is quoted so the shell does not treat `>=` as a redirect): pip install 'librabbitmq>=1.0.1'

  • 2020-12-31 07:58

    It was this config option that prevented my worker from releasing memory:

    CELERYD_TASK_TIME_LIMIT = 600
    

    refer to: https://github.com/celery/celery/issues/1427
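    The answer above uses the pre-4.0 uppercase setting name. A minimal config sketch showing it alongside its lowercase Celery 4+ equivalent (use one naming scheme or the other, not both):

```python
# celeryconfig.py sketch -- a hard per-task time limit of 600 seconds.
CELERYD_TASK_TIME_LIMIT = 600   # pre-4.0 (uppercase) setting name
# task_time_limit = 600         # the same setting under the Celery 4+ lowercase scheme
```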

  • 2020-12-31 08:00

    When you start your worker, set the --max-tasks-per-child option so that each worker process is replaced after every task:

    celery -A app worker --loglevel=info --max-tasks-per-child=1

    Here's the documentation:

    http://docs.celeryproject.org/en/latest/userguide/workers.html#max-tasks-per-child-setting

  • 2020-12-31 08:17

    This was a known issue in Celery which I believe has since been fixed.

    Please refer: https://github.com/celery/celery/issues/2927

  • 2020-12-31 08:20

    There are two settings that can help you mitigate the growing memory consumption of Celery workers:

    • Max tasks per child setting (v2.0+):

      With this option you can configure the maximum number of tasks a worker can execute before it’s replaced by a new process. This is useful if you have memory leaks you have no control over, for example from closed source C extensions.

    • Max memory per child setting (v4.0+):

      With this option you can configure the maximum amount of resident memory a worker may consume before it’s replaced by a new process. This is useful if you have memory leaks you have no control over, for example from closed source C extensions.
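    A minimal sketch combining both settings in a celeryconfig.py module (lowercase names per Celery 4+; the values are arbitrary examples, not recommendations):

```python
# celeryconfig.py sketch -- recycle worker processes to contain leaks.
worker_max_tasks_per_child = 100       # replace a process after it has run 100 tasks
worker_max_memory_per_child = 200000   # replace it once resident memory exceeds ~200 MB (value is in KiB)
```

    Whichever limit is hit first, the pool replaces the child process, which returns any leaked memory to the OS.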

  • 2020-12-31 08:23

    Set worker_max_tasks_per_child in your settings.
