Celery: Worker with concurrency and reserved tasks only running 1 task

予麋鹿 2021-02-02 14:07

Some of the tasks in my code were taking longer and longer to execute.

Upon inspection I noticed that although I have my worker node set to concurrency 6, and 6 processes are running, only 1 task is actually executing at a time while the remaining reserved tasks sit waiting.

2 Answers
  • 2021-02-02 14:58

    The docs describe how to have the worker reserve only one task at a time - or only as many tasks as there are worker processes:

    Often users ask if disabling “prefetching of tasks” is possible, but what they really mean by that, is to have a worker only reserve as many tasks as there are worker processes (10 unacknowledged tasks for -c 10)

    That’s possible, but not without also enabling late acknowledgment. Using this option over the default behavior means a task that’s already started executing will be retried in the event of a power failure or the worker instance being killed abruptly, so this also means the task must be idempotent ... You can enable this behavior by using the following configuration options:

    task_acks_late = True
    worker_prefetch_multiplier = 1
    

    or the code equivalent:

    app = Celery(...)
    app.conf.worker_prefetch_multiplier = 1
    app.conf.task_acks_late = True
    ...
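
    To tie it together, here is a rough sketch (module name, task, and broker URL are placeholders, not taken from the question) of an app configured this way:

    # tasks.py - hypothetical module name
    from celery import Celery

    app = Celery("proj", broker="amqp://localhost")  # broker URL is an assumption

    # Reserve only as many tasks as there are worker processes,
    # and acknowledge each task after it finishes rather than before.
    app.conf.worker_prefetch_multiplier = 1
    app.conf.task_acks_late = True

    @app.task
    def resize_image(path):
        # With late acks the task may be re-delivered if the worker dies
        # mid-execution, so it has to be safe to run more than once.
        ...

    and then start the worker with the same concurrency as in the question, for example:

    celery -A tasks worker -l info -c 6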
    
  • 2021-02-02 14:59

    I'm not sure if it's your use case, but I ran into a similar problem when I had a mix of long and short tasks. Basically, at some point a process could start a very long-running task while prefetching a few other tasks, preventing those from being consumed by other processes. So I disabled prefetching, which is only worthwhile when you're running a lot of short tasks.

    To disable this prefetch behavior, you need Celery 3.1+ and the -Ofair option, for instance:

    celery -A proj worker -l info -Ofair
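
    As a rough illustration (task names and timings are made up, not from the question), this is the kind of workload where a prefetched short task can get pinned behind a long one on the same process, which -Ofair avoids by only handing work to processes that are actually free:

    import time
    from celery import Celery

    app = Celery("proj", broker="amqp://localhost")  # broker URL is an assumption

    @app.task
    def generate_report():
        # Long-running task that can occupy a process for minutes.
        time.sleep(600)

    @app.task
    def send_notification():
        # Short task that, with default prefetching, may sit reserved
        # behind generate_report on the same process.
        time.sleep(1)

    As far as I know, fair scheduling became the default in Celery 4.0, so the explicit -Ofair flag mainly matters on 3.1.x.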
    