How to Inspect the Queue Processing a Celery Task

Submitted by 自古美人都是妖i on 2020-06-27 16:56:18

Question


I'm currently leveraging Celery for periodic tasks, and I am new to it. I have two workers running two different queues: one for slow background jobs and one for jobs users queue up in the application.

I am monitoring my tasks in Datadog because it's an easy way to confirm my workers are running appropriately.

What I want to do is record, after each task completes, which queue the task was processed on.

# statsd is Datadog's statsd client; `celery` is the Celery app instance.
from celery.signals import after_task_publish
from datadog import statsd

@after_task_publish.connect()
def on_task_publish(sender=None, headers=None, body=None, **kwargs):
    statsd.increment("celery.on_task_publish.start.increment")

    # sender is the task's name; look up the registered task object
    task = celery.tasks.get(sender)
    # assumes the task object carries a `queue` attribute
    queue_name = task.queue

    statsd.increment("celery.on_task_publish.increment", tags=[f"{queue_name}:{task}"])

The function above is something I implemented after researching the Celery docs and some Stack Overflow posts, but it's not working as intended: I get the first statsd increment, but the remaining code does not execute.

I am wondering if there is a simpler way to inspect, inside or after each task completes, which queue processed the task.


Answer 1:


Since your question asks whether there is a way to inspect, inside or after each task completes, which queue processed it, I'm assuming you haven't tried Celery's result backend yet. You could check out this feature, which Celery itself provides: the result backend (task result backend). It is very useful for storing the results of your Celery tasks. Read through https://docs.celeryproject.org/en/stable/userguide/configuration.html#task-result-backend-settings


Once you get an idea of how to set up this result backend, search for the result_extended key (in the same link) to be able to store queue names along with your task results.
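For example, here is a minimal sketch of what result_extended enables, assuming a Redis result backend and a hypothetical add task (both are placeholders for your own setup):

from celery import Celery

# Minimal sketch: the Redis URLs and the "add" task are placeholders.
app = Celery("proj", broker="redis://localhost:6379/0")
app.conf.result_backend = "redis://localhost:6379/0"
app.conf.result_extended = True  # store name, args, kwargs, queue, ... with each result

@app.task
def add(x, y):
    return x + y

# The queue the task was routed to is stored with its result:
result = add.apply_async((2, 2), queue="fast")
result.get(timeout=10)
print(result.queue)  # -> "fast", available because result_extended is enabled

With result_extended enabled, the extended attributes (name, args, kwargs, worker, retries, queue) become part of the stored result, so the queue can be read back from any AsyncResult.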

A number of options are available; you can have these results stored in any of the following:

SQL DB / NoSQL DB / S3 / Azure / Elasticsearch / etc.
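The backend is selected with a single result_backend setting; a few illustrative values (hosts, credentials, bucket and index names below are placeholders for your own infrastructure):

# Illustrative result_backend values; pick the one matching your setup.
result_backend = "db+postgresql://user:password@localhost/celery_results"  # SQL DB
result_backend = "redis://localhost:6379/0"                                # NoSQL (Redis)
result_backend = "s3"   # plus s3_bucket / s3_access_key_id / ... settings
result_backend = "elasticsearch://localhost:9200/celery/backend"           # Elasticsearch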

I have made use of this result-backend feature with Elasticsearch, and this is how my task results are stored:

[Screenshot: task results as stored in the Elasticsearch result backend]

It is just a matter of adding a few configuration entries to your settings.py file, as per your requirements. It worked really well for my application. I also have a weekly cron job that clears only the successful task results, since we don't need those anymore, so only failed results remain visible (like the ones in the screenshot above); a sketch of such a cleanup follows.
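The answer does not include the cleanup job itself; a rough sketch of the idea, assuming the Elasticsearch backend above with index name "celery" and that the serialized task metadata (which contains the status string, e.g. "SUCCESS") is indexed under the result field, could look like this:

# Rough sketch of a weekly cleanup: delete successful task results from
# the Elasticsearch index backing the Celery result backend.
# Assumptions: index name "celery" and the "result" field holding the
# serialized task metadata -- adjust both to match your deployment.
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])
es.delete_by_query(
    index="celery",
    body={"query": {"match": {"result": "SUCCESS"}}},
)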

These were the main keys for my requirement: task_track_started and task_acks_late, along with result_backend.
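Put together, a sketch of the relevant settings.py entries (the Elasticsearch host, index, and doc type are placeholders for your own cluster):

# settings.py sketch; the Elasticsearch URL is a placeholder.
result_backend = "elasticsearch://localhost:9200/celery/backend"
result_extended = True       # store the queue name etc. alongside each result
task_track_started = True    # report a STARTED state while a task is running
task_acks_late = True        # acknowledge the message after the task finishes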



Source: https://stackoverflow.com/questions/62056153/how-to-inspect-the-queue-processing-a-celery-task
