Python - Retry a failed Celery task from another queue

盖世英雄少女心 2021-01-06 15:54

I'm posting data to a web service in Celery. Sometimes the data is not posted because the internet connection is down, and the task is retried indefinitely.

1 Answer
  • 2021-01-06 16:39

    By default, Celery adds all tasks to a queue named celery. So you can run your task there, and when an exception occurs it retries; once it reaches the maximum number of retries, you can shift the task to a new queue, say foo:

    from celery import shared_task
    from celery.exceptions import MaxRetriesExceededError

    @shared_task(bind=True, default_retry_delay=1 * 60, max_retries=10)
    def post_data_to_web_service(self, data, url):
        try:
            # do something with the given args, e.g. POST data to url
            ...
        except Exception as exc:
            try:
                # retry on the same queue until max_retries is exhausted
                raise self.retry(exc=exc)
            except MaxRetriesExceededError:
                # give up retrying here and re-queue the task on 'foo'
                post_data_to_web_service.apply_async((data, url), queue='foo')
    

    When you start your worker, this task will try to do something with the given data. If it fails, it will retry 10 times with a delay of 60 seconds between attempts. When it finally raises MaxRetriesExceededError, it re-queues the same task on the new queue foo instead of retrying forever on the default queue.
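    For reference, here is a minimal sketch of the app setup this answer assumes; the module name my_app, the Redis broker URL, and the explicit queue declarations are illustrative assumptions (with most brokers, Celery auto-declares queues on first use):

    from celery import Celery
    from kombu import Queue

    # 'my_app' and the broker URL are placeholders for your own project
    app = Celery('my_app', broker='redis://localhost:6379/0')

    # declaring both queues up front makes the routing explicit; Celery
    # would otherwise create them lazily when first used
    app.conf.task_queues = (
        Queue('celery'),  # the default queue
        Queue('foo'),     # fallback queue for tasks that exhausted their retries
    )
    app.conf.task_default_queue = 'celery'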

    To consume these tasks you have to start a new worker

    celery worker -l info -A my_app -Q foo
    

    or you can consume these tasks from the default worker as well, if you start it with

    celery worker -l info -A my_app -Q celery,foo
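    To kick the task off in the first place, enqueue it through Celery's normal calling API; the payload and URL below are made-up placeholders:

    # both forms put the task on the default 'celery' queue
    post_data_to_web_service.delay({'id': 1}, 'https://example.com/endpoint')
    post_data_to_web_service.apply_async(({'id': 1}, 'https://example.com/endpoint'))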
    