Add n tasks to the Celery queue and wait for the results

伪装坚强ぢ 2020-12-05 10:21

I want to add multiple tasks to the Celery queue and wait for the results. I have various ideas for how I could achieve this using some form of shared storage (memcached, redis, db, e…

3 Answers
  • 2020-12-05 10:43

    Task.delay returns an AsyncResult. Use AsyncResult.get to fetch the result of each task.

    To do that you need to keep references to the tasks.

    def do_tasks(b):
        tasks = []
        for a in b:
            tasks.append(c.delay(a))
        return [t.get() for t in tasks]
    

    Or you can use ResultSet:

    UPDATE: ResultSet is deprecated; please see @laffuste's answer.

    from celery.result import ResultSet

    def do_tasks(b):
        rs = ResultSet([])
        for a in b:
            rs.add(c.delay(a))
        return rs.get()
    
  • 2020-12-05 11:00

    I have a hunch you don't really want delay itself, but rather Celery's async features.

    I think you really want a TaskSet:

    from celery.task.sets import TaskSet
    from someapp.tasks import sometask
    
    def do_tasks(b):
        job = TaskSet([sometask.subtask((a,)) for a in b])
        result = job.apply_async()
        # might want to handle result.successful() == False
        return result.join()
    
  • 2020-12-05 11:08

    For Celery >= 3.0, TaskSet is deprecated in favour of group.

    from celery import group
    from tasks import add
    
    job = group([
                 add.s(2, 2),
                 add.s(4, 4),
                 add.s(8, 8),
                 add.s(16, 16),
                 add.s(32, 32),
    ])
    

    Start the group in the background:

    result = job.apply_async()
    

    Wait:

    result.join()
    