I would like to add multiple tasks to the celery queue and wait for the results. I have various ideas for how I could achieve this utilising some form of shared storage (memcached, redis, db, etc.)
Task.delay returns an AsyncResult. Use AsyncResult.get to get the result of each task.
To do that, you need to keep references to the tasks.
def do_tasks(b):
    tasks = []
    for a in b:
        tasks.append(c.delay(a))
    return [t.get() for t in tasks]
Or you can use ResultSet:
UPDATE: ResultSet is deprecated, please see @laffuste's answer.
from celery.result import ResultSet

def do_tasks(b):
    rs = ResultSet([])
    for a in b:
        rs.add(c.delay(a))
    return rs.get()
I have a hunch you don't really want the delay itself, but rather Celery's async feature.
I think you really want a TaskSet:
from celery.task.sets import TaskSet
from someapp.tasks import sometask

def do_tasks(b):
    job = TaskSet([sometask.subtask((a,)) for a in b])
    result = job.apply_async()
    # might want to handle result.successful() == False
    return result.join()
For Celery >= 3.0, TaskSet is deprecated in favour of group.
from celery import group
from tasks import add

job = group([
    add.s(2, 2),
    add.s(4, 4),
    add.s(8, 8),
    add.s(16, 16),
    add.s(32, 32),
])
Start the group in the background:
result = job.apply_async()
Wait for completion and collect the return values:
result.join()