When I route a task to a particular queue it works:
task.apply_async(queue='beetroot')
But if I create a chain:
chain = task.s() | task.s()
and then call chain.apply_async(queue='beetroot'), the queue option seems to be ignored and the tasks go to the default queue instead. How do I route the whole chain to a particular queue?
I do it like this:
subtask = task.s(*myargs, **mykwargs).set(queue=myqueue)
mychain = celery.chain(subtask, subtask2, ...)
mychain.apply_async()
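For completeness, here is a minimal self-contained sketch of that approach; the broker URL, the module name tasks, and the add/mul task bodies are placeholders of mine, not from the answer above:

# tasks.py -- minimal sketch; broker URL and task bodies are placeholders
from celery import Celery, chain

app = Celery('tasks', broker='amqp://localhost')

@app.task
def add(x, y):
    return x + y

@app.task
def mul(z, factor):
    return z * factor

# Attach routing options to each signature, then chain them.
# Each task's return value is passed as the first argument of the next.
sig1 = add.s(2, 3).set(queue='beetroot')
sig2 = mul.s(10).set(queue='beetroot')   # receives add's result as z

chain(sig1, sig2).apply_async()          # both run on the 'beetroot' queue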
OK, I got this one figured out.
You have to add the required execution options, such as queue= or countdown=, to the subtask definition, or through a partial:
subtask definition:
from celery import subtask
chain = subtask('task', queue='beetroot') | subtask('task', queue='beetroot')
partial:
chain = task.s().apply_async(queue='beetroot') | task.s().apply_async(queue='beetroot')
Then you execute the chain through:
chain.apply_async()
or:
chain.delay()
And the tasks will be sent to the 'beetroot' queue. Extra execution arguments passed to this last call will not do anything. It would have been nice to be able to apply all of those execution options at the level of the chain (or group, or any other Canvas primitive).
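In the meantime, a workaround of my own devising (not from the Celery docs): a chain exposes its child signatures through .tasks, so you can apply the same option to each of them after building the chain. set_queue below is a hypothetical helper:

from celery import chain

def set_queue(ch, queue):
    # Signature.set() updates a signature's execution
    # options in place and returns the signature.
    for sig in ch.tasks:
        sig.set(queue=queue)
    return ch

mychain = set_queue(chain(task.s(), task.s()), 'beetroot')
mychain.apply_async()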
This is rather late, but I don't think the code provided by @mpaf is entirely correct.
Context: In my case, I have two subtasks, the first of which returns a value that is passed on to the second as its input argument. I was having trouble getting the second task to execute: I saw in the logs that Celery would acknowledge the second task as a callback of the first, but it would never execute it.
This was my non-working chain code:
from celery import chain
chain(
    module.task1.s(arg),
    module.task2.s()
).apply_async(countdown=0.1, queue='queuename')
Using the syntax provided in @mpaf's answer, I got both tasks to execute, but the execution order was haphazard and the second subtask was not acknowledged as a callback of the first. Browsing the docs gave me the idea to explicitly set the queue on each subtask.
This is the working code:
chain(
    module.task1.s(arg).set(queue='queuename'),
    module.task2.s().set(queue='queuename')
).apply_async(countdown=0.1)
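To make the example reproducible, here is a sketch of what module might contain; the broker URL and task bodies are invented for illustration:

# module.py -- placeholder task definitions, assumed for illustration
from celery import Celery

app = Celery('module', broker='amqp://localhost')

@app.task
def task1(arg):
    return arg * 2            # this return value becomes task2's argument

@app.task
def task2(value):
    print('task2 received', value)

With the queue set on each signature, both subtasks are routed to 'queuename', and task2 still executes as the callback of task1, receiving its return value.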