django-celery

In celery, what is the appropriate way to pass contextual metadata from sender process to worker when a task is enqueued?

Submitted by 六月ゝ 毕业季﹏ on 2020-01-24 09:40:14
Question: When any celery task is enqueued, I want to attach contextual metadata that the worker will be able to use. The following code works, but I would like an appropriate celery-style solution.

    from celery.signals import before_task_publish, task_prerun

    @before_task_publish.connect
    def receiver_before_task_publish(sender=None, headers=None, body=None, **kwargs):
        # with task protocol 2, body is the (args, kwargs, embed) triple
        task_kwargs = body[1]
        metadata = {"foo": "bar"}
        task_kwargs['__metadata__'] = metadata

    @task_prerun.connect
    def receiver_task_pre_run(sender=None, task_id=None, task=None, args=None,
                              kwargs=None, **extras):
        # the original excerpt breaks off at this handler; popping the key
        # back out of kwargs is the inferred counterpart of the publisher
        metadata = kwargs.pop('__metadata__', {})
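
One commonly suggested celery-style alternative, sketched here rather than taken from the truncated post, is to ride the message headers instead of mutating the task body; reading the value back assumes task protocol 2, where custom headers end up on the task's request context.

    from celery.signals import before_task_publish

    @before_task_publish.connect
    def attach_metadata(sender=None, headers=None, **kwargs):
        # custom keys placed on the outgoing message headers travel with
        # the task without touching its args/kwargs
        headers['__metadata__'] = {"foo": "bar"}

A bound task (bind=True) can then read the value on the worker side with self.request.get('__metadata__').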

Celery auto reload on ANY changes

Submitted by 六眼飞鱼酱① on 2020-01-22 13:21:10
Question: I can make celery reload itself automatically when there are changes to the modules listed in CELERY_IMPORTS in settings.py. I tried listing parent modules, hoping changes in their child modules would be detected as well, but they were not; apparently celery does not watch modules recursively. I searched the documentation but found nothing addressing this. It is really bothersome to add every celery-related module of my project to CELERY_IMPORTS just to detect changes.
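
A sketch of one workaround, assuming a hypothetical top-level package named myproject: build CELERY_IMPORTS in settings.py by walking the package tree, so child modules are picked up without listing them by hand.

    import pkgutil

    import myproject  # hypothetical top-level package of the project

    def walk_modules(package):
        # collect the package itself plus every submodule, recursively
        names = [package.__name__]
        prefix = package.__name__ + '.'
        for _finder, name, _ispkg in pkgutil.walk_packages(package.__path__, prefix):
            names.append(name)
        return tuple(names)

    CELERY_IMPORTS = walk_modules(myproject)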

Getting parameters of a failed django-celery task

Submitted by 戏子无情 on 2020-01-14 14:36:29
Question: Is it possible to get the arguments used to call a particular failed celery task, given the task's ID? I am using MongoDB as the broker and the django-celery package. I know you can retrieve the result fairly easily, but can you do the same with the arguments that task was called with? Thanks.

Answer 1: I managed to solve this problem by implementing a custom on_failure handler for my task, as specified here: http://docs.celeryproject.org/en/latest/userguide/tasks.html#handlers
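
The handler the answer refers to receives the call arguments directly, so a minimal sketch (the logging target is an assumption, since the rest of the answer is cut off) looks like this:

    import logging

    from celery import Task

    logger = logging.getLogger(__name__)

    class LogArgsOnFailure(Task):
        abstract = True

        def on_failure(self, exc, task_id, args, kwargs, einfo):
            # celery passes the original call arguments to this hook, so
            # they can be logged or persisted keyed by the task id
            logger.error("task %s failed with %r (args=%r, kwargs=%r)",
                         task_id, exc, args, kwargs)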

Django: djcelery Import error from celery import current_app as celery in virtualenv

Submitted by 青春壹個敷衍的年華 on 2020-01-14 14:12:05
Question: Okay, so I have tried everything I and Google can come up with. I'm trying to run django-celery under a virtualenv on my MacBook Pro, OS X 10.8.4. I installed django-celery using pip while the virtualenv was activated. I get the following when importing djcelery in the virtualenv's python:

    (platform)Chriss-MacBook-Pro:platform Chris$ python
    Python 2.7.2 (default, Oct 11 2012, 20:14:37)
    [GCC 4.2.1 Compatible Apple Clang 4.0 (tags/Apple/clang-418.0.60)] on darwin
    Type "help", "copyright", "credits" or "license" for more information.
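
Since the error in the title comes from djcelery executing "from celery import current_app as celery", a quick diagnostic sketch run inside the activated virtualenv can confirm which celery is actually being imported:

    import sys
    print(sys.prefix)        # should point inside the virtualenv

    import celery
    print(celery.__file__)   # should resolve to the virtualenv's site-packages
    print(celery.VERSION)    # djcelery's import needs a celery that provides current_app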

Celery task with multiple decorators not auto registering task name

Submitted by 给你一囗甜甜゛ on 2020-01-14 07:29:12
Question: I have a task that looks like this:

    from celery import task
    from mybasetask_module import MyBaseTask

    @task(base=MyBaseTask)
    @my_custom_decorator
    def my_task(*args, **kwargs):
        pass

and my base task looks like this:

    from celery import task, Task

    class MyBaseTask(Task):
        abstract = True
        default_retry_delay = 10
        max_retries = 3
        acks_late = True

The problem I'm running into is that the celery worker registers the task under the name 'mybasetask_module.__inner'. The task is registered fine in that the package+module prefix of the name is correct, but the function part is taken from the decorator's inner wrapper rather than from my_task.
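
The usual cause, sketched below rather than quoted from the truncated post, is that my_custom_decorator returns a wrapper function named __inner without copying the wrapped function's metadata; applying functools.wraps fixes the name celery derives:

    from functools import wraps

    def my_custom_decorator(func):
        @wraps(func)  # copies __name__ and __module__ onto the wrapper
        def __inner(*args, **kwargs):
            return func(*args, **kwargs)
        return __inner

Alternatively, an explicit name, e.g. @task(name='mybasetask_module.my_task', base=MyBaseTask), sidesteps name derivation entirely.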

Task state and django-celery

Submitted by 假如想象 on 2020-01-14 07:16:05
Question: I use django-celery and have a task like this:

    import time

    from celery import Task

    class TestTask(Task):
        name = "enabler.test_task"

        def run(self, **kw):
            # debug_log is the asker's own logging helper
            debug_log("begin test task")
            time.sleep(5)
            debug_log("end test task")

        def on_success(self, retval, task_id, args, kwargs):
            debug_log("on success")

        # note: celery's documented hook signature is
        # on_failure(self, exc, task_id, args, kwargs, einfo)
        def on_failure(self, retval, task_id, args, kwargs):
            debug_log("on failure")

I use the django shell to run the task:

    python manage.py shell
    r = tasks.TestTask().delay()

From the celery log I see that the task is executed:

    [2012-01-16 08:13:29,362:
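
Since the question is about task state, a minimal sketch of inspecting it from the django shell through the AsyncResult that delay() returns:

    r = tasks.TestTask().delay()
    print(r.task_id)          # id under which state and result are stored
    print(r.state)            # e.g. 'PENDING', 'STARTED', 'SUCCESS', 'FAILURE'
    print(r.get(timeout=10))  # block until the worker finishes

('STARTED' is only reported when CELERY_TRACK_STARTED is enabled.)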

Huge delay when using Celery + Redis

Submitted by 匆匆过客 on 2020-01-07 04:37:06
Question: I'm testing Django + Celery with hello-world examples. With RabbitMQ celery works fine, but when I switched to the Redis broker/result backend I get the following:

    %timeit add.delay(1,2).get()
    1 loops, best of 3: 503 ms per loop

settings.py:

    CELERY_RESULT_BACKEND = "redis"
    BROKER_URL = 'redis://localhost:6379'

tasks.py:

    @task()
    def add(x, y):
        return x + y

Are there any issues in the test above?

Answer 1: I found the solution in the source code: http://docs.celeryproject.org/en/latest/_modules/celery/result.html#AsyncResult.get
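
Stated explicitly, since the answer stops at the link: in that version of celery, get() polls the result backend on a default interval of 0.5 seconds, which lines up with the ~503 ms measured above. A sketch of the workaround:

    result = add.delay(1, 2)
    # poll every 5 ms instead of the default 0.5 s
    print(result.get(interval=0.005))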

Class based Task in django celery

Submitted by ≯℡__Kan透↙ on 2020-01-07 02:56:12
Question: I am using this class to push notifications via celery. The class definition goes like this:

    from celery import Task

    class NotifyUser(Task):
        """
        send push notification, entry point for this purpose is
        def send_notification()
        """

        def __init__(self, users=None, notification_code=None, context=None,
                     merchant_flag=False, listing_flag=False, object_type=None,
                     object_id=None, category_id=None, *args, **kwargs):
            Task.__init__(self, *args, **kwargs)
            self.notification_code = notification_code
            # the excerpt is truncated here; the remaining parameters are
            # presumably stored on self the same way
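
For completeness, a hypothetical call-site sketch (the keyword values are invented). Note that state set in __init__ exists only in the enqueuing process; the worker builds its own instance of the class, so per-call data is more reliably passed as arguments to delay()/run().

    # enqueue the task; only delay()'s arguments travel to the worker
    NotifyUser(users=[42], notification_code="welcome").delay()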