django-celery

Usage of django celery.backend_cleanup

Submitted on 2019-12-11 15:37:19
Question: There is not much documentation available on the actual usage of django celery.backend_cleanup. Let's assume I have the following four tasks scheduled with different intervals. Checking the DatabaseScheduler logs, I found that only Task1 is executing on its interval.

    [2018-12-28 11:21:08,241: INFO/MainProcess] Writing entries...
    [2018-12-28 11:24:08,778: INFO/MainProcess] Writing entries...
    [2018-12-28 11:27:09,315: INFO/MainProcess] Writing entries...
    [2018-12-28 11:28:32,948: INFO/MainProcess] Scheduler: …
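For reference, a minimal sketch of how backend_cleanup is typically wired up with Celery 3.x / django-celery style settings; the schedule entry and expiry value are illustrative assumptions, not taken from the question:

    # settings.py -- sketch, assuming old-style (CELERY_*) setting names
    from celery.schedules import crontab

    # backend_cleanup is a built-in task that deletes expired task results.
    # By default beat schedules it once a day; listing it explicitly in the
    # schedule makes the interval visible and tunable.
    CELERYBEAT_SCHEDULE = {
        'celery.backend_cleanup': {
            'task': 'celery.backend_cleanup',
            'schedule': crontab(hour=4, minute=0),
        },
    }

    # How long task results are kept before backend_cleanup removes them
    CELERY_TASK_RESULT_EXPIRES = 24 * 3600  # seconds; the default is one day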

How to set up a celery worker to log all task function calls to one file

Submitted on 2019-12-11 15:20:56
Question: I have a Django application with the following logging configuration.

    LOGGING = {
        'version': 1,
        'disable_existing_loggers': False,
        'formatters': {
            'default': {
                'format': '%(asctime)s [%(levelname)s] %(filename)s:%(lineno)s: %(message)s'
            },
        },
        'handlers': {
            'cron': {
                'class': 'logging.FileHandler',
                'filename': 'cron.log',
                'formatter': 'default',
            },
            'admin': {
                'class': 'logging.FileHandler',
                'filename': 'admin.log',
                'formatter': 'default',
            },
            'app': {
                'class': 'logging.FileHandler',
                'filename': 'app.log', …
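A sketch of the piece this configuration is missing: Celery routes task logging through the 'celery.task' logger (get_task_logger returns a child of it), so pointing that one logger at a single handler captures every task's log calls. The handler name reuses 'cron' from the config above; the level is an assumption:

    # settings.py (continued) -- route all task logging into one file
    LOGGING['loggers'] = {
        'celery.task': {
            'handlers': ['cron'],   # the FileHandler declared above
            'level': 'INFO',
            'propagate': False,
        },
    }

    # tasks.py -- log via get_task_logger so records carry the task name
    # and id and flow through the 'celery.task' logger hierarchy
    from celery.utils.log import get_task_logger
    logger = get_task_logger(__name__)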

Pass parameter request to a celery task in Django

Submitted on 2019-12-11 14:47:31
Question: I have a simple issue. We have the following task:

    @task()
    def upload_image(request):
        var = request.POST
        # ... do something

And we call it from another method with delay, like this:

    job = upload_image.delay(request)

This obviously does not work. From what I have read, you can pass messages to a celery task as a simple arg, *args, or **kwargs, but what I want is to pass an object, not a string or a list of strings. Is there any way to do this in celery? Regards!

Answer 1: As you can read from the …
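The usual resolution, sketched here since the answer above is cut off: an HttpRequest is not serializable, so the task should receive only the plain data it needs. The view name is a placeholder:

    # tasks.py -- same old-style decorator as the question
    from celery.task import task

    @task()
    def upload_image(post_data, user_id):
        var = post_data  # now a plain dict, safe to serialize
        # ... do something

    # views.py -- extract only what the task needs from the request
    def my_view(request):
        job = upload_image.delay(request.POST.dict(), request.user.id)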

Django celery not finding celery module

Submitted on 2019-12-11 13:25:03
Question: I'm trying to set up celery with Django from the tutorials, but I keep getting ModuleNotFoundError: No module named 'celery'. I have a main project called Tasklist with the structure:

    - Tasklist/
      - manage.py
      - Tasklist/
        - __init__.py
        - settings.py
        - celery.py
        - urls.py

My __init__.py is as follows:

    from __future__ import absolute_import, unicode_literals
    from .celery import app as celery_app
    __all__ = ['celery_app']

And my celery.py is like so:

    from __future__ import absolute_import, unicode_literals …
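For comparison, the canonical celery.py from the Celery/Django tutorial, assuming the inner package really is importable as Tasklist. The usual culprits for this error are celery not being installed in the active environment, or a shadowing module named celery.py landing on sys.path:

    # Tasklist/Tasklist/celery.py -- canonical layout from the Celery docs
    from __future__ import absolute_import, unicode_literals
    import os
    from celery import Celery

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'Tasklist.settings')

    app = Celery('Tasklist')
    app.config_from_object('django.conf:settings', namespace='CELERY')
    app.autodiscover_tasks()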

Celery log shows cleanup failed

Submitted on 2019-12-11 11:34:02
Question: I am using celery with django. I see an error when I look at the celery log for the automatically scheduled cleanup. I am not sure what this means, or what the implications of not doing the cleanup are. Any help is appreciated.

    [2013-09-28 23:00:00,204: ERROR/MainProcess] Task celery.backend_cleanup[65af1634-374a-4068-b1a5-749b70f7c78d] raised exception: NotImplementedError('No updates',)
    Traceback (most recent call last):
      File "/usr/local/lib/python2.7/dist-packages/celery-3.0.15-py2.7.egg/celery …
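What this traceback usually means (an inference, since the log is truncated): the AMQP result backend does not implement cleanup(), so the nightly celery.backend_cleanup task raises NotImplementedError. A sketch of the two common ways out, using Celery 3.x setting names:

    # settings.py -- option 1: use a result backend whose cleanup() is
    # implemented, e.g. django-celery's database backend
    CELERY_RESULT_BACKEND = 'database'

    # option 2: set the expiry to None, so beat never schedules the
    # backend_cleanup task at all (results are then kept forever)
    CELERY_TASK_RESULT_EXPIRES = None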

Django: updating many objects with per-object calculation

Submitted on 2019-12-11 11:20:15
Question: This question is a continuation of one I asked yesterday: I'm still not sure whether a post_save handler or a second Celery task is the best way to update many objects based on the results of the first Celery task, but I plan to test performance down the line. Here's a recap of what's happening:

    Celery task, every 30s:
        Update page_count field of Book object based on conditions
                |
                |  post_save(Book)
                V
        Update some field on all Reader objects w/ foreign key to updated Book
        (update will have different …
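One of the two approaches under consideration, sketched as a second Celery task fired after the first; the app path, field names, and per-object calculation are hypothetical stand-ins:

    # tasks.py -- sketch of the second-task approach
    from celery import shared_task

    from myapp.models import Book, Reader  # hypothetical app path

    def compute_progress(reader, book):
        # hypothetical per-object calculation
        return reader.pages_read / float(book.page_count)

    @shared_task
    def update_readers_for_book(book_id):
        book = Book.objects.get(pk=book_id)
        # iterator() keeps memory flat when the Reader set is large
        for reader in Reader.objects.filter(book=book).iterator():
            reader.progress = compute_progress(reader, book)
            reader.save(update_fields=['progress'])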

ImportError: cannot import custom module to celery tasks, how to improve?

Submitted on 2019-12-11 09:14:22
Question: I need to import a model from my application, then make a request and send an SMS, but I cannot import my model, although the name is specified correctly. Can anyone help? I will wait, thank you all! Full traceback:

    Traceback (most recent call last):
      File "c:\users\p.a.n.d.e.m.i.c\appdata\local\programs\python\python36-32\Lib\runpy.py", line 193, in _run_module_as_main
        "__main__", mod_spec)
      File "c:\users\p.a.n.d.e.m.i.c\appdata\local\programs\python\python36-32\Lib\runpy.py", line …
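A common cause when a Celery worker cannot import app models (an assumption here, since the traceback above is cut off): the process that loads tasks.py never initializes Django. A sketch of the bootstrap, with hypothetical project and model names:

    # celery_app.py -- make sure Django is set up before models are imported
    import os
    import django

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')  # hypothetical
    django.setup()

    # import models only after django.setup(), never at the top of the file
    from myapp.models import SmsRecipient  # hypothetical model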

'./manage.py runserver' restarts when celery map/reduce tasks are running; sometimes raises error with inner_run

Submitted on 2019-12-11 08:59:53
Question: I have a view in my django project that fires off a celery task. The celery task itself triggers a few map/reduce jobs via subprocess/fabric, and the results of the hadoop job are stored on disk --- nothing is actually stored in the database. After the hadoop job has completed, the celery task sends a django signal that it is done, something like this:

    # tasks.py
    from models import MyModel
    import signals
    from fabric.operations import local
    from celery.task import Task

    class …
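A plausible explanation for the restarts (an assumption; the excerpt ends before any traceback): manage.py runserver's autoreloader restarts the server whenever files under the project tree change, and the hadoop results are being written to disk. Either start the dev server with `python manage.py runserver --noreload`, or keep job output out of the watched tree, as in this sketch:

    # tasks.py -- write map/reduce output outside the source tree so the
    # dev server's autoreloader never sees the file changes
    import os
    import tempfile

    RESULTS_DIR = os.path.join(tempfile.gettempdir(), 'hadoop-results')  # hypothetical location
    if not os.path.isdir(RESULTS_DIR):
        os.makedirs(RESULTS_DIR)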

Can I review and delete Celery / RabbitMQ tasks individually?

Submitted on 2019-12-11 05:11:54
Question: I am running Django + Celery + RabbitMQ. After modifying some task names I started getting "unregistered task" KeyErrors, even after removing tasks with this key from the Periodic tasks table in Django Celery Beat and restarting the Celery worker. It turns out Celery / RabbitMQ tasks are persistent. I eventually resolved the issue by reimplementing the legacy tasks as dummy methods. In future, I'd prefer not to purge the queue, restart the worker, or reimplement legacy methods. Instead I'd …
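A sketch of per-task review and removal using Celery's own control API, in the Celery 3.x style. Note this only reaches tasks already delivered to a worker; messages still sitting in RabbitMQ need a broker-level tool such as rabbitmqctl. The legacy task name is a placeholder:

    # run from a shell inside the Django project
    from celery.task.control import inspect, revoke

    LEGACY = {'old_app.tasks.renamed_task'}  # hypothetical legacy name

    i = inspect()
    for worker, tasks in (i.reserved() or {}).items():  # prefetched, not yet running
        for t in tasks:
            print(worker, t['name'], t['id'])
            if t['name'] in LEGACY:
                revoke(t['id'])  # the worker discards it before execution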

How to daemonize django celery periodic task on ubuntu server?

Submitted on 2019-12-11 05:05:50
Question: On localhost, I used these statements to execute tasks and workers.

Run tasks:

    python manage.py celery beat

Run workers:

    python manage.py celery worker --loglevel=info

I used otp, rabbitmq server and django-celery. It is working fine. I uploaded the project to an ubuntu server, and I would like to daemonize these. For that I created a file /etc/default/celeryd with the config settings below.

    # Name of nodes to start, here we have a single node
    CELERYD_NODES="w1"
    # or we could have three nodes:
    #CELERYD …
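For reference, a sketch of how such a defaults file typically continues, following Celery's generic init-script documentation; every path, value, and user below is a placeholder to adapt, and the exact variable set depends on the Celery version:

    # /etc/default/celeryd (continued) -- illustrative values only
    # With django-celery, the worker is run through manage.py
    CELERY_BIN="/home/ubuntu/myproject/manage.py"
    # Where to chdir at start
    CELERYD_CHDIR="/home/ubuntu/myproject"
    # Extra worker arguments
    CELERYD_OPTS="--time-limit=300 --concurrency=4"
    # %n expands to the node name
    CELERYD_LOG_FILE="/var/log/celery/%n.log"
    CELERYD_PID_FILE="/var/run/celery/%n.pid"
    # Run the worker as an unprivileged user, never as root
    CELERYD_USER="celery"
    CELERYD_GROUP="celery"

With the generic init scripts installed, `sudo /etc/init.d/celeryd start` daemonizes the worker, and a matching /etc/default/celerybeat plus the celerybeat init script does the same for the scheduler.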