celery

Read the Docs local install: Celery ValueError: signal only works in main thread

 ̄綄美尐妖づ submitted on 2021-01-28 05:25:45
Question: I have a local Read the Docs install and get a ValueError exception when trying to import a project. I'm on release 5.1.0, running Python 3.6 on Debian buster with Celery 4.1.1 (from the requirements files). From the debug.log:

    [19/May/2020 23:31:11] celery.app.trace:124[24]: INFO Task readthedocs.projects.tasks.send_notifications[39551573-cfe1-46c1-b7e2-28bde20fd962] succeeded in 0.005342413205653429s: None
    [19/May/2020 23:31:11] celery.app.trace:124[24]: INFO Task readthedocs.oauth.tasks
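The error in the title comes from CPython rather than from Read the Docs itself: signal.signal() refuses to run anywhere but the main thread. A minimal reproduction sketch of that condition (illustrative only, not the project's code):

    import signal
    import threading

    def install_handler():
        # On Python 3.6 this raises "ValueError: signal only works in main thread"
        # because it executes in a worker thread, not the main thread.
        signal.signal(signal.SIGTERM, lambda signum, frame: None)

    t = threading.Thread(target=install_handler)
    t.start()
    t.join()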

celery: get function name by task id?

为君一笑 submitted on 2021-01-28 04:49:20
Question: I am using Celery's on_failure handler to log all failed tasks for debugging and analysis, and I want to know the task name (function name) of the failed task. How can I get that?

    from celery import Task

    class DebugTask(Task):
        abstract = True

        def after_return(self, *args, **kwargs):
            print('Task returned: {0!r}'.format(self.request))

        def on_failure(self, exc, task_id, args, kwargs, einfo):
            func_name = get_func_name_by_task_id(task_id)  # how do I do this?
            print("{} failed".format(func_name))  #
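A hedged sketch of one way to get this (an assumption, not the accepted answer): a Task base class already knows its own registered name, so self.name can stand in for the hypothetical get_func_name_by_task_id() above.

    from celery import Task

    class DebugTask(Task):
        abstract = True

        def on_failure(self, exc, task_id, args, kwargs, einfo):
            # self.name is the fully qualified task name, e.g. 'myapp.tasks.foo'.
            print('{} (id {}) failed: {!r}'.format(self.name, task_id, exc))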

How to test if email was sent after executing celery task

自闭症网瘾萝莉.ら submitted on 2021-01-28 03:51:34
Question: I'm using Django 1.10 and Celery 4.1. I have a shared_task which sends an email to the user.

    # myapp/tasks.py
    @shared_task
    def notify_user(user_id):
        # TODO: send email and do other stuff here
        user = get_object_or_404(User, pk=user_id)
        send_mail(
            'Subject',
            'Body',
            'from@example.com',
            [user.email],
        )

I have another file containing a function that puts that task into the queue.

    # myapp/utils.py
    # ...
    def update_queue(self):
        # increment no_of_used_referrals by 1 if no_of_used_referrals
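A minimal test sketch for this setup (assumptions: the notify_user task above lives in myapp/tasks.py, and the test runs under Django's test runner, which swaps in the locmem email backend that collects messages in django.core.mail.outbox). Calling .apply() runs the task body locally, so no worker or broker is needed.

    from django.contrib.auth.models import User
    from django.core import mail
    from django.test import TestCase

    from myapp.tasks import notify_user

    class NotifyUserTest(TestCase):
        def test_email_sent(self):
            user = User.objects.create_user('alice', 'alice@example.com', 'pw')
            # Execute the task synchronously in-process.
            notify_user.apply(args=[user.pk])
            self.assertEqual(len(mail.outbox), 1)
            self.assertEqual(mail.outbox[0].to, [user.email])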

Pass parameters to Celery on_error task

时光总嘲笑我的痴心妄想 submitted on 2021-01-27 22:01:15
Question: I'm using Celery with RabbitMQ as a broker to manage asynchronous tasks. I'm chording some tasks together, and when each of them has finished I run a concatenation task that joins the results. This collection of tasks is identified by a report_id that I set.

    chord([task1.s(report_id=report_id), task2.s(report_id=report_id)...]) \
        (concat_task.s(report_id=report_id).set(queue='default') \
        .on_error(on_chord_error.s().set(queue='default')))

And this is my custom on_error task:

    @celery
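A hedged sketch, not a confirmed answer: Celery 4 error callbacks that accept more than one argument are invoked as (request, exc, traceback), and keyword arguments frozen into the errback signature are merged into that call, so report_id can ride along. Treat the kwarg-merging behaviour as an assumption to verify against your Celery version.

    from celery import shared_task

    @shared_task
    def on_chord_error(request, exc, traceback, report_id=None):
        # request is the failed task's request/context; exc is the exception.
        print('Report {} failed in task {}: {!r}'.format(report_id, request.id, exc))

    # Wiring it up (only the report_id kwarg differs from the excerpt above):
    # .on_error(on_chord_error.s(report_id=report_id).set(queue='default'))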

Create a class that supports JSON serialization for use with Celery

被刻印的时光 ゝ submitted on 2021-01-27 13:29:02
Question: I'm using Celery to run some background tasks. One of the tasks returns a Python class I created. I want to use JSON to serialize and deserialize this class, given the warnings about using pickle. Is there a simple built-in way to achieve this? The class is very simple: it contains 3 attributes, all of which are lists of named tuples, and a couple of methods that perform some calculations on the attributes. My idea is to serialize/deserialize the 3 attributes, since that defines the
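A sketch of one common approach (an assumption, not a built-in Celery feature): register a custom JSON encoder/decoder with Kombu and point the serializer settings at it. MyResult and its attribute names are stand-ins for the asker's class, and the named tuples inside the lists would come back as plain lists unless they are converted too.

    import json
    from kombu.serialization import register

    class MyResult:
        def __init__(self, first, second, third):
            self.first, self.second, self.third = first, second, third

        def to_dict(self):
            return {'__myresult__': True, 'first': self.first,
                    'second': self.second, 'third': self.third}

        @classmethod
        def from_dict(cls, d):
            return cls(d['first'], d['second'], d['third'])

    def my_dumps(obj):
        # default= is only consulted for objects json can't handle natively.
        return json.dumps(obj, default=lambda o: o.to_dict())

    def my_loads(data):
        return json.loads(
            data,
            object_hook=lambda d: MyResult.from_dict(d) if '__myresult__' in d else d)

    register('myjson', my_dumps, my_loads,
             content_type='application/x-myjson', content_encoding='utf-8')

    # Celery settings would then select it, for example:
    # task_serializer = 'myjson'
    # result_serializer = 'myjson'
    # accept_content = ['json', 'myjson']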

Make a Celery task that waits for a signal?

强颜欢笑 submitted on 2021-01-27 12:55:16
Question: Is it possible to create a Celery task that just waits for a signal? I have this scenario:

- Scrapyd in one virtualenv on remote machine A
- Django project with a Celery worker node in another virtualenv on remote machine A
- The same Django project with Celery, but in another virtualenv on local machine B

How I use this setup:

- I would send a task chain from Django on machine B
- Let the task chain be consumed by the worker node on machine A.
- In the first subtask of the task chain, I would schedule a
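A hedged sketch of one workaround (my assumption, not the asker's design): rather than blocking a worker on a signal, the "waiting" subtask re-queues itself with retry until an externally written flag appears. The cache key name here is hypothetical.

    from celery import shared_task
    from django.core.cache import cache

    @shared_task(bind=True, max_retries=None)
    def wait_for_scrape(self, job_id):
        if not cache.get('scrape-finished-{}'.format(job_id)):
            # Not done yet: check again in 10 seconds instead of holding the
            # worker for the whole wait.
            raise self.retry(countdown=10)
        return 'scrape {} finished'.format(job_id)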

Celery task group not being executed in background and results in exception

青春壹個敷衍的年華 submitted on 2021-01-27 12:40:19
Question: My Celery task isn't executing in the background in my Django 1.7/Python 3 project.

    # settings.py
    BROKER_URL = 'redis://localhost:6379/0'
    CELERY_RESULTBACKEND = BROKER_URL
    CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
    CELERY_ALWAYS_EAGER = False

I have celery.py in my root app module as such:

    from __future__ import absolute_import
    import os
    import django
    from celery import Celery
    from django.conf import settings

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_app.settings'
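For comparison, a minimal celery.py in the shape the excerpt is cut off at (an illustrative sketch of the standard Django wiring for this Celery generation, not the asker's actual file; note that the usual old-style name for the result backend setting is CELERY_RESULT_BACKEND, with an underscore):

    from __future__ import absolute_import

    import os

    from celery import Celery
    from django.conf import settings

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_app.settings')

    app = Celery('my_app')
    # Read the CELERY* settings from Django's settings module.
    app.config_from_object('django.conf:settings')
    # Find tasks.py modules in every installed app.
    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)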

memcache on django is not working

偶尔善良 submitted on 2021-01-27 06:07:53
Question: I have a race condition in Celery. Inspired by this - http://ask.github.io/celery/cookbook/tasks.html#ensuring-a-task-is-only-executed-one-at-a-time - I decided to use memcache to add locks to my tasks. These are the changes I made:

    python-memcached

    # settings for memcache
    CACHES = {
        'default': {
            'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
            'LOCATION': '127.0.0.1:11211',
        }
    }

After this I log in to my shell and do the following:

    >>> import os
    >>> import django
    >>> from django
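A sketch of the lock pattern that cookbook page describes, adapted to Django's cache API (the task and key names here are illustrative, not the asker's code):

    from celery import shared_task
    from django.core.cache import cache

    LOCK_EXPIRE = 60 * 5  # lock auto-expires after 5 minutes

    @shared_task(bind=True)
    def locked_task(self, resource_id):
        lock_id = 'locked-task-{}'.format(resource_id)
        # cache.add() only succeeds if the key does not exist yet, and on
        # memcached that check-and-set is atomic, which is what makes it a lock.
        if cache.add(lock_id, 'true', LOCK_EXPIRE):
            try:
                # ... the real work goes here ...
                print('processing {}'.format(resource_id))
            finally:
                cache.delete(lock_id)
        else:
            # Another worker already holds the lock for this resource; skip.
            return None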
