Celery Async Tasks and Periodic Tasks together

你离开我真会死。 Submitted on 2020-01-05 07:01:08

Question


I am unable to run periodic tasks and asynchronous tasks together. If I comment out the periodic task, the asynchronous tasks execute fine; otherwise, the asynchronous tasks get stuck.

Running: celery==4.0.2, Django==2.0, django-celery-beat==1.1.0, django-celery-results==1.0.1

Referred to https://github.com/celery/celery/issues/4184 when choosing celery==4.0.2, as that version seems to work.

This seems to be a known issue:

https://github.com/celery/django-celery-beat/issues/27

I've also done some digging; the ONLY way I've found to get it back to normal is to remove all periodic tasks and restart celery beat. ~ rh0dium

celery.py

import django
import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'bid.settings')

# Setup django project
django.setup()

app = Celery('bid')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

settings.py

INSTALLED_APPS = (
    ...
    'django_celery_results',
    'django_celery_beat',
)

# Celery related settings

CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 43200, }
CELERY_RESULT_BACKEND = 'django-db'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_CONTENT_ENCODING = 'utf-8'
CELERY_ENABLE_REMOTE_CONTROL = False
CELERY_SEND_EVENTS = False
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
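
Since CELERY_BEAT_SCHEDULER points at django_celery_beat's DatabaseScheduler, beat schedules live in the database rather than in code. For reference, a minimal sketch of registering the 7:30 AM schedule through the ORM; the model usage is standard django_celery_beat, but the snippet itself is illustrative and not part of the original question:

# Illustrative: create the DB-backed schedule used by DatabaseScheduler.
from django_celery_beat.models import CrontabSchedule, PeriodicTask

schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='30',
    hour='7',
)
PeriodicTask.objects.get_or_create(
    crontab=schedule,
    name='send-vendor-status-everyday',  # unique display name
    task='send-vendor-status-everyday',  # registered Celery task name
)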

Periodic task

from celery.schedules import crontab
from celery.task import periodic_task  # deprecated API; see the answer below
from django.utils import timezone

@periodic_task(run_every=crontab(hour=7, minute=30), name="send-vendor-status-everyday")
def send_vendor_status():
    return timezone.now()

Async task

from celery import shared_task

@shared_task
def vendor_creation_email(id):
    return "Email Sent"

Async task caller

vendor_creation_email.apply_async(args=[instance.id])  # main thread gets stuck here if periodic jobs are scheduled
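
For context, the call above might sit in a Django signal handler; a minimal sketch, assuming a hypothetical Vendor model and a post_save receiver (the wiring is illustrative, not from the original question):

# signals.py -- illustrative sketch; the question only shows the
# apply_async call, so the Vendor model and receiver are assumptions.
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import Vendor            # hypothetical model
from .tasks import vendor_creation_email

@receiver(post_save, sender=Vendor)
def on_vendor_created(sender, instance, created, **kwargs):
    if created:
        # This enqueue blocks the main thread under the broken setup
        # described above.
        vendor_creation_email.apply_async(args=[instance.id])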

Running the worker, with beat embedded, as follows:

celery worker -A bid -l debug -B

Please help.


Answer 1:


Here are a few observations, resulting from multiple trials and errors and from diving into Celery's source code.

  1. @periodic_task is deprecated, and hence will not work.

From Celery's source code:

#venv36/lib/python3.6/site-packages/celery/task/base.py
def periodic_task(*args, **options):
    """Deprecated decorator, please use :setting:`beat_schedule`."""
    return task(**dict({'base': PeriodicTask}, **options))
  2. Use UTC as the base timezone, to avoid timezone-related confusion later on. Configure periodic tasks to fire at times calculated with respect to UTC; e.g., for 'Asia/Calcutta' (UTC+5:30), shift the time back by 5 hours 30 minutes, so 7:30 AM IST becomes crontab(hour=2, minute=0) in UTC.

  3. Create a celery.py as follows:

celery.py

import django
import os 

from celery import Celery
# set the default Django settings module for the 'celery' program.
from celery.schedules import crontab

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
# Setup django project
django.setup()

app = Celery('proj')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
app.conf.beat_schedule = {
    'test_task': {
        'task': 'test_task',
        # 02:00 UTC corresponds to 7:30 AM IST (see point 2 above)
        'schedule': crontab(hour=2, minute=0),
    }
}
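
Note that with CELERY_BEAT_SCHEDULER set to django_celery_beat's DatabaseScheduler (as in the settings above), beat reads its schedule from the database, and static entries declared in app.conf.beat_schedule are synced into the database when beat starts.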

The task itself can live in tasks.py under any app, as follows:

from celery import shared_task

@shared_task(name="test_task")
def test_add():
    print("Testing beat service")

Run the worker with celery worker -A proj -l info and beat with celery beat -A proj -l info, as separate processes, along with a broker such as Redis. This setup should work fine.
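
To confirm that asynchronous dispatch is no longer stuck, one can enqueue the task from a Django shell; a minimal check, assuming the test_task above and a configured result backend such as the django-db backend from the question's settings (the myapp import path is hypothetical):

# Run inside "python manage.py shell"; the "myapp" import path is an assumption.
from myapp.tasks import test_add

result = test_add.delay()   # returns immediately instead of blocking
result.get(timeout=10)      # returns None; "Testing beat service" appears in the worker log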



Source: https://stackoverflow.com/questions/48640549/celery-async-tasks-and-periodic-tasks-together
