I need to update the Solr index on a schedule with the command:
(env)$ ./manage.py update_index
I've looked through the Celery docs and found info on scheduling, but I haven't been able to find a way to run a Django management command both on a schedule and inside a virtualenv. Would this be better run as a normal cron job? And if so, how would I run it inside the virtualenv? Does anyone have experience with this?
Thanks for the help!
To run your command periodically from a cron job, wrap the command in a bash script that activates the virtualenv. For example, here is what we do to run manage.py commands:
django_cmd.sh:
#!/bin/bash
# Usage: django_cmd.sh <management command> <settings module>
cd /var/www/website/ || exit 1
source venv/bin/activate
/var/www/website/manage.py "$1" --settings="$2"
Crontab:
MAILTO=webmaster@website.com
SETTINGSMODULE=website.settings_prod
# Rebuild the search index at five past every hour; discard stdout so only errors are mailed.
5 * * * * /var/www/website/django_cmd.sh update_index $SETTINGSMODULE > /dev/null
# Update accounts daily at 10:00.
0 10 * * * /var/www/website/django_cmd.sh update_accounts $SETTINGSMODULE
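An alternative sketch, using the same hypothetical paths as above: a virtualenv's interpreter can be invoked directly without activation, so you can skip the wrapper script and call venv/bin/python straight from the crontab:

5 * * * * cd /var/www/website && venv/bin/python manage.py update_index --settings=website.settings_prod > /dev/null

The wrapper script is still handier once several commands share the same setup; the one-liner is simpler for a single job.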
Django Celery task scheduling. Project structure:
[appname]/
├── [appname]/
│   ├── __init__.py
│   ├── settings.py
│   ├── urls.py
│   ├── celery.py
│   └── wsgi.py
├── [project1]/
│   ├── __init__.py
│   └── tasks.py
│
└── manage.py
Add the configuration below to your settings.py file:
STATIC_URL = '/static/'

# Celery configuration (old-style names, read directly from the Django settings)
from celery.schedules import crontab  # only needed if you use crontab() schedules

BROKER_URL = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_TIMEZONE = 'UTC'
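Note: these are the Celery 3.x-style setting names, matching the config_from_object('django.conf:settings') call in celery.py below. If you are on Celery 4 or later, an equivalent sketch is to namespace the settings instead:

# celery.py (Celery 4+): load only settings prefixed with CELERY_
app.config_from_object('django.conf:settings', namespace='CELERY')

# settings.py then uses the prefixed names, e.g.
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'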
celery.py holds the Celery app and the beat schedule:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'appname.settings')

app = Celery('appname')

# Pull Celery configuration from the Django settings module.
app.config_from_object('django.conf:settings')

# Discover tasks.py modules in all installed apps.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

# Scheduler
app.conf.beat_schedule = {
    'cleanup-every-30-seconds': {
        'task': 'project1.tasks.cleanup',
        'schedule': 30.0,
        'args': (),
    },
}
app.conf.timezone = 'UTC'
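The settings above import crontab but never use it. As a sketch of why you might: a crontab() schedule expresses cron-style timing instead of a fixed interval. The task name below, project1.tasks.update_search_index, is hypothetical (a matching task is sketched further down):

from celery.schedules import crontab

app.conf.beat_schedule = {
    # Rebuild the search index at five past every hour, like the cron job above.
    'update-index-hourly': {
        'task': 'project1.tasks.update_search_index',
        'schedule': crontab(minute=5),
    },
}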
__init__.py (in the inner appname package):
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']
tasks.py in project1:
from celery import shared_task
from django.core import management

@shared_task
def cleanup():
    """Clean up expired sessions using a Django management command."""
    try:
        print("in celery module")
        management.call_command("clearsessions", verbosity=0)
        # PUT MANAGEMENT COMMAND HERE
        return "success"
    except Exception as e:
        print(e)
The task will run every 30 seconds.
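To run the update_index command from the question on a schedule instead, a minimal sketch along the same lines (assuming update_index is the Haystack/Solr command the question refers to):

from celery import shared_task
from django.core import management

@shared_task
def update_search_index():
    # Hypothetical task wrapping the question's ./manage.py update_index command.
    management.call_command("update_index")

Point a beat_schedule entry at 'project1.tasks.update_search_index' to schedule it, as in the crontab() sketch above.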
Requirements for Windows:
- the Redis server must be running
- the Celery worker and Celery beat must both be running; start each with one of the commands below in a separate terminal:

celery -A appname worker -l info
celery -A appname beat -l info
Requirements for Linux:
- the Redis server must be running
- the Celery worker and beat must both be running; on Linux, beat can be embedded in the worker process with -B:

celery -A appname worker -l info -B
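As a quick sanity check (assuming the same appname), you can ask the running worker to respond:

celery -A appname inspect ping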
@tzenderman, please let me know if I missed something. For me this is working fine.
I actually found a nice way of doing this using Fabric + Celery and am working on it now:

In app/tasks.py, create a Fabric function with the manage.py commands you need, then decorate it with @periodic_task, add it to your Celery schedule, and it should be good to go.
UPDATE: I wasn't able to actually use Fabric + Celery, because importing Fabric in the module caused it to be recognized as a fabfile, and the Celery calls in the file then didn't work.
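For completeness, a sketch of the same idea without Fabric, using the old Celery 3.x @periodic_task decorator this thread dates from (it was removed in Celery 5; on modern versions use beat_schedule as in the answer above):

from celery.task import periodic_task
from celery.schedules import crontab
from django.core import management

@periodic_task(run_every=crontab(minute=5))
def update_search_index():
    # Hypothetical task: rebuild the Solr index at five past every hour.
    management.call_command("update_index")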
Source: https://stackoverflow.com/questions/17664166/django-celery-scheduling-a-manage-py-command