django-celery

Using django-nose and django-celery together — unit testing

Submitted by 风流意气都作罢 on 2019-12-13 12:56:50
Question: I have a Django project that uses django-nose, and I'd like to add django-celery to it. I use unit tests. Both django-nose and django-celery need a TEST_RUNNER setting in my settings.py file. Specifically, django-nose needs:

    TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'

and django-celery needs:

    TEST_RUNNER = 'djcelery.contrib.test_runner.CeleryTestSuiteRunner'

How should I handle this so that I can use both packages?

Answer 1: I found that the best way to handle this is to skip the Celery test
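Since only one TEST_RUNNER can be set, a common approach is to keep the nose runner and apply by hand the settings that django-celery's CeleryTestSuiteRunner would set, so tasks execute synchronously during tests. A hedged settings.py sketch (the two CELERY_* names are what the old CeleryTestSuiteRunner configures; verify against your django-celery version):

```python
# settings.py (test configuration) -- sketch, assuming old-style django-celery.
# Keep django-nose as the runner...
TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'

# ...and replicate what CeleryTestSuiteRunner does: run tasks eagerly,
# in-process, and let task exceptions surface in the test run.
CELERY_ALWAYS_EAGER = True
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
```

With these set, calling `my_task.delay(...)` inside a test runs the task body immediately in the test process, so no broker or worker is needed.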

Celery worker getting crashed on Heroku

Submitted by 放肆的年华 on 2019-12-13 03:25:20
Question: I am working on a Django project that I have pushed to Heroku; for background tasks I use Celery. Celery works fine locally, but on the Heroku server the celery worker keeps crashing. I have set CLOUDAMQP_URL properly in settings.py and configured the worker in my Procfile, but the worker still crashes.

Procfile:

    web: gunicorn my_django_app.wsgi --log-file -
    worker: python manage.py celery worker --loglevel=info

settings.py:

    ... # Celery
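Two frequent causes of this crash on Heroku are a broker URL that still points at localhost (there is no local RabbitMQ on a dyno) and exhausting CloudAMQP's small connection quota. A hedged settings.py sketch (the setting names are the pre-4.0, upper-case Celery names; BROKER_POOL_LIMIT = 1 is a commonly recommended value for CloudAMQP's free plan, not something stated in the question):

```python
# settings.py -- sketch: read the broker URL Heroku's CloudAMQP add-on
# provides, falling back to a local broker for development.
import os

BROKER_URL = os.environ.get('CLOUDAMQP_URL', 'amqp://guest:guest@localhost//')

# CloudAMQP's free tier allows only a handful of connections; a small
# pool keeps the worker from being refused connections and exiting.
BROKER_POOL_LIMIT = 1
BROKER_CONNECTION_TIMEOUT = 30
```

If the worker still dies, `heroku logs --ps worker` usually shows the actual traceback or an R14/OOM message.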

Django Celery - Missing something but I have no idea what? Have results but can't get them

Submitted by 一世执手 on 2019-12-13 03:18:56
Question: My task goes into Celery and gets results. I know this because I can do the following:

    >>> ts = TaskState.objects.all()[0]
    >>> ts
    Out[31]: <TaskState: SUCCESS apps.checklist.tasks.bulk_checklist_process(ec01461b-3431-478d-adfc-6d6cf162e9ad) ts:2012-07-20 14:35:41>
    >>> ts.state
    Out[32]: u'SUCCESS'
    >>> ts.result
    Out[33]: u'{\'info\': ["Great",]}'

But when I attempt to use the documented way to get the result, all hell breaks loose:

    >>> from celery.result import BaseAsyncResult
    >>> result =
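For comparison, the usual retrieval path goes through AsyncResult rather than BaseAsyncResult (BaseAsyncResult's constructor additionally expects a backend argument, which is an easy way for "all hell" to break loose). A hedged sketch, assuming a configured result backend and a real task id:

```python
# Sketch only: requires Celery configured with a result backend.
from celery.result import AsyncResult

def fetch_result(task_id):
    result = AsyncResult(task_id)   # no explicit backend argument needed
    if result.successful():
        return result.result        # the task's return value
    return None
```

Note also that django-celery's TaskState stores the result as a repr string (as the u'{...}' output above shows), so reading it from the monitor model is not the same as reading the deserialized value from the result backend.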

Django Celery - How to start a task with a delay of n seconds - countdown flag is ignored

Submitted by ▼魔方 西西 on 2019-12-12 14:11:21
Question: In my Django project I'm running some asynchronous tasks using Celery (docs), django-celery, and RabbitMQ as the broker. While this works in general, I have two problems with my setup: (a) task execution seems to be joined to my request thread, so the user's HTTP request appears to wait until the task has been executed; (b) task execution seems to ignore the countdown flag. For testing purposes I have set up a simple TestTask:

    from celery.task import Task
    from celery.registry import tasks
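Both symptoms together are characteristic of tasks running eagerly in the calling process, either because CELERY_ALWAYS_EAGER = True is set (eager mode executes inline and ignores countdown) or because the task is called directly instead of being scheduled. A hedged sketch of the distinction (TestTask is the question's class; a broker and a running worker are assumed):

```python
# Sketch: how a class-based task is invoked, and what each call does.
# Assumes TestTask from the question and CELERY_ALWAYS_EAGER = False.

TestTask().run()                             # runs inline: blocks the request
result = TestTask.delay()                    # enqueues for a worker, returns at once
result = TestTask.apply_async(countdown=10)  # enqueues; worker runs it >= 10 s later
# With CELERY_ALWAYS_EAGER = True, all three run inline and countdown is ignored.
```

So the first things to check are the CELERY_ALWAYS_EAGER setting and whether a worker process is actually consuming the queue.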

Stopping celery task gracefully

Submitted by 一曲冷凌霜 on 2019-12-12 09:52:01
Question: I'd like to quit a Celery task gracefully (i.e. not by calling revoke(celery_task_id, terminate=True)). I thought I'd send a message to the task that sets a flag, so that the task function can return. What's the best way to communicate with a task?

Answer 1: Use signals for this. Celery's revoke is the right choice; it uses SIGTERM by default, but you can specify another signal using the signal argument if you prefer. Just set a signal handler for it in your task (using the signal module) that

Celery: auto discovery does not find tasks module in app

Submitted by 岁酱吖の on 2019-12-12 08:23:09
Question: I have the following setup with a fresh install of celery and Django 1.4.

settings.py:

    import djcelery
    djcelery.setup_loader()
    BROKER_HOST = 'localhost'
    BROKER_PORT = 5672
    BROKER_USER = 'user'
    BROKER_PASSWORD = 'password'
    BROKER_VHOST = 'test'
    [...]
    INSTALLED_APPS = [
        'django.contrib.auth',
        'django.contrib.admin',
        'django.contrib.contenttypes',
        'django.contrib.sessions',
        'django.contrib.sites',
        'django.contrib.staticfiles',
        'djcelery',
        'south',
        'compressor',
        'testapp',
    ]

testapp/tasks.py:

    from
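With djcelery.setup_loader(), autodiscovery imports a tasks.py from every app in INSTALLED_APPS, so 'testapp' being listed is a good start; the usual failure modes are a tasks module that raises on import (autodiscovery then skips the app silently) or a task that never registers. A hedged sketch of an old-style testapp/tasks.py (add is a hypothetical task name):

```python
# testapp/tasks.py -- sketch for old-style django-celery autodiscovery.
from celery.task import task

@task(name='testapp.tasks.add')  # an explicit name avoids module-path ambiguity
def add(x, y):
    return x + y
```

Two quick checks: run `python manage.py shell` and `import testapp.tasks` to surface any ImportError, and start the worker with `python manage.py celeryd -l info` and confirm the task shows up in the registered-tasks banner.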

rabbitmq-server fails to start after hostname is changed for the first time

Submitted by 妖精的绣舞 on 2019-12-12 07:28:54
Question: I am using django-celery in my Django project. Yesterday I changed my computer's hostname (I am using Ubuntu 12.04 and edited /etc/hostname), and after the next restart django-celery was failing with the error:

    Consumer: Connection Error: [Errno 111] Connection refused. Trying again in 4 seconds...

After some research on this error I found that changing my hostname caused it. My RabbitMQ startup log (/var/log/rabbitmq/startup_log) shows:

    Activating RabbitMQ plugins
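The underlying issue is that RabbitMQ derives its node name from the machine's hostname (rabbit@<hostname>) and keeps its Mnesia database under that node name; after a hostname change the server starts as a new node and cannot use the old data. Two common hedged fixes (replace old-hostname with your previous hostname; paths are the Debian/Ubuntu defaults):

```shell
# Option 1: pin the node name to the old hostname so the existing data is reused.
echo 'RABBITMQ_NODENAME=rabbit@old-hostname' | sudo tee -a /etc/rabbitmq/rabbitmq-env.conf
sudo service rabbitmq-server restart

# Option 2: discard the old node state and start fresh.
# WARNING: this deletes all queues, users, and vhosts.
sudo service rabbitmq-server stop
sudo rm -rf /var/lib/rabbitmq/mnesia
sudo service rabbitmq-server start
```

After option 2 you would need to recreate the broker user and vhost that django-celery's BROKER_* settings expect.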

Django matching query does not exist after object save in Celery task

Submitted by 不问归期 on 2019-12-12 07:16:02
Question: I have the following code:

    @task()
    def handle_upload(title, temp_file, user_id):
        ...
        photo.save()
        # if I insert "photo2 = Photo.objects.get(pk=photo.pk)" here, it works,
        # including the view function
        return photo.pk

    # view function
    def upload_status(request):
        task_id = request.POST['task_id']
        async_result = AsyncResult(task_id)
        photo_id = async_result.get()
        if async_result.successful():
            photo = Photo.objects.get(pk=photo_id)

I use an AJAX request to check for the uploaded file, but after the
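A likely explanation is a transaction-visibility race: the worker returns photo.pk before its INSERT is committed (or visible to the web process's database connection), so the view's Photo.objects.get raises DoesNotExist even though the row is on its way. A generic, hedged polling helper illustrates the workaround (Photo and photo_id are the question's names and are not defined here):

```python
import time

def get_with_retry(fetch, attempts=5, delay=0.2):
    """Call `fetch` until it stops raising, waiting `delay` seconds
    between tries; re-raise the last error if every attempt fails.

    In the question's view this would be used roughly as:
        photo = get_with_retry(lambda: Photo.objects.get(pk=photo_id))
    """
    last_exc = None
    for _ in range(attempts):
        try:
            return fetch()
        except Exception as exc:  # Photo.DoesNotExist in the real code
            last_exc = exc
            time.sleep(delay)
    raise last_exc
```

The cleaner fix, where the framework versions allow it, is to ensure the task's transaction is committed before the pk is returned, rather than retrying on the read side.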

module can't be installed in Django virtual environment

Submitted by 前提是你 on 2019-12-11 21:52:18
Question: I ran pip install django-celery and pip3 install django-celery in PyCharm. After that I tried import djcelery, but PyCharm reports no module named djcelery. Then I ran pip list and can see django-celery 3.2.2 in the list. But when I go to the virtual environment path myenv/lib/site-packages, where I can see all the modules and apps I have installed (such as django-pure-pagination), I can't find django-celery there. Does anyone have any idea how to fix this?

Answer 1: Seems like you've installed
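The symptoms (pip list sees the package, site-packages of the venv does not) point to pip and pip3 belonging to a different interpreter than the myenv environment PyCharm is using. A hedged diagnostic sequence (POSIX shell; myenv is the question's venv path):

```shell
# Activate the virtualenv first, so "pip" and "python" resolve inside it.
source myenv/bin/activate

# Confirm pip belongs to the venv, not the system Python:
which pip   # should print .../myenv/bin/pip
pip -V      # the reported path should mention myenv

# Installing via the interpreter guarantees the right environment:
python -m pip install django-celery

# Verify the package is now importable from this interpreter:
python -c "import djcelery; print(djcelery.__file__)"
```

In PyCharm the equivalent check is Settings > Project Interpreter: make sure it points at myenv and install the package through that interpreter's package list.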

Why does supervisor keep switching the celery worker from RUNNING to STARTING?

Submitted by 删除回忆录丶 on 2019-12-11 17:43:52
Question: Background: the system is CentOS 7, which ships with Python 2.x, and the machine has 1 GB of memory and a single core. I installed Python 3.x, which I can invoke as python3. The django-celery project runs in a Python 3.x virtualenv, and I have it working with nginx, uWSGI, and MariaDB; at least I think so, since no errors occurred. I am trying to use supervisor to control the django-celery worker, like below:

    command=env/bin/python project/manage.py celeryd -l INFO -n worker_%(process_num)s
    numprocs=4
    process_name=projects_worker_%(process
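A RUNNING-to-STARTING loop means the process keeps exiting and supervisor keeps restarting it; on a 1 GB, single-core box, four worker processes plus uWSGI and MariaDB make the OOM killer a plausible culprit, and a too-low startsecs can also misreport a slow-starting worker. A hedged supervisord fragment (command and process_name come from the question; the paths and the other values are illustrative assumptions):

```ini
[program:projects_worker]
command=env/bin/python project/manage.py celeryd -l INFO -n worker_%(process_num)s
process_name=projects_worker_%(process_num)s
numprocs=1                     ; start with one worker on a 1 GB / 1-core box
directory=/path/to/project    ; hypothetical: set to your project root
startsecs=10                  ; must stay up this long to count as RUNNING
stopwaitsecs=60               ; give the worker time to finish its tasks
autostart=true
autorestart=true
redirect_stderr=true
stdout_logfile=/var/log/projects_worker.log
```

With logging redirected to one file, the worker's own traceback (or a kernel OOM message in /var/log/messages) should show why each restart happens.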