django-celery

How can I minimise connections with django-celery when using CloudAMQP through dotcloud?

别来无恙 submitted on 2019-12-22 09:58:18
Question: After spending a few weeks getting django-celery-rabbitmq working on dotcloud, I have discovered that dotcloud no longer supports rabbitmq. Instead they recommend CloudAMQP. So I've set up CloudAMQP as per the tutorials:

http://docs.dotcloud.com/tutorials/python/django-celery/
http://docs.dotcloud.com/tutorials/more/cloudamqp/
http://www.cloudamqp.com/docs-dotcloud.html

And the service works fine. However, even when I do not have any processes running, CloudAMQP says there are 3 …
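CloudAMQP's shared plans cap concurrent connections, so the usual first lever is shrinking Celery's broker connection pool. A minimal sketch of the relevant Django settings, assuming the old-style django-celery setting names; the BROKER_URL value is a placeholder for whatever CloudAMQP/dotcloud provisions:

    # settings.py -- sketch; BROKER_URL is a placeholder
    BROKER_URL = "amqp://user:password@host.cloudamqp.com/vhost"
    BROKER_POOL_LIMIT = 1           # keep at most one pooled broker connection
    BROKER_CONNECTION_TIMEOUT = 30  # give up on dead sockets quickly
    CELERY_IGNORE_RESULT = True     # skip result traffic if results go unread

Bear in mind that each running worker process holds its own connection, so connections that persist while "nothing runs" often belong to a forgotten worker or to the AMQP result backend.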

Cannot start Celery Worker (Kombu.asynchronous.timer)

与世无争的帅哥 submitted on 2019-12-22 05:38:28
Question: I followed the first steps with Celery (Django) and am trying to run a heavy process in the background. I have the RabbitMQ server installed. However, when I try

    celery -A my_app worker -l info

it throws the following error:

    File "<frozen importlib._bootstrap>", line 994, in _gcd_import
    File "<frozen importlib._bootstrap>", line 971, in _find_and_load
    File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
    File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
    File " …
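The kombu.asynchronous.timer module only exists in Kombu 4.2 and later, where kombu.async was renamed because async became a reserved word in Python 3.7, so an import failure at this stage usually means a Celery/Kombu pair that is too old for the interpreter; upgrading both together (pip install -U celery kombu) is the usual first step. For reference, a minimal sketch of the conventional my_app/celery.py bootstrap that the -A my_app invocation expects ("my_app" is taken from the question; the rest is the standard layout):

    # my_app/celery.py -- minimal sketch of the standard Django/Celery bootstrap
    import os
    from celery import Celery

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_app.settings')

    app = Celery('my_app')
    app.config_from_object('django.conf:settings', namespace='CELERY')
    app.autodiscover_tasks()  # pick up tasks.py modules from installed apps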

Django, RabbitMQ, & Celery - why does Celery run old versions of my tasks after I update my Django code in development?

≯℡__Kan透↙ submitted on 2019-12-22 04:18:09
Question: So I have a Django app that occasionally sends a task to Celery for asynchronous execution. I've found that as I work on my code in development, the Django development server automatically detects when code has changed and restarts itself so I can see my changes. However, the RabbitMQ/Celery section of my app doesn't pick up these sorts of changes in development. If I change code that will later be run in a Celery task, Celery will still keep running the old version of …
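Celery workers import the task code once at startup and never reload it, so edited tasks only take effect after a worker restart. A common development workaround is to wrap the worker in watchdog's watchmedo helper so it restarts whenever a file changes; a sketch, assuming the project module is my_app and watchdog is installed (pip install watchdog):

    watchmedo auto-restart --directory=./ --pattern=*.py --recursive -- \
        celery -A my_app worker -l info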

Celery Task the difference between these two tasks below

狂风中的少年 submitted on 2019-12-21 19:52:51
Question: What's the difference between the two tasks below? The first one gives an error, the second one runs just fine. Both accept extra arguments and both are called in the same way:

    ProcessRequests.delay(batch)            # error: object.__new__() takes no parameters
    SendMessage.delay(message.pk, self.pk)  # works!!!!

Now, I have been made aware of what the error means, but my confusion is why one works and not the other. Tasks...

    1) class ProcessRequests(Task):
           name = "Request to …
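The usual explanation for this pair of symptoms lies in the arguments rather than the task classes: SendMessage receives plain primary keys, which serialize cleanly, whereas ProcessRequests receives a whole object whose class pickle cannot rebuild on the worker side, which is exactly when object.__new__() takes no parameters appears during deserialization. A sketch of the conventional fix, passing the pk and re-fetching inside the task (Batch is a hypothetical model standing in for whatever batch is):

    from celery import Task

    class ProcessRequests(Task):
        name = "ProcessRequests"  # placeholder name

        def run(self, batch_pk):
            # re-fetch inside the task instead of serializing the object
            from app.models import Batch  # hypothetical model import
            batch = Batch.objects.get(pk=batch_pk)
            # ... process the batch ...

    # called with the key, mirroring the SendMessage call that works:
    ProcessRequests.delay(batch.pk)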

How to route tasks to different queues with Celery and Django

Deadly submitted on 2019-12-21 16:57:53
Question: I am using the following stack: Python 3.6, Celery v4.2.1 (broker: RabbitMQ v3.6.0), Django v2.0.4. According to Celery's documentation, running scheduled tasks on different queues should be as easy as defining the corresponding queues for the tasks in CELERY_ROUTES; nonetheless, all tasks seem to be executed on Celery's default queue. This is the configuration in my_app/settings.py:

    CELERY_BROKER_URL = "amqp://guest:guest@localhost:5672//"
    CELERY_ROUTES = {
        'app1.tasks.*': {'queue': 'queue1'}, …
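Two things commonly bite here. First, when the app is configured with config_from_object(..., namespace='CELERY'), the Celery 4 name for this setting is CELERY_TASK_ROUTES (the namespaced form of task_routes); a bare CELERY_ROUTES is silently ignored, so every task falls through to the default celery queue. Second, a worker only consumes the queues it is told to consume. A sketch (the app2 entry just illustrates the pattern, since the original snippet is truncated after app1):

    # settings.py -- assumes app.config_from_object('django.conf:settings',
    # namespace='CELERY'), under which the setting is CELERY_TASK_ROUTES
    CELERY_TASK_ROUTES = {
        'app1.tasks.*': {'queue': 'queue1'},
        'app2.tasks.*': {'queue': 'queue2'},  # illustrative second route
    }

Then point one worker at each queue:

    celery -A my_app worker -Q queue1 -l info
    celery -A my_app worker -Q queue2 -l info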

Running multiple Django Celery websites on same server

十年热恋 submitted on 2019-12-21 04:51:25
Question: I'm running multiple Django/apache/wsgi websites on the same server using apache2 virtual servers, and I would like to use celery. But if I start celeryd for multiple websites, all the websites use the configuration (logs, DB, etc.) of the last celeryd instance I started. Is there a way to use multiple celeryd daemons (one for each website), or one celeryd for all of them? It seems like it should be doable, but I can't find out how. Answer 1: This problem was a big headache; I didn't notice @Crazyshezy …
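One broker can serve several independent sites as long as each Django project talks to its own RabbitMQ virtual host, so queues, messages and workers never mix. A sketch, with site1 as a placeholder vhost name:

    # run once as the RabbitMQ admin: an isolated vhost per site
    rabbitmqctl add_vhost site1
    rabbitmqctl set_permissions -p site1 myuser ".*" ".*" ".*"

    # settings.py of the first site -- point at its own vhost
    BROKER_URL = "amqp://myuser:password@localhost:5672/site1"

Each site then runs its own celeryd with its own settings module, log files and pid file, so starting one no longer clobbers the others.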

celery task clean-up with DB backend

懵懂的女人 submitted on 2019-12-21 03:57:07
Question: I'm trying to understand how and when tasks are cleaned up in celery. From looking at the task docs I see that:

    Old results will be cleaned automatically, based on the CELERY_TASK_RESULT_EXPIRES setting. By default this is set to expire after 1 day: if you have a very busy cluster you should lower this value.

But this quote is from the RabbitMQ Result Backend section, and I do not see any similar text in the Database Backend section. So my question is: is there a backend-agnostic approach I …
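For what it's worth, the expiry setting itself is backend-agnostic: CELERY_TASK_RESULT_EXPIRES (result_expires in new-style lowercase settings) defines the cutoff for every backend. The difference is the mechanism: for non-AMQP backends such as the database backend, the deletion is performed by the built-in celery.backend_cleanup task that celery beat schedules daily, so beat must be running for old rows to disappear. A sketch:

    # settings.py -- keep results for one hour instead of the default one day
    CELERY_TASK_RESULT_EXPIRES = 3600

    # the cleanup itself is triggered by the beat scheduler:
    #   celery -A my_app beat -l info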

How do I add authentication and endpoint to Django Celery Flower Monitoring?

瘦欲@ submitted on 2019-12-21 03:34:10
Question: I've been using flower locally and it seems easy enough to set up and run, but I can't see how I would set it up in a production environment. In particular, how can I add authentication, and how would I define a URL to access it? Answer 1: For a custom address, use the --address flag. For auth, use the --basic_auth flag. See below:

    # celery flower --help
    Usage: /usr/local/bin/celery [OPTIONS]

    Options:
      --address     run on the given address
      --auth        regexp of emails to grant access
      --basic_auth  colon …
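Putting the two flags together, a sketch of an invocation (the address, port and credentials are placeholders):

    celery flower --address=0.0.0.0 --port=5555 --basic_auth=user1:password1

Flower also accepts --url_prefix (e.g. --url_prefix=flower) to serve under a sub-path, which makes it straightforward to expose at a URL of your choosing by proxying it through the existing web server.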

How to programmatically generate celerybeat entries with celery and Django

喜夏-厌秋 submitted on 2019-12-20 09:37:49
Question: I am hoping to be able to programmatically generate celerybeat entries and resync celerybeat when entries are added. The docs here state:

    By default the entries are taken from the CELERYBEAT_SCHEDULE setting, but custom stores can also be used, like storing the entries in an SQL database.

So I am trying to figure out which classes I need to extend to be able to do this. I have been looking at the celery scheduler docs and the djcelery API docs, but the documentation on what some of these methods do is …
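If extending the scheduler classes turns out to be heavier than needed, note that djcelery already ships a database-backed scheduler whose entries are plain ORM rows, so generating entries programmatically reduces to creating model instances; the DatabaseScheduler also checks for changed entries as it runs, which covers the resync requirement. A sketch using djcelery's models (the entry name and task path are placeholders):

    from djcelery.models import IntervalSchedule, PeriodicTask

    # an every-10-seconds interval; period takes values like 'seconds', 'minutes'
    schedule, _ = IntervalSchedule.objects.get_or_create(every=10, period='seconds')

    PeriodicTask.objects.create(
        name='my-generated-entry',    # placeholder, must be unique
        task='app1.tasks.some_task',  # placeholder dotted task path
        interval=schedule,
    )

Run beat with the database scheduler so it picks these rows up:

    python manage.py celery beat -S djcelery.schedulers.DatabaseScheduler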