django-celery

Celery with RabbitMQ creates multiple result queues

孤者浪人 submitted on 2021-02-07 05:47:06
Question: I have installed Celery with RabbitMQ. The problem is that for every result that is returned, Celery creates a queue in RabbitMQ named after the task's ID, on the celeryresults exchange. I still want to have results, but on ONE queue. My celeryconfig:

    from datetime import timedelta

    BROKER_URL = 'amqp://'
    CELERY_RESULT_BACKEND = 'amqp'
    #CELERY_IGNORE_RESULT = True
    CELERY_TASK_SERIALIZER = 'json'
    CELERY_RESULT_SERIALIZER = 'json'
    CELERY_ACCEPT_CONTENT = ['json', 'application/json']
    CELERY_TIMEZONE = …
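
The excerpt cuts off before any answer, but a commonly cited fix for this symptom is to stop using the legacy amqp result backend, which creates one transient queue per task ID, and switch to the rpc backend, which returns all results over a single reply queue per client. A minimal sketch of the changed settings, assuming a Celery version where rpc:// is available (3.1+); this is not the asker's confirmed solution:

    # celeryconfig.py -- sketch, not the asker's full config.
    # 'rpc://' delivers results as AMQP messages over one reply queue
    # per client process instead of one queue per task ID.
    BROKER_URL = 'amqp://'
    CELERY_RESULT_BACKEND = 'rpc://'
    CELERY_TASK_SERIALIZER = 'json'
    CELERY_RESULT_SERIALIZER = 'json'
    CELERY_ACCEPT_CONTENT = ['json']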

Django Celery Beat admin updating Cron Schedule Periodic task not taking effect

半世苍凉 submitted on 2021-02-06 20:51:28
Question: I'm running a site using Django 1.10, RabbitMQ, and Celery 4 on CentOS 7. My Celery Beat and Celery Worker instances are controlled by supervisor, and I'm using the django-celery database scheduler. I've scheduled a cron-style task using the cron scheduler in the Django admin. When I start the Celery Beat and worker instances, the job fires as expected. But if I change the schedule time in the Django admin, the changes are not picked up unless I restart the celery-beat instance. Is there something I am …
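
The excerpt is truncated before an answer, but the usual explanation is that beat only re-reads its schedule if the scheduler supports it: the default PersistentScheduler loads the schedule at startup, while django-celery-beat's DatabaseScheduler polls the database and picks up admin edits without a restart. A sketch of the relevant setting, assuming django-celery-beat is the database scheduler in use here:

    # settings.py -- sketch; requires django_celery_beat in
    # INSTALLED_APPS with its migrations applied. DatabaseScheduler
    # re-checks the database, so admin edits take effect live.
    CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'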

How to detect failure and auto-restart a Celery worker

只愿长相守 submitted on 2021-02-05 08:35:59
Question: I use Celery and Celery Beat in my Django-powered website. The server OS is Ubuntu 16.04. Using Celery Beat, a job is done by a Celery worker every 10 minutes. Sometimes the worker shuts down without any useful log messages or errors. So, I want to find a way to detect the status (on/off) of the Celery worker (not Beat), and if it has stopped, restart it automatically. How can I do that? Thanks. Answer 1: In production, you should run Celery, Beat, your app server etc. as daemons [1] using …
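
The answer is cut off, but it points at daemonization: under a process supervisor, restarts come for free. For ad-hoc detection, Celery's remote control API can be polled. A sketch, assuming the app is importable as myproject.celery:app and the worker runs under supervisord with a hypothetical program name celery-worker:

    # monitor.py -- sketch. app.control.ping() broadcasts a ping and
    # collects replies; an empty list means no worker answered in time.
    import subprocess
    from myproject.celery import app

    def worker_alive(timeout=2.0):
        return bool(app.control.ping(timeout=timeout))

    if not worker_alive():
        # Let the process supervisor do the actual restart.
        subprocess.call(['supervisorctl', 'restart', 'celery-worker'])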

How to check if a Celery task is running or not from Django templates

故事扮演 submitted on 2021-01-29 10:18:04
Question: I need some help implementing Django Celery properly. Q1: Set a custom ID for a Celery task:

    @shared_task
    def lazy_post_link_1_task(post_url, current_user, no_of_lazy_bot,
                              no_of_comment_for_lazy_bot, lazy_bot_time_interval):
        instagram_bot = InstagramBot()
        lazy_bots = InstagramModel.objects.filter(
            Q(bot_type='lazy_bot') & Q(running_status='idle'))[:int(no_of_lazy_bot)]
        for bot in lazy_bots:
            lazy_bot_filter_comments = Comments.objects.all().exclude(
                botscomment__bot_id=bot.id)[:int(no_of_comment…
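
For Q1 specifically, apply_async() accepts a task_id keyword, so the caller can choose the ID instead of letting Celery generate a UUID; the same ID can then be handed to AsyncResult in a view to report state to a template. A sketch with a hypothetical ID scheme:

    # Calling side -- task_id is a real apply_async() keyword.
    from celery.result import AsyncResult

    custom_id = 'lazy-post-%d' % post.pk          # hypothetical ID scheme
    lazy_post_link_1_task.apply_async(
        args=[post_url, current_user, no_of_lazy_bot,
              no_of_comment_for_lazy_bot, lazy_bot_time_interval],
        task_id=custom_id,
    )

    # View side -- state is 'PENDING', 'SUCCESS', 'FAILURE', etc.;
    # 'STARTED' is only reported if task_track_started is enabled.
    state = AsyncResult(custom_id).state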

Is it possible to use django-celery-beat with django-tenant?

穿精又带淫゛_ submitted on 2021-01-28 21:09:36
Question: I am using Celery for scheduling tasks. So far everything was fine, including when hosted on AWS. However, I decided to transform my single application into a multi-tenant one, using django-tenant. That way, I can create the subdomains perfectly: ./manage.py create_tenant. However, when running the command celery -A myproject worker -B, despite it not showing me any error, it seems that it cannot run for the created schema (tested with only one schema created). I tried to specify the schema, using python manage…
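
The excerpt stops mid-command, but a common pattern with django-tenants (not from this thread's missing answer, so treat it as an assumption) is to schedule a single task and have it iterate over the tenants itself, switching schemas with schema_context:

    # tasks.py -- sketch. schema_context() activates the given
    # PostgreSQL schema for the duration of the with-block.
    from celery import shared_task
    from django_tenants.utils import get_tenant_model, schema_context

    @shared_task
    def run_for_all_tenants():
        for tenant in get_tenant_model().objects.exclude(schema_name='public'):
            with schema_context(tenant.schema_name):
                do_tenant_work()  # hypothetical per-tenant job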

Django Celery Group tasks executing only the first task

余生颓废 submitted on 2021-01-27 08:53:31
Question: I have Celery with different tasks and one queue. These tasks are, however, not all called at once; depending on the request from the user, the tasks called vary. So I have written code that identifies which tasks to run, creates subtasks with parameters, and builds a list of them. I add this list to a group and use apply_async() on the group to run these tasks. The code for calling the tasks is as follows:

    tasks_list = []
    for provider_name in params['providers']:
        provider = Provider…
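
For reference, the canonical shape of that pattern is to build a list of signatures with .s() and hand it to group(); calling the task function directly instead of creating a signature is a frequent cause of only one task appearing to run. A sketch with a hypothetical task name:

    from celery import group

    # Each .s(...) is a signature: the task plus its arguments,
    # left unevaluated until the group is dispatched.
    tasks_list = [fetch_provider.s(provider_name)   # hypothetical task
                  for provider_name in params['providers']]
    group(tasks_list).apply_async()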

Can't pickle <type 'function'>: attribute lookup __builtin__.function failed

落爺英雄遲暮 submitted on 2021-01-27 05:19:55
Question: I'm getting the error below. The error only happens when I add .delay to the process_upload call; otherwise it works without a problem. Could someone explain what this error is, why it's happening, and any recommendations to resolve it?

Error:

    PicklingError at /contacts/upload/configurator/47/
    Can't pickle <type 'function'>: attribute lookup __builtin__.function failed

This is the view:

    if request.method == 'POST':
        form = ConfiguratorForm(data=request.POST)
        # Send import to task.
        process_upload…
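
The traceback itself names the failure: with the pickle serializer, every argument passed to .delay() must be picklable, and plain functions (and bound form or request objects) are not. The usual fix is to pass only primitive, serializable values such as a primary key. A sketch with hypothetical names:

    # views.py -- sketch. Save first, then hand the task an ID it can
    # re-fetch, rather than a form, function, or model instance.
    if request.method == 'POST':
        form = ConfiguratorForm(data=request.POST)
        if form.is_valid():
            upload = form.save()
            process_upload.delay(upload.pk)  # pk is trivially serializable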