Celery / Django Single Tasks being run multiple times


I'm facing an issue where I'm placing a task into the queue and it is being run multiple times. From the celery logs I can see that the same worker is running the task ...

1 Answer
  • 2021-01-13 16:18

    I don't have an exact answer for you, but there are a few things you should look into:

    • djcelery is deprecated, so if you are using a newer version of Celery there may be some sort of conflict.

    • If your input app is listed in INSTALLED_APPS, Celery's autodiscovery will find its tasks, so you don't need CELERY_IMPORTS = ("input.tasks", ). Having both may be the cause of your problem, since the tasks module can end up being imported under two different names and registered twice.

    • try giving your task an explicit name, e.g. @task(name='input.tasks.add'); Celery will then know it is the same task no matter how the module is imported (see the sketch after this list).
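
    As an illustration of that naming point, here is a minimal sketch; the body of the add task is hypothetical, and @shared_task is used so the module does not have to import an app instance:

    # input/tasks.py -- minimal sketch, hypothetical task body
    from celery import shared_task

    # An explicit, fully qualified name means Celery registers exactly one task,
    # regardless of whether the module is imported as "tasks" or "input.tasks".
    @shared_task(name='input.tasks.add')
    def add(x, y):
        return x + y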

    Looking at your settings it looks like you are either using an old version of Celery, or using an old-style configuration with a new version. In either case, make sure you have the newest version and try this configuration instead of what you have:

    BROKER_URL = 'amqp://<user>:<password>@localhost:5672/<vhost>'
    CELERY_RESULT_BACKEND = 'amqp'
    CELERY_ACCEPT_CONTENT = ['json']
    CELERY_TASK_SERIALIZER = 'json'
    CELERY_RESULT_SERIALIZER = 'json'
    

    You will also have to configure Celery differently:

    Get rid of djcelery stuff completely.

    Create proj/celery.py inside your Django project:

    from __future__ import absolute_import
    
    import os
    
    from celery import Celery
    
    from django.conf import settings
    
    # set the default Django settings module for the 'celery' program.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
    
    app = Celery('proj')
    
    # Using a string here means the worker will not have to
    # pickle the object when using Windows.
    app.config_from_object('django.conf:settings')
    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
    
    @app.task(bind=True)
    def debug_task(self):
        print('Request: {0!r}'.format(self.request))
    

    In your proj/__init__.py:

    from __future__ import absolute_import
    
    from proj.celery import app as celery_app
    

    Then, if your input app is a reusable app that is not part of your project, use the @shared_task decorator instead of @task, as in the sketch below.
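
    For example, a rough sketch of what a task in a reusable input app might look like (process_input is a hypothetical task name):

    # input/tasks.py -- hypothetical task in a reusable app
    from celery import shared_task

    @shared_task
    def process_input(data):
        # shared_task binds to whichever Celery app the host project creates,
        # so the reusable app never has to import proj.celery itself.
        return data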

    Then run celery:

    celery -A proj worker -l info
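
    To confirm the duplicate execution is gone, you can enqueue a task once and watch the worker log; this sketch assumes the hypothetical add task from earlier:

    # e.g. inside ./manage.py shell
    from input.tasks import add

    result = add.delay(2, 3)           # placed on the queue exactly once
    print(result.get(timeout=10))      # -> 5; the worker log should show a single execution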
    

    Hope it helps.
