How to efficiently do many tasks a “little later” in Python?

Backend · Open · 10 answers · 902 views

心在旅途 · 2021-01-30 11:59

I have a process that needs to perform a bunch of actions "later" (usually after 10-60 seconds). The problem is that there can be a lot of those "later" actions (thousands), so using a …
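A minimal sketch of the idea behind the question, using the modern stdlib: thousands of delayed callbacks can share one `asyncio` event loop, where each pending action costs a heap entry rather than a thread (the function and delays below are illustrative, not from the question):

```python
import asyncio

async def run_delayed(n, base_delay=0.01):
    """Schedule n callbacks to fire 'a little later' on one event loop.

    Each pending callback is just a timer entry inside the loop,
    not a dedicated thread, so n can be large.
    """
    results = []
    loop = asyncio.get_running_loop()
    for i in range(n):
        # Spread the due times slightly so not everything fires at once.
        loop.call_later(base_delay + (i % 10) * 0.001, results.append, i)
    await asyncio.sleep(base_delay + 0.05)  # let all callbacks fire
    return results

results = asyncio.run(run_delayed(1000))
print(len(results))  # prints 1000
```

Note that `asyncio` postdates this question; at the time, eventlet, gevent, or Twisted filled the same role.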

10 Answers
  •  余生分开走
    2021-01-30 12:30

    You wrote:

    one of the problem is that the process uses zeromq for communication so I would need some integration (eventlet already has it)

    Seems like your choice will be heavily influenced by these details, which are a bit unclear: how zeromq is being used for communication, how many resources the integration will require, and what your requirements and available resources are.


    There's a project called django-ztask which uses zeromq and provides a task decorator similar to Celery's. However, it is (obviously) Django-specific, so it may not be suitable in your case. I haven't used it; I prefer Celery myself.

    I've been using Celery for a couple of projects (these are hosted at the ep.io PaaS, which provides an easy way to use it).

    Celery looks like quite a flexible solution, allowing delayed tasks, callbacks, task expiration and retrying, limiting the task execution rate, etc. It can be used with Redis, Beanstalk, CouchDB, MongoDB, or an SQL database.

    Example code (definition of task and asynchronous execution after a delay):

    from celery import Celery  # older Celery versions used: from celery.decorators import task
    
    app = Celery("tasks", broker="redis://localhost")  # broker URL is just an example
    
    @app.task
    def my_task(arg1, arg2):
        pass  # Do something with arg1, arg2
    
    result = my_task.apply_async(
        args=[sth1, sth2],  # Arguments that will be passed to `my_task()`.
        countdown=3,        # Seconds to wait before the task may be executed.
    )
    

    See also the relevant section in the Celery docs.
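    For comparison, if a broker like Celery's is not an option and everything must stay in-process, the same "many actions a little later" pattern can be sketched with the stdlib alone: one worker thread drains a heap of due entries, so pending actions cost memory rather than threads (the `DelayedScheduler` class below is hypothetical, not from any library):

    ```python
    import heapq
    import itertools
    import threading
    import time

    class DelayedScheduler:
        """One worker thread drains a heap of (due_time, seq, fn, args)
        entries, so thousands of pending actions share a single thread."""

        def __init__(self):
            self._heap = []
            self._seq = itertools.count()  # tie-breaker for equal due times
            self._cv = threading.Condition()
            self._stopped = False
            self._thread = threading.Thread(target=self._run, daemon=True)
            self._thread.start()

        def call_later(self, delay, fn, *args):
            """Run fn(*args) roughly `delay` seconds from now."""
            with self._cv:
                entry = (time.monotonic() + delay, next(self._seq), fn, args)
                heapq.heappush(self._heap, entry)
                self._cv.notify()

        def _run(self):
            while True:
                with self._cv:
                    # Sleep until the earliest entry is due (or we are stopped).
                    while not self._stopped and (
                        not self._heap or self._heap[0][0] > time.monotonic()
                    ):
                        timeout = (
                            max(self._heap[0][0] - time.monotonic(), 0.0)
                            if self._heap else None
                        )
                        self._cv.wait(timeout)
                    if self._stopped:
                        return
                    _, _, fn, args = heapq.heappop(self._heap)
                fn(*args)  # run the action outside the lock

        def stop(self):
            with self._cv:
                self._stopped = True
                self._cv.notify()
            self._thread.join()

    scheduler = DelayedScheduler()
    results = []
    for i in range(100):
        scheduler.call_later(0.01, results.append, i)
    time.sleep(0.3)
    scheduler.stop()
    print(len(results))  # prints 100
    ```

    This trades Celery's persistence and distribution for simplicity: if the process dies, all pending actions are lost.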
