Question:
Do you know of or use any distributed job queue for Python? Can you share links or tools?
Answer 1:
In addition to multiprocessing, there's also the Celery project, particularly if you're using Django.
Answer 2:
Pyres is a Resque clone built in Python. Resque is used by GitHub as their message queue. Both use Redis as the queue backend and provide a web-based monitoring application.
http://binarydud.github.com/pyres/intro.html
Answer 3:
There's also "bucker" by Sylvain Hellegouarch, which you can find here:
- http://trac.defuze.org/wiki/bucker
It describes itself like this:
- bucker is a queue system that supports multiple storage backends for the queue (memcached and Amazon SQS for now) and is driven by XML messages sent over TCP connections between a client and the queue server.
Answer 4:
Take a look at beanstalkd.
Answer 5:
Have a look at redqueue. It's implemented in Python on the Tornado framework, speaks the memcached protocol, and can optionally persist to log files. It can also behave like beanstalkd, supporting the reserve/delete workflow within the memcached protocol.
Answer 6:
If you think that Celery is too heavy for your needs, then you might want to look at this simple distributed task queue:
- https://github.com/rojkov/taskqueue
- http://simpletaskqueue.readthedocs.org/
Answer 7:
It's a year late, but here's something I've hacked together to make a queue of Processes, executing only X of them at a time: http://github.com/goosemo/job_queue
Answer 8:
You probably want to look at multiprocessing's Queue. It's included in the standard library since Python 2.6; for earlier versions of Python, get the backport from PyPI.
- Standard library documentation: http://docs.python.org/library/multiprocessing.html
- On PyPI: http://pypi.python.org/pypi/multiprocessing
Answer 9:
There's also the Unix 'at' command.
For more info: man at
Source: https://stackoverflow.com/questions/1336489/job-queue-implementation-for-python