Question
I'm using python-rq to manage Redis-based jobs and I want to determine which jobs are currently being processed by my workers.
python-rq offers a get_current_job function to find 'the current job' for a connection but:
- I can't get this to work, and
- I really want a list of all the jobs currently being processed by all workers on all queues for this connection, rather than a single job from one queue.
Here is my code (which always returns None):
import os
from urllib import parse

from redis import Redis
from rq import Queue, get_current_job

redis_url = os.getenv('REDIS_FOO')

# Parse the Redis URL into its components.
parse.uses_netloc.append('redis')
url = parse.urlparse(redis_url)

conn = Redis(host=url.hostname, port=url.port, db=0, password=url.password)
q = Queue(connection=conn)

job = get_current_job(connection=conn)  # always None
Does anyone have any ideas on how to get the above code to work and, more importantly, on how to get a list of all current jobs from all workers on all queues for this connection?
Answer 1:
I looked into the source code, and I figure this is what you need.
One thing to notice first: the number of running jobs is at most the number of RQ workers, because each worker processes only one job at a time.
As for why your snippet always returns None: get_current_job is meant to be called from inside a job function while a worker is executing it, not from the process that enqueues the jobs.
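A minimal sketch of that in-job usage, assuming the job function lives in the jobs module that your worker imports (the function body here is illustrative):

import requests
from rq import get_current_job

def count_words_at_url(url):
    # Inside a worker process, get_current_job() returns the Job being executed.
    job = get_current_job()
    print('Running as job %s from queue %s' % (job.id, job.origin))
    resp = requests.get(url)
    return len(resp.text.split())

For the list of jobs currently running on a queue, use the StartedJobRegistry: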
from redis import Redis
from rq import Queue
from rq.registry import StartedJobRegistry
from jobs import count_words_at_url

redis_conn = Redis()
q = Queue('default', connection=redis_conn)

# Enqueue plenty of jobs so some are running while we inspect the registry.
for i in range(5000):
    job = q.enqueue(count_words_at_url, 'http://nvie.com', ttl=43)

registry = StartedJobRegistry('default', connection=redis_conn)
running_job_ids = registry.get_job_ids()  # IDs of jobs currently being executed
expired_job_ids = registry.get_expired_job_ids()  # started jobs whose TTL has passed
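That covers a single queue. To get the running jobs for all workers on all queues on the connection, here is a sketch along the same lines (Queue.all, Worker.all, Job.fetch, and worker.get_current_job are rq's API; the loops themselves are mine):

from redis import Redis
from rq import Queue, Worker
from rq.job import Job
from rq.registry import StartedJobRegistry

conn = Redis()

# Option 1: collect the started-job IDs from every queue's registry.
running_jobs = []
for queue in Queue.all(connection=conn):
    registry = StartedJobRegistry(queue.name, connection=conn)
    for job_id in registry.get_job_ids():
        running_jobs.append(Job.fetch(job_id, connection=conn))

# Option 2: ask each worker which job it is executing right now.
for worker in Worker.all(connection=conn):
    job = worker.get_current_job()  # None when the worker is idle
    if job is not None:
        print(worker.name, job.id, job.func_name)

Both options reflect the point above: each worker contributes at most one running job.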
Source: https://stackoverflow.com/questions/45667458/get-all-current-jobs-from-python-rq