How to make worker threads quit after work is finished in a multithreaded producer-consumer pattern?

时光说笑 2021-02-05 12:56

I am trying to implement a multithreaded producer-consumer pattern using Queue.Queue in Python 2.7. I am trying to figure out how to make the consumers, i.e. the worker threads, quit once all the work is finished.

4 Answers
  •  迷失自我
    2021-02-05 13:30

    Can the method of sending a single exit indicator for all threads (as explained in the second comment of https://stackoverflow.com/a/19369877/1175080 by Martin James) even work?

    As you have noticed, it can't work, at least not with the code you have: passing the indicator along means the last thread puts one more item back into the queue, so you end up waiting on a queue that will never be empty. The sketch below shows the hang.
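
    To make that concrete, here is a stripped-down sketch of the single-indicator pattern (the worker body and the item counts are just illustrative): every put() increments the queue's count of unfinished tasks and only task_done() decrements it, so the -1 that the last surviving worker puts back is never accounted for and q.join() blocks forever.

    import Queue
    import threading

    def worker(q):
        while True:
            data = q.get()
            q.task_done()  # Accounts for the item this worker just took.
            if data == -1:
                # Pass the indicator on; no worker is left to ever task_done() this one.
                q.put(-1)
                break

    q = Queue.Queue()
    for i in range(3):
        threading.Thread(target=worker, args=(q,)).start()
    for i in range(5):
        q.put(i)
    q.put(-1)   # Single exit indicator shared by all workers.
    q.join()    # Blocks forever: one put() has no matching task_done().
    print 'this line is never reached'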

    If the answer to the previous question is "No", is there a way to solve the problem in a way that I don't have to send a separate exit indicator for each worker thread?

    You can join the threads instead of the queue:

    import Queue
    import threading
    import time

    def worker(n, q):
        # n - Worker ID
        # q - Queue from which to receive data
        while True:
            data = q.get()
            print 'worker', n, 'got', data
            time.sleep(1)  # Simulate noticeable data processing time
            q.task_done()
            if data == -1: # -1 is used to indicate that the worker should stop
                # Requeue the exit indicator.
                q.put(-1)
                # Commit suicide.
                print 'worker', n, 'is exiting'
                break
    
    def master():
        # master() sends data to worker() via q.
        q = Queue.Queue()
    
        # Create 3 workers.
        threads = [threading.Thread(target=worker, args=(i, q)) for i in range(3)]
        for t in threads:
            t.start()
        # Send 10 items to work on.
        for i in range(10):
            q.put(i)
            time.sleep(0.5)
    
        # Send an exit indicator for all threads to consume.
        q.put(-1)
    
        print 'waiting for workers to finish ...'
        for t in threads:
            t.join()
        print 'done'
    
    master()
    

    As the Queue documentation explains, a non-blocking get raises Queue.Empty as soon as the queue has no items left, so if you already know all the data to process up front you can fill the queue first and then spawn the threads:

    import Queue
    import threading
    import time
    
    def worker(n, q):
        # n - Worker ID
        # q - Queue from which to receive data
        while True:
            try:
                data = q.get(block=False)  # Raises Queue.Empty immediately if nothing is left.
                print 'worker', n, 'got', data
                time.sleep(1)  # Simulate noticeable data processing time
                q.task_done()
            except Queue.Empty:
                break
    
    
    def master():
        # master() sends data to worker() via q.
        q = Queue.Queue()
    
        # Send 10 items to work on.
        for i in range(10):
            q.put(i)
    
        # Create 3 workers.
        for i in range(3):
            t = threading.Thread(target=worker, args=(i, q))
            t.start()
    
        print 'waiting for workers to finish ...'
        q.join()
        print 'done'
    
    master()
    

    Here you have a live example
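
    One thing worth noting about this second approach: get(block=False) only works because every item is queued before the workers start. If the producer kept feeding the queue while the workers were running, a worker that briefly found the queue empty would quit too early. A rough variant of the worker loop, assuming an idle timeout (5 seconds here, purely illustrative) is an acceptable way to decide that no more work is coming:

    import Queue
    import time

    def worker(n, q):
        # n - Worker ID
        # q - Queue from which to receive data
        while True:
            try:
                # Block for up to 5 seconds waiting for new work before giving up.
                data = q.get(block=True, timeout=5)
            except Queue.Empty:
                print 'worker', n, 'saw no work for 5 seconds, exiting'
                break
            print 'worker', n, 'got', data
            time.sleep(1)  # Simulate noticeable data processing time
            q.task_done()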
