Producer Consumer using semaphores and mutexes in Python


Question


I'm trying to understand how to implement a Queue with a bounded buffer size that can be used by multiple producers and consumers using Python Semaphores. Here's my implementation:

from threading import Lock, Semaphore

class Q:

    def __init__(self, size):
        self.buff = [None]*size
        self.end = 0
        self.start = 0
        self.size = size
        self.end_lock = Lock()  # protect end from race across multiple producers
        self.start_lock = Lock()  # protect start from race across multiple consumers
        self.open = Semaphore(size)  # block till there's space to produce
        self.closed = Semaphore(size) # block till there's item to consume
        for _ in range(size):  # initialize with all closed acquired so that consumer is blocked
            self.closed.acquire()

    def put(self, val):
        self.open.acquire()
        with self.end_lock:
            self.buff[self.end] = val
            self.end = (self.end+1)%self.size
        self.closed.release()

    def get(self):
        self.closed.acquire()
        with self.start_lock:
            val = self.buff[(self.start)%self.size]
            self.start = (self.start+1)%self.size
        self.open.release()
        return val

Is this implementation bug-free? Could this be simplified further to use fewer mutexes/semaphores?


Answer 1:


Looks good to me. The semaphores stop producers from writing into a full buffer and consumers from reading from an empty one, and the locks prevent concurrent producers or consumers from modifying the end or start indices simultaneously.

The two semaphores are definitely necessary. You could, however, drop down to a single lock, used in both get and put, to protect both the start and the end index; the trade-off is that a producer and a consumer could then no longer access the queue at the same time. (CPython's queue implementation does this.) A sketch of that variant follows.
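A minimal sketch of the single-lock variant, assuming the rest of the class stays as in the question and a single self.lock = Lock() replaces the two separate locks in __init__ (the name self.lock is mine):

    def put(self, val):
        self.open.acquire()
        with self.lock:  # one lock serializes all index updates and buffer accesses
            self.buff[self.end] = val
            self.end = (self.end + 1) % self.size
        self.closed.release()

    def get(self):
        self.closed.acquire()
        with self.lock:
            val = self.buff[self.start]
            self.start = (self.start + 1) % self.size
        self.open.release()
        return val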


I would remove the size attribute in favor of len(self.buff), though, and rename the start and end indices to read_index and write_index respectively (and the locks accordingly). Also, I think you could move the buffer access outside the locks (because lists themselves are thread-safe):
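For completeness, here is what __init__ would look like with those renames applied; it is just the question's __init__ with the new names and without the size attribute:

    def __init__(self, size):
        self.buff = [None] * size
        self.write_index = 0
        self.read_index = 0
        self.write_lock = Lock()   # protects write_index across producers
        self.read_lock = Lock()    # protects read_index across consumers
        self.open = Semaphore(size)    # counts free slots
        self.closed = Semaphore(size)  # counts filled slots
        for _ in range(size):          # start with zero filled slots
            self.closed.acquire()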

    def put(self, val):
        self.open.acquire()            # wait for a free slot
        with self.write_lock:
            index = self.write_index   # claim the slot...
            self.write_index = (self.write_index + 1) % len(self.buff)
        self.buff[index] = val         # ...and write to it outside the lock
        self.closed.release()          # signal one more filled slot

    def get(self):
        self.closed.acquire()          # wait for a filled slot
        with self.read_lock:
            index = self.read_index    # claim the slot...
            self.read_index = (self.read_index + 1) % len(self.buff)
        val = self.buff[index]         # ...and read it outside the lock
        self.open.release()            # signal one more free slot
        return val

Here's a small test program I used to play around with:

from threading import Lock, Thread

# "Queue" below is the bounded queue class defined above (named Q in the question).
def producer(queue, start, end, step):
    for value in range(start, end, step):
        queue.put(value)
    print('Producer finished')


def consumer(queue, count, result, lock):
    local_result = []
    for _ in range(count):
        local_result.append(queue.get())
    with lock:
        result.update(local_result)
    print('Consumer finished')


def main():
    value_count = 500000
    producer_count = 50
    consumer_count = 50
    assert value_count % producer_count == 0
    assert value_count % consumer_count == 0

    queue = Queue(123)  # buffer much smaller than value_count, so put/get will block
    result = set()
    lock = Lock()
    producers = [Thread(target=producer, args=(queue, i, value_count, producer_count)) for i in range(producer_count)]
    consumers = [Thread(target=consumer, args=(queue, value_count // consumer_count, result, lock)) for _ in range(consumer_count)]

    for p in producers:
        p.start()
    for c in consumers:
        c.start()

    for p in producers:
        p.join()
    for c in consumers:
        c.join()

    if len(result) != value_count:
        raise ValueError('Result size is %d instead of %d' % (len(result), value_count))


if __name__ == '__main__':
    main()


Source: https://stackoverflow.com/questions/58035571/producer-consumer-using-semaphores-and-mutexes-in-python
