Python multiprocessing Queue failure

旧时难觅i asked 2021-02-04 15:12

I create 100 child processes

    from multiprocessing import Process, Queue

    result_queue = Queue()
    proc_list = [
        Process(target=simulator, args=(result_queue,))
        for i in xrange(100)]

and start them:

    for proc in proc_list:
        proc.start()

3 Answers
  • 2021-02-04 15:46

    My solution to multiprocessing issues is almost always to use Manager objects. The exposed interface is the same, but the underlying implementation is much simpler and has fewer bugs.

        from multiprocessing import Manager

        manager = Manager()
        result_queue = manager.Queue()  # a proxy to a queue living in the manager's server process

    Try it out and see if it fixes your issues.
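
    For context, a minimal self-contained sketch of this approach; the simulator function and the counts here are illustrative stand-ins, not the OP's actual code:

        from multiprocessing import Manager, Process

        def simulator(result_queue):
            # Illustrative worker: push a handful of result tuples.
            for i in range(10):
                result_queue.put((i, i * i))

        if __name__ == '__main__':  # guard is required on Windows
            manager = Manager()
            result_queue = manager.Queue()  # proxy queue served by the manager process
            procs = [Process(target=simulator, args=(result_queue,))
                     for _ in range(100)]
            for p in procs:
                p.start()
            for p in procs:
                # Joining before draining is safe here: the items live in the
                # manager's server process, not in a pipe buffer of the child.
                p.join()
            print(result_queue.qsize())  # expect 1000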

  • 2021-02-04 15:54

    There's no evidence in the OP's post that multiprocessing.Queue does not work. The posted code is not at all sufficient to understand what's going on: do they join all the processes? Do they correctly pass the queue to the child processes (it has to be passed as a parameter on Windows)? Do their child processes verify that they actually got 10000 tuples? And so on.

    There's a chance that the OP is really encountering a hard-to-reproduce bug in mp.Queue, but given the amount of testing CPython has gone through, and the fact that I just ran 100 processes x 10000 results without any trouble, I suspect the OP actually had some problem in their own code.

    Yes, Manager().Queue() mentioned in other answers is a perfectly fine way to share data, but there's no reason to avoid multiprocessing.Queue() based on unconfirmed reports that "something is wrong with it".
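
    For what it's worth, here is a sketch of that kind of test with a plain multiprocessing.Queue; simulator is a made-up stand-in for the OP's worker:

        from multiprocessing import Process, Queue

        def simulator(result_queue):
            # Each child pushes 10000 small tuples.
            for i in range(10000):
                result_queue.put((i, i * i))

        if __name__ == '__main__':  # guard is required on Windows
            result_queue = Queue()
            procs = [Process(target=simulator, args=(result_queue,))
                     for _ in range(100)]
            for p in procs:
                p.start()

            # Drain before joining: a child process cannot exit until all the
            # items it put on the queue have been flushed to the underlying
            # pipe, so joining first can deadlock.
            results = [result_queue.get() for _ in range(100 * 10000)]

            for p in procs:
                p.join()
            print(len(results))  # expect 1000000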

  • 2021-02-04 15:59

    multiprocessing.Queue is said to be thread-safe in its documentation. But when you are doing inter-process communication with a Queue, it should be used via multiprocessing.Manager().Queue().
