Python multiprocessing Queue failure

Backend · unresolved · 3 replies · 1132 views
旧时难觅i 2021-02-04 15:12

I create 100 child processes:

proc_list = [
    Process(target=simulator, args=(result_queue,))
    for _ in xrange(100)]

and start them

3 replies
  •  孤独总比滥情好
    2021-02-04 15:54

    There's no evidence in the OP's post that multiprocessing.Queue does not work. The code posted is not nearly sufficient to understand what's going on: do they join all the processes? Do they pass the queue to the child processes correctly (it must be passed as an argument for the code to work on Windows)? Do they verify that each child actually produced its 10000 tuples? Etc.

    There's a chance that the OP is really encountering a hard-to-reproduce bug in mp.Queue, but given the amount of testing CPython has gone through, and the fact that I just ran 100 processes x 10000 results without any trouble, I suspect the OP actually had some problem in their own code.

    Yes, Manager().Queue() mentioned in other answers is a perfectly fine way to share data, but there's no reason to avoid multiprocessing.Queue() based on unconfirmed reports that "something is wrong with it".
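
    To illustrate the points above, here is a minimal sketch of the pattern being described: each child puts results on a multiprocessing.Queue passed as an argument, and the parent drains the queue before joining. The `simulator` body and the counts are hypothetical stand-ins for the OP's actual code (scaled down from 100 x 10000).

    ```python
    import multiprocessing as mp

    N_PROCS = 8        # the OP used 100
    N_RESULTS = 100    # results per child; the OP used 10000

    def simulator(result_queue):
        # The queue must arrive as an argument: on Windows (spawn start
        # method) children do not inherit it from module globals.
        for i in range(N_RESULTS):
            result_queue.put((mp.current_process().name, i))

    if __name__ == "__main__":
        result_queue = mp.Queue()
        procs = [mp.Process(target=simulator, args=(result_queue,))
                 for _ in range(N_PROCS)]
        for p in procs:
            p.start()
        # Drain the queue BEFORE joining: join() can block forever if a
        # child is still waiting to flush items into the queue's pipe.
        results = [result_queue.get() for _ in range(N_PROCS * N_RESULTS)]
        for p in procs:
            p.join()
        print(len(results))
    ```

    The drain-before-join ordering matters: a child process does not fully exit until everything it put on the queue has been flushed, so joining first against a full queue is a classic deadlock that gets misreported as "Queue is broken".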
