Question
I observed this behavior when trying to create nested child processes in Python. Here is the parent program, parent_process.py:
import multiprocessing
import child_process

pool = multiprocessing.Pool(processes=4)
for i in range(4):
    pool.apply_async(child_process.run, ())
pool.close()
pool.join()
The parent program calls the "run" function in the following child program child_process.py:
import multiprocessing

def run():
    pool = multiprocessing.Pool(processes=4)
    print 'TEST!'
    pool.close()
    pool.join()
When I run the parent program, nothing is printed and the program exits quickly. However, if print 'TEST!' is moved one line up (before the nested pool is created), 'TEST!' is printed four times.
Because errors in a child process won't print to screen, this seems to show that the program crashes when a child process creates its own nested child processes.
Could anyone explain what happens behind the scenes? Thanks!
Answer 1:
According to the multiprocessing documentation, daemonic processes cannot spawn child processes. multiprocessing.Pool uses daemonic worker processes to ensure they don't leak when your program exits.
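The failure is invisible in the question's setup because apply_async holds a worker's exception until you call .get() on the returned AsyncResult. A minimal sketch (function names here are illustrative) that surfaces the hidden error:

import multiprocessing

def spawn_child():
    # Inside a daemonic Pool worker, creating another Pool fails with
    # an AssertionError: "daemonic processes are not allowed to have children".
    multiprocessing.Pool(processes=2)

def demo():
    pool = multiprocessing.Pool(processes=1)
    result = pool.apply_async(spawn_child, ())
    pool.close()
    try:
        result.get()  # re-raises the worker's exception in the parent
        err = None
    except AssertionError as e:
        err = str(e)
    pool.join()
    return err

if __name__ == '__main__':
    print('Worker failed:', demo())

Calling result.get() in the original parent program would have shown the same error instead of a silent exit.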
Answer 2:
As noxdafox said, multiprocessing.Pool uses daemonic processes. I found a simple workaround that uses multiprocessing.Process instead:
Parent program:
import multiprocessing
import child_process

processes = [None] * 4
for i in range(4):
    processes[i] = multiprocessing.Process(target=child_process.run, args=(i,))
    processes[i].start()
for i in range(4):
    processes[i].join()
Child program (with name child_process.py):
import multiprocessing

def test(info):
    print 'TEST', info[0], info[1]

def run(proc_id):
    pool = multiprocessing.Pool(processes=4)
    pool.map(test, [(proc_id, i) for i in range(4)])
    pool.close()
    pool.join()
The output is 16 lines of TEST:
TEST 0 0
TEST 0 1
TEST 0 3
TEST 0 2
TEST 2 0
TEST 2 1
TEST 2 2
TEST 2 3
TEST 3 0
TEST 3 1
TEST 3 3
TEST 3 2
TEST 1 0
TEST 1 1
TEST 1 2
TEST 1 3
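This workaround works because a Process you create yourself is non-daemonic by default, whereas Pool workers have their daemon flag set to True. A quick check of both flags (a sketch, Python 3 syntax):

import multiprocessing

def worker_is_daemon():
    # Executed inside a worker; reports that worker's daemon flag.
    return multiprocessing.current_process().daemon

def demo():
    # A manually created Process defaults to daemon=False,
    # so it is allowed to spawn its own children.
    p = multiprocessing.Process(target=worker_is_daemon)
    manual_flag = p.daemon  # False

    # Pool workers are daemonic, hence the restriction.
    with multiprocessing.Pool(processes=1) as pool:
        pool_flag = pool.apply(worker_is_daemon)  # True
    return manual_flag, pool_flag

if __name__ == '__main__':
    print(demo())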
Answer 3:
I do not have enough reputation to post a comment, but since the Python version determines the options for running hierarchical multiprocessing (e.g., a post from 2015), I wanted to share my experience. The above solution by Da Kuang worked for me with Python 3.7.1 running through Anaconda 3.
I made a small modification to child_process.py to keep the CPU busy for a little while, so I could check the system monitor and verify that 16 simultaneous processes were running:
import multiprocessing

def test(info):
    print('TEST', info[0], info[1])
    # Busy work so the processes stay alive long enough to observe
    aa = [1] * 100000
    a = [1 for i in aa if all([ii < 1 for ii in aa])]
    print('exiting')

def run(proc_id):
    pool = multiprocessing.Pool(processes=4)
    pool.map(test, [(proc_id, i) for i in range(4)])
    pool.close()
    pool.join()
Source: https://stackoverflow.com/questions/32688946/create-child-processes-inside-a-child-process-with-python-multiprocessing-failed