How can I recover the return value of a function passed to multiprocessing.Process?

野的像风 2020-11-22 07:36

In the example code below, I'd like to recover the return value of the function worker. How can I go about doing this? Where is this value stored?
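
The original snippet is not reproduced here; judging from the answers below it was roughly along these lines (a reconstruction, not the exact code), with the question being how to get worker's return value back in the parent process:

    import multiprocessing

    def worker(procnum):
        '''worker function'''
        print(str(procnum) + ' represent!')
        return procnum  # how can the parent recover this value?

    if __name__ == '__main__':
        jobs = []
        for i in range(5):
            p = multiprocessing.Process(target=worker, args=(i,))
            jobs.append(p)
            p.start()

        for proc in jobs:
            proc.join()
        print(jobs)  # prints the Process objects, not the return values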

12 answers
  • 2020-11-22 08:08

    A simple solution using multiprocessing.Pool, whose map() collects the return values for you:

    import multiprocessing

    output = []
    data = range(0, 10)

    def f(x):
        return x**2

    def handler():
        p = multiprocessing.Pool(64)  # 64 workers; omit the argument to default to the CPU count
        r = p.map(f, data)            # map() gathers each call's return value into a list
        return r

    if __name__ == '__main__':
        output.append(handler())
        print(output[0])  # keep this inside the guard so spawned workers don't re-run it
    

    Output:

    [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
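
    If you prefer, the Pool can also be used as a context manager so it is closed automatically; a minimal variation on the answer above (the pool size is left at the default, i.e. the CPU count):

    import multiprocessing

    def f(x):
        return x**2

    if __name__ == '__main__':
        # the with-block closes and joins the pool on exit
        with multiprocessing.Pool() as pool:
            print(pool.map(f, range(10)))  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]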
    
  • 2020-11-22 08:10

    I modified vartec's answer a bit since I needed to get the error codes from the functions. (Thanks vartec, it's an awesome trick!)

    This can also be done with a manager.list, but I think it is better to use a dict and store a list within it. That way we keep the association between each function and its results, since we can't be sure of the order in which the list would be populated.

    from multiprocessing import Process
    import time
    import datetime
    import multiprocessing


    def func1(fn, m_list):
        print('func1: starting')
        time.sleep(1)
        m_list[fn] = "this is the first function"
        print('func1: finishing')
        # return "func1"  # no need for a return; Process doesn't propagate it =(

    def func2(fn, m_list):
        print('func2: starting')
        time.sleep(3)
        m_list[fn] = "this is function 2"
        print('func2: finishing')
        # return "func2"

    def func3(fn, m_list):
        print('func3: starting')
        time.sleep(9)
        # if this raises, the dict is never populated for this function;
        # use a try/except instead if you want something back.
        raise ValueError("failed here")
        # if we want to record the error in the manager dict we can catch it:
        try:
            raise ValueError("failed here")
            m_list[fn] = "this is third"
        except:
            m_list[fn] = "this is third and it failed horribly"
            # print('func3: finishing')
            # return "func3"


    def runInParallel(*fns):  # * accepts any number of functions
        start_time = datetime.datetime.now()
        proc = []
        manager = multiprocessing.Manager()
        m_list = manager.dict()
        for fn in fns:
            # print(fn)
            # print(dir(fn))
            p = Process(target=fn, name=fn.__name__, args=(fn, m_list))
            p.start()
            proc.append(p)
        for p in proc:
            p.join()  # pass a timeout here if needed, e.g. p.join(5)

        print(datetime.datetime.now() - start_time)
        return m_list, proc

    if __name__ == '__main__':
        manager, proc = runInParallel(func1, func2, func3)
        # print(dir(proc[0]))
        # print(proc[0]._name)
        # print(proc[0].name)
        # print(proc[0].exitcode)

        # here you can check which function failed
        for i in proc:
            print(i.name, i.exitcode)  # name was set in the Process(...) call above

        # only the functions that succeeded in populating the manager dict
        # will show up here
        for i, j in manager.items():
            print(dir(i))  # things you can do with the function object
            print(i, j)
    
  • 2020-11-22 08:13

    Use a shared variable to communicate. For example, like this:

    import multiprocessing
    
    
    def worker(procnum, return_dict):
        """worker function"""
        print(str(procnum) + " represent!")
        return_dict[procnum] = procnum
    
    
    if __name__ == "__main__":
        manager = multiprocessing.Manager()
        return_dict = manager.dict()
        jobs = []
        for i in range(5):
            p = multiprocessing.Process(target=worker, args=(i, return_dict))
            jobs.append(p)
            p.start()
    
        for proc in jobs:
            proc.join()
        print(return_dict.values())
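
    Note that return_dict is a DictProxy rather than a plain dict (wrap it in dict(return_dict) if you need a regular copy), and the order of return_dict.values() reflects the order in which the workers happened to write; since the results are keyed by procnum, you can sort or index them if order matters.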
    
  • 2020-11-22 08:15

    If you are using Python 3, you can use concurrent.futures.ProcessPoolExecutor as a convenient abstraction:

    from concurrent.futures import ProcessPoolExecutor
    
    def worker(procnum):
        '''worker function'''
        print(str(procnum) + ' represent!')
        return procnum
    
    
    if __name__ == '__main__':
        with ProcessPoolExecutor() as executor:
            print(list(executor.map(worker, range(5))))
    

    Output:

    0 represent!
    1 represent!
    2 represent!
    3 represent!
    4 represent!
    [0, 1, 2, 3, 4]
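
    If you need a handle per task rather than the mapped list, executor.submit returns a Future for each call, which lets you collect results as they complete or catch per-task exceptions. A minimal sketch along the same lines as the answer above:

    from concurrent.futures import ProcessPoolExecutor, as_completed

    def worker(procnum):
        '''worker function'''
        return procnum * 2  # hypothetical computation

    if __name__ == '__main__':
        with ProcessPoolExecutor() as executor:
            # one Future per submitted task, mapped back to its input
            futures = {executor.submit(worker, i): i for i in range(5)}
            for future in as_completed(futures):
                print(futures[future], '->', future.result())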
    
  • 2020-11-22 08:16

    This example shows how to use a list of multiprocessing.Pipe instances to return strings from an arbitrary number of processes:

    import multiprocessing
    
    def worker(procnum, send_end):
        '''worker function'''
        result = str(procnum) + ' represent!'
        print(result)
        send_end.send(result)
    
    def main():
        jobs = []
        pipe_list = []
        for i in range(5):
            recv_end, send_end = multiprocessing.Pipe(False)
            p = multiprocessing.Process(target=worker, args=(i, send_end))
            jobs.append(p)
            pipe_list.append(recv_end)
            p.start()
    
        for proc in jobs:
            proc.join()
        result_list = [x.recv() for x in pipe_list]
        print(result_list)
    
    if __name__ == '__main__':
        main()
    

    Output:

    0 represent!
    1 represent!
    2 represent!
    3 represent!
    4 represent!
    ['0 represent!', '1 represent!', '2 represent!', '3 represent!', '4 represent!']
    

    This solution uses fewer resources than a multiprocessing.Queue, which uses

    • a Pipe
    • at least one Lock
    • a buffer
    • a thread

    or a multiprocessing.SimpleQueue, which uses

    • a Pipe
    • at least one Lock

    It is very instructive to look at the source for each of these types.
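
    For comparison, a rough sketch of the same pattern using the heavier multiprocessing.Queue (not part of the original answer) might look like this:

    import multiprocessing

    def worker(procnum, queue):
        queue.put(str(procnum) + ' represent!')

    if __name__ == '__main__':
        queue = multiprocessing.Queue()
        jobs = []
        for i in range(5):
            p = multiprocessing.Process(target=worker, args=(i, queue))
            jobs.append(p)
            p.start()

        # drain the queue before joining, so a full pipe can't block the workers;
        # results arrive in completion order, not start order
        results = [queue.get() for _ in jobs]
        for proc in jobs:
            proc.join()
        print(results)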

  • 2020-11-22 08:16

    You can use the exit built-in to set the exit code of a process. It can be obtained from the exitcode attribute of the process:

    import multiprocessing
    
    def worker(procnum):
        print(str(procnum) + ' represent!')
        exit(procnum)
    
    if __name__ == '__main__':
        jobs = []
        for i in range(5):
            p = multiprocessing.Process(target=worker, args=(i,))
            jobs.append(p)
            p.start()
    
        result = []
        for proc in jobs:
            proc.join()
            result.append(proc.exitcode)
        print(result)
    

    Output:

    0 represent!
    1 represent!
    2 represent!
    3 represent!
    4 represent!
    [0, 1, 2, 3, 4]
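
    One caveat: exit codes are small status integers (0 to 255 on Unix, where non-zero conventionally signals failure), so this trick only suits small non-negative return values; for real data, use one of the Manager, Pipe, or Pool approaches above.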
    