Python multiprocessing global numpy arrays

眼角桃花 2021-01-24 01:41

I have the following script:

import numpy as np
max_number = 100000
minimums = np.full((max_number), np.inf, dtype=np.float32)
data = np.zeros((max_number, 128, 128, 128), dtype=np.uin         
1 Answer
  • 2021-01-24 01:52
    import numpy as np
    import multiprocessing as mp
    
    ar = np.zeros((5,5))
    
    def callback_function(result):
        x,y,data = result
        ar[x,y] = data
    
    def worker(num):
        data = ar[num,num]+3
        return num, num, data
    
    def apply_async_with_callback():
        pool = mp.Pool(processes=5)
        for i in range(5):
            pool.apply_async(worker, args = (i, ), callback = callback_function)
        pool.close()
        pool.join()
        print("Multiprocessing done!")
    
    if __name__ == '__main__':
        ar = np.ones((5,5))  # rebinds the module-level ar; the callback writes into this array
        apply_async_with_callback()
    

    Explanation: You set up your data array, your worker function, and your callback function. The pool starts a fixed number of independent worker processes, and each worker process can handle more than one task. The callback runs in the parent process, so it can write each result back into the array.
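    A small self-contained sketch (helper names like `on_result` are mine, not from the original answer) showing why this works: `apply_async` callbacks execute in the parent process, so mutating parent-side state from the callback is safe, while the workers themselves run in child processes.

    ```python
    import os
    import multiprocessing as mp

    def worker(i):
        # Runs in a pool child process; the return value is pickled back.
        return i * i

    def run():
        results = []

        def on_result(value):
            # apply_async callbacks fire in the parent process (in a
            # result-handler thread), so this append mutates parent state.
            results.append((value, os.getpid()))

        pool = mp.Pool(processes=2)
        for i in range(4):
            pool.apply_async(worker, args=(i,), callback=on_result)
        pool.close()
        pool.join()
        values = sorted(v for v, _ in results)
        pids = {pid for _, pid in results}
        return values, pids, os.getpid()

    if __name__ == '__main__':
        values, pids, parent = run()
        print(values)             # [0, 1, 4, 9]
        print(pids == {parent})   # True: every callback ran in the parent
    ```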

    The `__name__ == '__main__'` guard prevents the following lines from being re-run every time the module is imported, which matters on platforms where child processes import the module.
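    For arrays as large as the one in the question, returning every result through the callback still copies data between processes. A sketch of an alternative (not from the original answer) using `multiprocessing.shared_memory` (Python 3.8+; the function names here are illustrative) lets workers write into the same buffer directly:

    ```python
    import numpy as np
    import multiprocessing as mp
    from multiprocessing import shared_memory

    def fill_row(shm_name, shape, row):
        # Attach to the existing shared block and view it as a float32 array.
        shm = shared_memory.SharedMemory(name=shm_name)
        arr = np.ndarray(shape, dtype=np.float32, buffer=shm.buf)
        arr[row, :] = row      # write in place; visible to the parent
        shm.close()            # detach, but do not destroy the block

    def run_shared():
        shape = (5, 5)
        nbytes = int(np.prod(shape)) * np.dtype(np.float32).itemsize
        shm = shared_memory.SharedMemory(create=True, size=nbytes)
        ar = np.ndarray(shape, dtype=np.float32, buffer=shm.buf)
        ar[:] = 0.0
        with mp.Pool(processes=2) as pool:
            pool.starmap(fill_row, [(shm.name, shape, i) for i in range(shape[0])])
        result = ar.copy()     # copy out before releasing the block
        shm.close()
        shm.unlink()
        return result

    if __name__ == '__main__':
        res = run_shared()
        print(res[3, 0])   # 3.0
    ```

    Only the small name string and indices are pickled to the workers; the array data itself is never copied, which is the point for a `(100000, 128, 128, 128)` array like the one in the question.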
