Can I create a shared multiarray or lists of lists object in python for multiprocessing?

Asked by 独厮守ぢ on 2020-12-13 01:04

I need to make a shared object out of a multidimensional array or a list of lists so that it is available to other processes. Is there a way to create one?

2 Answers
  • 2020-12-13 01:39

    To make a numpy array a shared object (full example):

    import ctypes as c
    import numpy as np
    import multiprocessing as mp
    
    n, m = 2, 3
    mp_arr = mp.Array(c.c_double, n*m) # shared, can be used from multiple processes
    # then in each new process create a new numpy array using:
    arr = np.frombuffer(mp_arr.get_obj()) # mp_arr and arr share the same memory
    # make it two-dimensional
    b = arr.reshape((n,m)) # b and arr share the same memory
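
To make the sharing concrete, here is a minimal end-to-end sketch of the same idea (the `fill` worker function and its name are illustrative, not part of the original answer): the child re-wraps the shared buffer with `np.frombuffer`, and its writes are visible to the parent.

```python
import ctypes as c
import multiprocessing as mp

import numpy as np

def fill(mp_arr, n, m):
    # Re-create a numpy view over the shared buffer inside the child.
    arr = np.frombuffer(mp_arr.get_obj()).reshape((n, m))
    arr[:] = np.arange(n * m, dtype=np.float64).reshape((n, m))

if __name__ == '__main__':
    n, m = 2, 3
    mp_arr = mp.Array(c.c_double, n * m)  # zero-initialized shared memory
    p = mp.Process(target=fill, args=(mp_arr, n, m))
    p.start()
    p.join()
    # The parent sees the child's writes through the same buffer.
    print(np.frombuffer(mp_arr.get_obj()).reshape((n, m)))
```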
    

    If you don't need a shared object (as in "share the same memory"), and an object that can merely be used from multiple processes is enough, then you could use multiprocessing.Manager:

    from multiprocessing import Process, Manager
    
    def f(L):
        row = L[0] # take the 1st row
        row.append(10) # change it
        L[0] = row #NOTE: important: copy the row back (otherwise parent
                   #process won't see the changes)
    
    if __name__ == '__main__':
        manager = Manager()
    
        lst = manager.list()
        lst.append([1])
        lst.append([2, 3])
        print(lst) # before: [[1], [2, 3]]
    
        p = Process(target=f, args=(lst,))
        p.start()
        p.join()
    
        print(lst) # after: [[1, 10], [2, 3]]
    

    From the docs:

    Server process managers are more flexible than using shared memory objects because they can be made to support arbitrary object types. Also, a single manager can be shared by processes on different computers over a network. They are, however, slower than using shared memory.
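
Since Python 3.6, a manager's proxy objects can be nested, so the copy-back step in `f` above can be avoided by making the inner rows `manager.list` proxies themselves. A sketch under that assumption (the worker name `g` is illustrative):

```python
from multiprocessing import Manager, Process

def g(L):
    # The inner element is itself a proxy, so mutating it in place
    # is visible to the parent; no copy-back is needed.
    L[0].append(10)

if __name__ == '__main__':
    with Manager() as manager:
        lst = manager.list([manager.list([1]), manager.list([2, 3])])
        p = Process(target=g, args=(lst,))
        p.start()
        p.join()
        print([list(row) for row in lst])  # [[1, 10], [2, 3]]
```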

  • 2020-12-13 01:42

    Why not create a list of Arrays?

    from multiprocessing import Array

    # Use a comprehension -- [Array('i', range(10))] * 10 would create ten
    # references to the *same* shared array, not ten independent ones.
    arrays = [Array('i', range(10)) for _ in range(10)]
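
A hedged sketch of how such a list might be used (the `set_first` worker is illustrative, not from the original answer): each process is handed one of the shared arrays, and because each `Array` has its own buffer, their writes don't collide.

```python
from multiprocessing import Array, Process

def set_first(a, value):
    a[0] = value  # write into this worker's own shared array

if __name__ == '__main__':
    # One independent shared array per worker.
    arrays = [Array('i', range(10)) for _ in range(10)]
    procs = [Process(target=set_first, args=(a, i))
             for i, a in enumerate(arrays)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print([a[0] for a in arrays])  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```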
    