multiprocessing: How do I share a dict among multiple processes?

Asked by 执念已碎 on 2020-11-22 11:49

A program creates several processes that work on a joinable queue, Q, and may eventually manipulate a global dictionary, D, to store results. How do I share D among those processes?
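For reference, the setup described above can be sketched as follows, assuming the workers pull items off the joinable queue and write their results into a `Manager`-backed dict; the worker body (squaring each item) and the sentinel-based shutdown are illustrative assumptions, not part of the question:

```python
import multiprocessing as mp

def worker(Q, D):
    """Pull items off the shared queue and record results in D."""
    while True:
        item = Q.get()
        if item is None:        # sentinel: no more work for this worker
            Q.task_done()
            break
        D[item] = item * item   # any per-item result would do here
        Q.task_done()

def run(n_items=10, n_workers=4):
    with mp.Manager() as manager:
        D = manager.dict()      # proxy dict, shared across processes
        Q = mp.JoinableQueue()
        procs = [mp.Process(target=worker, args=(Q, D))
                 for _ in range(n_workers)]
        for p in procs:
            p.start()
        for i in range(n_items):
            Q.put(i)
        for _ in procs:
            Q.put(None)         # one sentinel per worker
        Q.join()                # block until every item is task_done()
        for p in procs:
            p.join()
        return dict(D)          # plain-dict snapshot of the proxy

if __name__ == '__main__':
    print(run())
```

The proxy objects returned by `manager.dict()` and the `JoinableQueue` are both safe to pass to child processes, which is what makes this pattern work.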

5 Answers
  •  孤街浪徒
    2020-11-22 12:42

    In addition to @senderle's answer here, some might also be wondering how to use the functionality of multiprocessing.Pool.

    The nice thing is that the manager instance has a .Pool() method that mimics the familiar API of the top-level multiprocessing module.

    from itertools import repeat
    import multiprocessing as mp
    import os
    import pprint
    
    def f(d: dict) -> None:
        pid = os.getpid()
        d[pid] = "Hi, I was written by process %d" % pid
    
    if __name__ == '__main__':
        with mp.Manager() as manager:
            d = manager.dict()
            with manager.Pool() as pool:
                pool.map(f, repeat(d, 10))
            # `d` is a DictProxy object that can be converted to dict
            pprint.pprint(dict(d))
    

    Output:

    $ python3 mul.py 
    {22562: 'Hi, I was written by process 22562',
     22563: 'Hi, I was written by process 22563',
     22564: 'Hi, I was written by process 22564',
     22565: 'Hi, I was written by process 22565',
     22566: 'Hi, I was written by process 22566',
     22567: 'Hi, I was written by process 22567',
     22568: 'Hi, I was written by process 22568',
     22569: 'Hi, I was written by process 22569',
     22570: 'Hi, I was written by process 22570',
     22571: 'Hi, I was written by process 22571'}
    

    This is a slightly different example, where each process simply logs its process ID to the shared DictProxy object d.
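    One caveat worth noting, which the answer above does not cover: assigning a key on the DictProxy is propagated to all processes, but mutating a nested object in place is not, because reading d['nums'] returns a local copy from the manager process. Reassign the key to publish the change. This is a minimal sketch; the key name 'nums' is just an illustration:

    ```python
    import multiprocessing as mp

    def demo():
        with mp.Manager() as manager:
            d = manager.dict()
            d['nums'] = [1, 2]        # key assignment goes through the proxy
            d['nums'].append(3)       # mutates a local copy; change is lost
            before = list(d['nums'])  # still [1, 2]
            nums = d['nums']
            nums.append(3)
            d['nums'] = nums          # reassigning the key publishes the change
            after = list(d['nums'])   # now [1, 2, 3]
            return before, after

    if __name__ == '__main__':
        print(demo())
    ```

    Storing `manager.list()` objects instead of plain lists (supported since Python 3.6) is another way to get propagated nested mutations.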
