Multiprocessing an iterable in python

一笑奈何 submitted on 2019-12-02 04:52:29

I would suggest using queues to feed your iterables to the worker processes. Something like this:

import multiprocessing as mp
import numpy as np
import itertools as it


def worker(in_queue, out_queue):
    check = 0.915
    for a in iter(in_queue.get, 'STOP'):
        A = a[0]
        B = a[1]
        test = (sum(B)+10)/(sum(A)+12)
        if test > check:
            out_queue.put([A,B])
        else:
            out_queue.put('')

if __name__ == "__main__":
    wmod = np.array([[0,1,2],[3,4,5],[6,7,3]])
    pmod = np.array([[0,1,2],[3,4,5],[6,7,3]])

    plines1 = it.product(wmod[0],wmod[1],wmod[2])
    plines2 = it.product(pmod[0],pmod[1],pmod[2])

    # determine the length of your iterator:
    # 3 * 3 * 3 = 27 product combinations
    counts = 27

    # set up the task iterator; don't call it `it`,
    # which would shadow the itertools alias imported above
    tasks = zip(plines1, plines2)

    in_queue = mp.Queue()
    out_queue = mp.Queue()

    # set up workers
    numProc = 2
    process = [mp.Process(target=worker,
                          args=(in_queue, out_queue), daemon=True) for x in range(numProc)]

    # run processes
    for p in process:
        p.start()

    results = []
    control = True

    # fill queue and get data
    # code fills the queue until a new element is available in the output
    # fill blocks if no slot is available in the in_queue
    for idx in range(counts):
        while out_queue.empty() and control:
            # fill the queue
            try:
                in_queue.put(next(tasks), block=True)
            except StopIteration:
                # signals for processes stop
                for p in process:
                    print('stopping')
                    in_queue.put('STOP')
                control = False
                break
        results.append(out_queue.get(timeout=10))

    # wait for processes to finish
    for p in process:
        p.join()

    print(results)

    print('finished')

However, you do have to determine in advance how many items your task iterator will yield.
