Python multi-processing

南方客 2021-01-15 03:46

I have a large list containing binary encoded strings that I used to process in a single function before, like so:

""" just included this to demonstrate t
2 Answers
  •  失恋的感觉
    2021-01-15 04:18

    The bug is in your numpy_array function:

    for i in range(processors):
        counter = i*chunk_size
        chunk=peaks[i*chunk_size:(i+1)*chunk_size-1]
        pool.map(decode(data,chunk,counter))
    

    The problem is that you're calling map sequentially inside the loop, so you're only running one process at a time. You're also not calling map correctly: you're doing pool.map(f(*args)), which calls f immediately and passes its return value to map, when the signature is map(f, ['list', 'of', 'data']).
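
    To illustrate the signature, here is a minimal sketch of how Pool.map is meant to be called (the square worker here is just an example, not your decode):

    ```python
    from multiprocessing import Pool

    def square(x):
        # worker applied to each element of the iterable
        return x * x

    if __name__ == "__main__":
        with Pool(2) as pool:
            # pass the function itself plus an iterable of inputs;
            # map splits the iterable across the worker processes
            results = pool.map(square, [1, 2, 3, 4])
        print(results)  # [1, 4, 9, 16]
    ```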

    I would use a partial so that you don't create copies of data as I assume that array is quite large or could be larger in the future.

    This should be:

    import functools

    # bind the shared data once so it isn't duplicated in every argument tuple
    decode_with_data = functools.partial(decode, data)

    args = []
    for i in range(processors):
        counter = i * chunk_size
        # slice ends are exclusive, so no "-1" (it would drop the last element of each chunk)
        chunk = peaks[i * chunk_size:(i + 1) * chunk_size]
        # append a single (chunk, counter) tuple; append() takes one argument
        args.append((chunk, counter))

    # starmap unpacks each tuple into decode_with_data(chunk, counter)
    pool.starmap(decode_with_data, args)
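
    Since your decode isn't shown, here is a self-contained sketch of the same pattern with a stand-in decode function (the data, peaks, and return value are illustrative assumptions, not your actual code):

    ```python
    import functools
    from multiprocessing import Pool

    def decode(data, chunk, counter):
        # stand-in for the real decode: report the chunk's start offset,
        # how much shared data it sees, and the items it was given
        return (counter, len(data), list(chunk))

    if __name__ == "__main__":
        data = ["shared", "lookup", "table"]   # illustrative shared data
        peaks = list(range(10))                # illustrative work items
        processors = 2
        chunk_size = len(peaks) // processors

        # bind the shared data once instead of copying it into every tuple
        decode_with_data = functools.partial(decode, data)

        args = []
        for i in range(processors):
            counter = i * chunk_size
            chunk = peaks[i * chunk_size:(i + 1) * chunk_size]
            args.append((chunk, counter))

        with Pool(processors) as pool:
            # starmap unpacks each (chunk, counter) tuple into decode's arguments
            results = pool.starmap(decode_with_data, args)
        print(results)
    ```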
    
