Python multi-processing

南方客 2021-01-15 03:46

I have a large list containing binary encoded strings that I used to process in a single function before, like so:

""" just included this to demonstrate t
2 Answers
  •  礼貌的吻别
    2021-01-15 04:25

    Something like this should work:

    Note that pool.map takes a function and a list of parameters, and calls the function once per item in that list. In your original example you were calling decode directly inside the numpy_array function, so no work was actually handed to the pool.

    The function passed to pool.map must take exactly one argument, hence the packing of the arguments into a tuple, which decode unpacks on its first line. (Python 2 allowed unpacking the tuple directly in the def signature; Python 3 removed that syntax.)
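    A minimal, self-contained illustration of that one-argument constraint (work and params are made-up names, not from the question):

    ```python
    import multiprocessing as mp

    def work(args):
        # pool.map hands each worker exactly one item, so related
        # values travel packed together in a tuple
        base, offset = args
        return base + offset

    if __name__ == "__main__":
        params = [(10, 1), (20, 2), (30, 3)]
        with mp.Pool(processes=2) as pool:
            print(pool.map(work, params))  # [11, 22, 33]
    ```

    On Python 3, pool.starmap(work, params) does the unpacking for you, so work could take two plain arguments instead.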

    import base64
    import struct
    import multiprocessing as mp

    def numpy_array(data, peaks):
        processors = 4
        pool = mp.Pool(processes=processors)
        chunk_size = len(data) // processors   # integer division for Python 3
        map_parameters = []                    # new
        for i in range(processors):
            counter = i * chunk_size
            # slice ends are exclusive, so no "-1" here; the original
            # version silently dropped the last peak of every chunk
            chunk = peaks[i * chunk_size:(i + 1) * chunk_size]
            map_parameters.append((data, chunk, counter))  # new
        pool.map(decode, map_parameters)       # new
        pool.close()
        pool.join()

    def decode(args):                          # changed
        data, chunk, counter = args            # unpack the single tuple argument
        for x in chunk:
            peak_counter = 0
            data_buff = base64.b64decode(x)
            buff_size = len(data_buff) // 4
            unpack_format = ">%dL" % buff_size
            index = 0
            for y in struct.unpack(unpack_format, data_buff):
                buff1 = struct.pack("I", y)
                buff2 = struct.unpack("f", buff1)[0]
                if index % 2 == 0:
                    data[counter][1][peak_counter][0] = float(buff2)
                else:
                    data[counter][1][peak_counter][1] = float(buff2)
                    peak_counter += 1
                index += 1
            print(data[counter][1][10][0])
            counter += 1
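    One caveat the code above glosses over: each Pool worker runs in a separate process with its own copy of data, so in-place writes made inside decode are not visible back in the parent unless data lives in shared memory. A hedged sketch of the alternative, returning the decoded values instead (decode_chunk and the fake peaks are illustrative names, not the poster's data):

    ```python
    import base64
    import struct
    import multiprocessing as mp

    def decode_chunk(args):
        """Decode one chunk of base64 strings and *return* the floats."""
        chunk, counter = args
        results = []
        for x in chunk:
            data_buff = base64.b64decode(x)
            n = len(data_buff) // 4
            words = struct.unpack(">%dL" % n, data_buff)
            # reinterpret each unsigned 32-bit word as a 32-bit float
            floats = [struct.unpack("f", struct.pack("I", w))[0] for w in words]
            # pair consecutive floats, as the original inner loop did
            results.append((counter, list(zip(floats[0::2], floats[1::2]))))
            counter += 1
        return results

    if __name__ == "__main__":
        # two fake "peaks": each encodes four floats as big-endian uint32 words
        raw = struct.pack(">4L", *(struct.unpack("I", struct.pack("f", v))[0]
                                   for v in (1.0, 2.0, 3.0, 4.0)))
        peaks = [base64.b64encode(raw), base64.b64encode(raw)]
        chunks = [(peaks[:1], 0), (peaks[1:], 1)]
        with mp.Pool(processes=2) as pool:
            merged = [pair for part in pool.map(decode_chunk, chunks)
                      for pair in part]
        print(merged[0][1])  # [(1.0, 2.0), (3.0, 4.0)]
    ```

    The parent then writes the returned pairs into data itself, which sidesteps the shared-memory question entirely.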
