I am experimenting with multiprocessing in Python and tried to share an Array of strings between two processes. Here is my Python code:
from multiprocessing import
I don't have a solution, but I can add more hints.

I've stripped the script to pinpoint the problem. It looks to me that the problem is in the l * 3 operation. I don't know why, but moving the l * 3 operation into the enumerated list works as expected:
from multiprocessing import Process, Array
import ctypes

def f1(a):
    # for i, l in enumerate(['a', 'b', 'c']):
    #     a[i] = (l * 3)
    for i, l in enumerate(['a' * 3, 'b' * 3, 'c' * 3]):
        a[i] = l
    print "f1 : ", map(id, a), a[:]

if __name__ == '__main__':
    arr = Array(ctypes.c_char_p, 3)
    print "Before :", map(id, arr), arr[:]
    p = Process(target=f1, args=(arr, ))
    p.start()
    p.join()
    print "After : ", map(id, arr), arr[:]
Result:
Before : [3077673516L, 3077673516L, 3077673516L] [None, None, None]
f1 : [3073497784L, 3073497784L, 3073497784L] ['aaa', 'bbb', 'ccc']
After : [3073497784L, 3073497784L, 3073497784L] ['aaa', 'bbb', 'ccc']
My guess is:

- arr stores 3 pointers.
- f1() assigns them memory addresses that have no meaning outside the current process.
- f2() tries to access those meaningless addresses, which contain junk at that point.
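To make that guess concrete, here is a sketch of my own (not from the original post) that sidesteps c_char_p entirely by giving each slot a fixed-size byte buffer, so the shared memory holds the bytes themselves rather than per-process pointers:

```python
import ctypes
from multiprocessing import Process, Array

def fill(slots):
    # Writing to .value copies the bytes into the shared buffer itself,
    # so no process-local pointer is ever stored.
    for i, s in enumerate([b'aaa', b'bbb', b'ccc']):
        slots[i].value = s

if __name__ == '__main__':
    # Three slots of 10 bytes each, allocated in shared memory.
    slots = Array(ctypes.c_char * 10, 3)
    p = Process(target=fill, args=(slots,))
    p.start()
    p.join()
    print([s.value for s in slots])  # the bytes survive the process boundary
```

The trade-off is a fixed maximum length per slot, but nothing depends on an address being valid in more than one process.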
Assigning values that have meaning in all processes seems to help:
from __future__ import print_function
import ctypes
from multiprocessing import Process, Array, Value

values = [(s * 4).encode('ascii') for s in 'abc']

def f1(a, v):
    for i, s in enumerate(values):
        a[i] = s
    v.value += 1
    print("f1 : ", a[:], v.value)

def f2(a, v):
    v.value += 1
    print("f2 : ", a[:], v.value)

def main():
    val = Value(ctypes.c_int, 0)
    arr = Array(ctypes.c_char_p, 3)
    print("Before :", arr[:], val.value)
    p = Process(target=f1, args=(arr, val))
    p2 = Process(target=f2, args=(arr, val))
    p.start()
    p2.start()
    p.join()
    p2.join()
    print("After : ", arr[:], val.value)

if __name__ == '__main__':
    main()
Result:
Before : [None, None, None] 0
f1 : ['aaaa', 'bbbb', 'cccc'] 1
f2 : ['aaaa', 'bbbb', 'cccc'] 2
After : ['aaaa', 'bbbb', 'cccc'] 2
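This works here because the module-level values list is created before the processes start, so the assigned bytes objects exist in every process, but it is still fragile. A safer alternative (my own sketch, not part of the answer above) is a Manager().list(), which keeps the strings in a separate server process and hands the workers proxies, so any picklable Python string can be shared without pointer tricks:

```python
from multiprocessing import Manager, Process

def fill_list(shared):
    # Each proxy assignment sends the string to the manager's server process.
    for i, s in enumerate(['aaa', 'bbb', 'ccc']):
        shared[i] = s

if __name__ == '__main__':
    with Manager() as m:
        shared = m.list([None, None, None])
        p = Process(target=fill_list, args=(shared,))
        p.start()
        p.join()
        print(list(shared))  # ['aaa', 'bbb', 'ccc']
```

The manager adds IPC overhead on every access, so it is slower than shared ctypes, but there are no lifetime or address-validity issues to reason about.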