multiprocessing

Tkinter windows do not appear when using multiprocessing on Linux

Submitted by 隐身守侯 on 2020-01-14 13:05:56
Problem: I want to spawn another process to display an error message asynchronously while the rest of the application continues. I'm using the multiprocessing module in Python 2.6 to create the process, and I'm trying to display the window with Tkinter. This code worked okay on Windows, but on Linux the Tkinter window does not appear if I call 'showerror("MyApp Error", "Something bad happened.")'. It does appear if I run it in the same process by calling showerrorprocess directly. Given …
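
The usual cause here is that Tcl/Tk state inherited across fork() on Linux is unusable in the child. A minimal sketch of the standard workaround, with hypothetical names (the asker's showerrorprocess is not shown in the excerpt), is to create the Tk root entirely inside the spawned process:

import multiprocessing

def _show_error(title, message):
    # Import and initialize Tkinter *inside* the child process: on Linux
    # the child is forked, and Tcl/Tk state inherited from the parent
    # does not survive the fork.
    import Tkinter       # Python 2.6 module names, matching the question
    import tkMessageBox
    root = Tkinter.Tk()
    root.withdraw()      # hide the empty default root window
    tkMessageBox.showerror(title, message)
    root.destroy()

def showerror(title, message):
    # Fire-and-forget: the dialog lives in its own process while the
    # rest of the application continues.
    multiprocessing.Process(target=_show_error, args=(title, message)).start()

if __name__ == '__main__':
    showerror("MyApp Error", "Something bad happened.")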

Strange problems when using requests and multiprocessing

Submitted by ≡放荡痞女 on 2020-01-14 12:43:51
Problem: Please check this Python code:

#!/usr/bin/env python
import requests
import multiprocessing
from time import sleep, time
from requests import async

def do_req():
    r = requests.get("http://w3c.org/")

def do_sth():
    while True:
        sleep(10)

if __name__ == '__main__':
    do_req()
    multiprocessing.Process(target=do_sth, args=()).start()

When I press Ctrl-C (waiting 2 sec after the run, to let the Process start), it doesn't stop. When I change the import order to:

from requests import async
from time import sleep, time
…
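
The excerpt is cut off, but requests.async in that era was built on gevent, whose monkey-patching alters time.sleep and signal handling, so the import order changes which implementations the child inherits; that is an inference, not a confirmed diagnosis. Independent of the cause, a hedged sketch of making Ctrl-C stop the child reliably:

import multiprocessing
from time import sleep

def do_sth():
    while True:
        sleep(10)

if __name__ == '__main__':
    p = multiprocessing.Process(target=do_sth)
    p.daemon = True          # a daemonic child is killed when the parent exits
    p.start()
    try:
        p.join()
    except KeyboardInterrupt:
        p.terminate()        # or handle Ctrl-C explicitly instead of daemonizing
        p.join()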

Python multiprocessing.Pool map() “TypeError: string indices must be integers, not str”

Submitted by 和自甴很熟 on 2020-01-14 09:23:07
Problem: I am attempting to use multiprocessing.Pool to do parallel processing on a list of dictionaries. An example is below (please note: this is a toy example; my actual example will be doing CPU-intensive processing on the values in the actual dictionary):

import multiprocessing

my_list = [{'letter': 'a'}, {'letter': 'b'}, {'letter': 'c'}]

def process_list(list_elements):
    ret_list = []
    for my_dict in list_elements:
        ret_list.append(my_dict['letter'])
    return ret_list

if __name__ == "__main__":
    pool …
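
The truncated line presumably ends in something like pool.map(process_list, my_list) (an assumption). Pool.map applies the function to each element of the iterable, so process_list receives a single dict per call; iterating a dict yields its keys (strings), and 'letter'['letter'] raises the TypeError. A sketch of the usual fix, writing the worker to take one element:

import multiprocessing

my_list = [{'letter': 'a'}, {'letter': 'b'}, {'letter': 'c'}]

def process_element(my_dict):
    # Receives ONE dict per call, not the whole list.
    return my_dict['letter']

if __name__ == "__main__":
    pool = multiprocessing.Pool(processes=3)
    result = pool.map(process_element, my_list)
    pool.close()
    pool.join()
    print(result)  # ['a', 'b', 'c']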

How are handles distributed after MPI_Comm_split?

Submitted by 穿精又带淫゛_ on 2020-01-14 04:32:05
Problem: Say I have 8 processes. When I do the following, the MPI_COMM_WORLD communicator will be split into two communicators: the processes with even IDs will belong to one communicator and the processes with odd IDs will belong to the other.

color = myid % 2;
MPI_Comm_split(MPI_COMM_WORLD, color, myid, &NEW_COMM);
MPI_Comm_rank(NEW_COMM, &new_id);

My question is: where are the handles for these two communicators? After the split, the IDs of the processes, which before were 0 1 2 3 4 5 6 7, will …
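
Each process holds its own local handle: after the call, NEW_COMM on an even-ranked process refers to the "even" communicator and on an odd-ranked process to the "odd" one; no process holds a handle to the sub-communicator it does not belong to, and the new ranks run 0..3 within each group. A sketch of the same split in mpi4py (Python, to match the other examples here; the C semantics are identical):

from mpi4py import MPI

comm = MPI.COMM_WORLD
myid = comm.Get_rank()

color = myid % 2                    # 0 for even world ranks, 1 for odd
new_comm = comm.Split(color, myid)  # local handle to this process's own sub-communicator
new_id = new_comm.Get_rank()        # 0..3 within each group when run with 8 processes

print("world rank %d -> color %d, new rank %d" % (myid, color, new_id))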

python multiprocessing logging: QueueHandler with RotatingFileHandler “file being used by another process” error

Submitted by 余生长醉 on 2020-01-14 04:22:31
Problem: I'm converting a program to multiprocessing and need to be able to log to a single rotating log file from the main process as well as from subprocesses. I'm trying to use the second example in the Python cookbook's "Logging to a single file from multiple processes", which starts a logger_thread running as part of the main process, picking log messages off a queue that the subprocesses add to. The example works well as is, and also works if I switch to a RotatingFileHandler. However, if I change it to start …
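
The "file being used by another process" error on Windows typically means more than one process has the rotating file open when rollover renames it. A hedged sketch of the intended division of labour (names are illustrative, not the cookbook's verbatim code): children attach only a QueueHandler, and the single listener thread in the main process is the only code that ever opens the file:

import logging
import logging.handlers
import multiprocessing
import threading

def worker(queue):
    root = logging.getLogger()
    root.setLevel(logging.DEBUG)
    root.addHandler(logging.handlers.QueueHandler(queue))  # the child's ONLY handler
    root.info("hello from %s", multiprocessing.current_process().name)

def listener(queue):
    # Owned by the main process alone, so rollover never collides with
    # another process's open handle.
    handler = logging.handlers.RotatingFileHandler(
        "app.log", maxBytes=1024, backupCount=3)
    while True:
        record = queue.get()
        if record is None:   # sentinel: shut the listener down
            break
        handler.emit(record)
    handler.close()

if __name__ == "__main__":
    queue = multiprocessing.Queue()
    t = threading.Thread(target=listener, args=(queue,))
    t.start()
    workers = [multiprocessing.Process(target=worker, args=(queue,)) for _ in range(3)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    queue.put(None)
    t.join()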

Python Tkinter multiprocessing progress

Submitted by 二次信任 on 2020-01-14 03:55:08
Problem: For my work, I frequently have to collect reasonably large datasets from a MySQL database, e.g. several parameters for several locations, and store that data in one CSV file per location. For this, I've written a small GUI. Since the data has to be stored per location, I thought I'd take advantage of my 8-thread CPU and use the multiprocessing package to query the database per location. This works just fine, but I also want to keep track of how far along the data retrieval and file writing are. The …
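
A common shape for this (a sketch under assumed names; the MySQL query and CSV writing are elided): workers report on a multiprocessing.Queue, and the Tkinter mainloop polls it with after(), so no widget is ever touched from another process:

import multiprocessing
try:
    import tkinter as tk   # Python 3
except ImportError:
    import Tkinter as tk   # Python 2

def fetch_location(location, progress_queue):
    # ... query MySQL and write this location's CSV file here ...
    progress_queue.put(location)          # report completion to the GUI

def poll_queue(root, queue, label, remaining):
    while not queue.empty():
        done = queue.get()
        remaining[0] -= 1
        label.config(text="finished %s, %d to go" % (done, remaining[0]))
    if remaining[0] > 0:
        root.after(100, poll_queue, root, queue, label, remaining)

if __name__ == "__main__":
    locations = ["loc_a", "loc_b", "loc_c"]  # hypothetical location names
    queue = multiprocessing.Queue()
    for loc in locations:
        multiprocessing.Process(target=fetch_location, args=(loc, queue)).start()
    root = tk.Tk()
    label = tk.Label(root, text="working...")
    label.pack()
    poll_queue(root, queue, label, [len(locations)])
    root.mainloop()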

How can I parallelize method calls on an array of objects?

Submitted by ⅰ亾dé卋堺 on 2020-01-14 03:51:14
Problem: I have a simulation that consists of a list of objects. I'd like to call a method on all of those objects in parallel, since none of them depends on the other, using a thread pool. You can't pickle a method, so I was thinking of using a wrapper function with a side effect to do something like the following:

from multiprocessing import Pool

class subcl:
    def __init__(self):
        self.counter = 1
        return
    def increment(self):
        self.counter += 1
        return

def wrapper(targ):
    targ.increment()
    return

class sim: …
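
One caveat the wrapper approach runs into: Pool workers receive pickled copies, so an in-place targ.increment() mutates the copy and the change is silently lost in the parent. A sketch (extending the question's own toy classes) that returns the mutated objects and reassigns them:

from multiprocessing import Pool

class Subcl(object):
    def __init__(self):
        self.counter = 1
    def increment(self):
        self.counter += 1

def wrapper(targ):
    targ.increment()
    return targ                          # hand the mutated copy back to the parent

if __name__ == '__main__':
    objs = [Subcl() for _ in range(4)]
    pool = Pool()
    objs = pool.map(wrapper, objs)       # replace the originals with the results
    pool.close()
    pool.join()
    print([o.counter for o in objs])     # [2, 2, 2, 2]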

Multi-threading in Python: is it really performance-efficient most of the time?

Submitted by a 夏天 on 2020-01-13 20:20:20
Problem: In my limited understanding, it is the performance factor that drives programming for multi-threading in most cases, but not all (irrespective of Java or Python). I was reading an enlightening article on the GIL on SO. The article summarizes that Python adopts a GIL mechanism, i.e. only a single thread can execute Python bytecode at any given time. This makes single-threaded applications really fast. My question is as follows: since only one thread is served at a given point, does …
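
The key distinction is CPU-bound versus I/O-bound: a thread releases the GIL while blocked on I/O, so threading still pays off there, while pure-Python CPU work is serialized. A hedged micro-benchmark sketch (absolute timings will vary by machine):

import time
import threading
import multiprocessing

def burn(n):
    # Pure-Python CPU work: holds the GIL for the whole loop.
    while n > 0:
        n -= 1

if __name__ == '__main__':
    N = 5000000

    start = time.time()
    threads = [threading.Thread(target=burn, args=(N,)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("2 threads:   %.2fs" % (time.time() - start))  # no faster than serial under the GIL

    start = time.time()
    procs = [multiprocessing.Process(target=burn, args=(N,)) for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print("2 processes: %.2fs" % (time.time() - start))  # roughly halves on 2+ cores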

Python Multiprocessing for DataFrame Operations/Functions

Submitted by 旧街凉风 on 2020-01-13 19:57:37
Problem: I am processing hundreds of thousands of rows of text data using Pandas DataFrames. Every so often (fewer than 5 per 100,000) I hit an error for a row that I have chosen to drop. The error-handling function is as follows:

def unicodeHandle(datai):
    for i, row in enumerate(datai['LDTEXT']):
        print(i)
        #print(text)
        try:
            text = row.read()
            text.strip().split('[\W_]+')
            print(text)
        except UnicodeDecodeError as e:
            datai.drop(i, inplace=True)
            print('Error at index {}: {!r}'.format(i, row))
            print(e)
    return datai

The function …
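
To parallelize a per-chunk cleaning function like this one, a common sketch (column name LDTEXT as in the question; the split count is an arbitrary choice) is to cut the DataFrame with numpy.array_split and map the pieces over a Pool:

import multiprocessing
import numpy as np
import pandas as pd

def parallelize(df, func, n_workers=4):
    chunks = np.array_split(df, n_workers)       # one piece per worker
    pool = multiprocessing.Pool(n_workers)
    result = pd.concat(pool.map(func, chunks))   # each worker returns its cleaned chunk
    pool.close()
    pool.join()
    return result

# usage, assuming unicodeHandle above is importable at module top level:
# datai = parallelize(datai, unicodeHandle)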