I am learning how to use the Python multiprocessing library. However, while going through some of the examples, I ended up with many Python processes running in the background.
You need to call .join() on your worker processes, which blocks the calling process until all of them finish, or run them in daemon mode so they are killed automatically when the parent process exits. A minimal sketch of both approaches follows the links below.
http://forums.xkcd.com/viewtopic.php?f=11&t=94726 (ending daemon processes with the multiprocessing module)
http://docs.python.org/2/library/multiprocessing.html#the-process-class
http://www.python.org/dev/peps/pep-3143/#correct-daemon-behaviour
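Here is a minimal sketch of both approaches (the worker function and the number of processes are only illustrative):

    import multiprocessing
    import time

    def worker(n):
        # Simulate some work in the child process
        time.sleep(1)
        print("worker %d done" % n)

    if __name__ == "__main__":
        procs = []
        for i in range(4):
            p = multiprocessing.Process(target=worker, args=(i,))
            # Daemon processes are killed automatically when the parent exits
            p.daemon = True
            p.start()
            procs.append(p)

        # join() blocks the parent until each child finishes,
        # so no stray worker processes are left behind
        for p in procs:
            p.join()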
The answer pointed to by Blake VandeMerwe is listed and explained below; I hope it will be helpful for other users:
Original author's command:
kill -9 `ps -ef | grep test.py | grep -v grep | awk '{print $2}'`
Explanation:
"ps -ef": show all processes, including those without a controlling terminal, which is exactly where the countless processes generated by the multiprocessing library show up.
"grep test.py": keep only the processes started from test.py, the name of my Python script.
"grep -v grep": exclude the grep process itself from the kill list.
"awk '{print $2}'": use awk to split each record into fields and print the second field, which in this case is the process ID column.
"kill -9" force-kills a process; its arguments should be PIDs. The complete output of the previous steps is wrapped in backticks ("`", the character to the left of the 1 key on a regular keyboard), which substitutes the command's output and passes the resulting PIDs to kill.