I have a Python script which launches several processes. Each process basically just calls a shell script, something like this:

from multiprocessing import Process
import os

def run_script(n):
    os.system(os.path.expanduser("~/Scripts/run.sh") + " " + str(n))

processes = [Process(target=run_script, args=(i,)) for i in range(10)]
In Python 3.3, the subprocess module supports a timeout: http://docs.python.org/dev/library/subprocess.html
For solutions that work on Python 2.x, have a look at this thread: Using module 'subprocess' with timeout
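A minimal sketch of the 3.3+ API, reusing the run.sh script from the question (the 10-second budget is an arbitrary example value):

import os
import subprocess

try:
    # wait at most 10 seconds for the shell script to finish
    subprocess.check_call([os.path.expanduser("~/Scripts/run.sh"), "1"],
                          timeout=10)
except subprocess.TimeoutExpired:
    print("run.sh did not finish within 10 seconds")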
Based on Stop reading process output in Python without hang?:
import logging
import os
import time
from subprocess import Popen

logging.basicConfig(level=logging.DEBUG)  # pick the level you want here
logger = logging.getLogger(__name__)
timeout = 30  # seconds to wait for all scripts; adjust to taste

def start_process(n, stdout):
    # no need for `global logger`: you don't assign to it
    command = [os.path.expanduser("~/Scripts/run.sh"), str(n)]
    logger.debug(command)  # no need for if(debug); set the logging level instead
    return Popen(command, stdout=stdout)  # run directly

# no need to use threads; Popen is asynchronous
with open('/tmp/scripts_output.txt', 'w') as file:  # 'w': opened for writing, it receives the scripts' stdout
    processes = [start_process(i, file) for i in range(10)]

    # wait at most timeout seconds for the processes to complete
    # you could use p.wait() and signal.alarm or threading.Timer instead
    endtime = time.time() + timeout
    while any(p.poll() is None for p in processes) and time.time() < endtime:
        time.sleep(.04)

    # terminate unfinished processes
    for p in processes:
        if p.poll() is None:
            p.terminate()
            p.wait()  # blocks if the `kill` is ignored by the script
Use p.wait(timeout) if it is available (Python 3.3 and newer).
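A sketch of that variant, reusing the processes and endtime names from the snippet above (Python 3.3+):

from subprocess import TimeoutExpired

for p in processes:
    try:
        # give each process whatever remains of the overall budget
        p.wait(timeout=max(0, endtime - time.time()))
    except TimeoutExpired:
        p.terminate()
        p.wait()  # reap the terminated child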
Use the subprocess module instead; its Popen objects have a terminate() method explicitly for this.
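A minimal sketch of that approach, assuming the same run.sh script from the question:

import os
from subprocess import Popen

p = Popen([os.path.expanduser("~/Scripts/run.sh"), "1"])
# ... later, when the process should stop:
p.terminate()  # SIGTERM on POSIX, TerminateProcess() on Windows
p.wait()       # reap the child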
You should use an event to signal the worker to terminate, run the subprocess with the subprocess module, then terminate it with Popen.terminate(). Calling Process.terminate() on the worker will not allow it to clean up. See the documentation for Process.terminate().
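A minimal sketch of that pattern, assuming the run.sh script from the question; the event lets each worker terminate its own child and clean up before exiting (the 30-second delay is an arbitrary example):

import os
import subprocess
import time
from multiprocessing import Event, Process

def worker(n, stop_event):
    # the shell script runs as a child of this worker process
    p = subprocess.Popen([os.path.expanduser("~/Scripts/run.sh"), str(n)])
    while p.poll() is None:          # script still running
        if stop_event.is_set():      # parent asked us to shut down
            p.terminate()            # forward the request to the script
            p.wait()                 # reap it, then exit the worker cleanly
            break
        time.sleep(.1)

if __name__ == '__main__':
    stop_event = Event()
    workers = [Process(target=worker, args=(i, stop_event)) for i in range(10)]
    for w in workers:
        w.start()
    time.sleep(30)      # e.g., give the scripts 30 seconds
    stop_event.set()    # signal all workers to terminate their scripts
    for w in workers:
        w.join()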