I've been trying to get my head around multiprocessing. The problem is all the examples I've come across don't seem to fit my scenario. I'd like to multiprocess or thread wo
Pinging multiple IP addresses concurrently is easy using multiprocessing:
#!/usr/bin/env python
from multiprocessing.pool import ThreadPool  # use threads
from subprocess import check_output

def ping(ip, timeout=10):
    cmd = "ping -c4 -n -w {timeout} {ip}".format(**vars())
    try:
        result = check_output(cmd.split())
    except Exception as e:
        return ip, None, str(e)
    else:
        return ip, result, None

# ip_list is your existing list of IP addresses to ping
pool = ThreadPool(100)  # no more than 100 pings at any single time
for ip, result, error in pool.imap_unordered(ping, ip_list):
    if error is None:  # no error
        print(ip)  # print ips that returned 4 packets within timeout seconds
Note: I've used ThreadPool here as a convenient way to limit the number of concurrent pings. If you want to run all the pings at once, then you need neither the threading nor the multiprocessing module, because each ping already runs in its own child process. See Multiple ping script in Python.
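As a minimal sketch of that all-at-once approach (my own illustration, not from the linked answer; it assumes a Linux-style ping and a small ip_list):

import subprocess
from subprocess import DEVNULL  # Python 3.3+

ip_list = ["8.8.8.8", "8.8.4.4"]  # example addresses

# Start one ping subprocess per address; no threads or extra Python processes needed.
procs = {ip: subprocess.Popen(["ping", "-c", "4", "-n", "-w", "10", ip],
                              stdout=DEVNULL, stderr=DEVNULL)
         for ip in ip_list}

# Wait for every ping to finish and report which hosts replied.
for ip, proc in procs.items():
    if proc.wait() == 0:  # exit status 0 means the host answered before the deadline
        print(ip)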
What you are currently doing should pose no problem, but if you want to create the processes manually and join them later:
import subprocess
import multiprocessing as mp

# Creating our target function here
def do_work(ip):
    # dummy function: ping the address and print the raw output
    p = subprocess.check_output(["ping", "-c", "4", ip])
    print(p)

# Your ip list
ip_list = ['8.8.8.8', '8.8.4.4']

procs = []  # Will contain references to our processes
for ip in ip_list:
    # Creating a new process
    p = mp.Process(target=do_work, args=(ip,))
    # Appending to procs
    procs.append(p)
    # Starting the process
    p.start()

# Waiting for all processes to finish
for p in procs:
    p.join()
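If you need the results back in the parent process rather than just printing them in each worker, one possible pattern (a sketch of my own, not part of the answer above) is to hand each process a multiprocessing.Queue:

import subprocess
import multiprocessing as mp

def do_work(ip, results):
    # Ping the address and push (ip, success_flag) back to the parent via the queue.
    try:
        subprocess.check_output(["ping", "-c", "4", ip])
        results.put((ip, True))
    except subprocess.CalledProcessError:
        results.put((ip, False))

if __name__ == "__main__":
    ip_list = ['8.8.8.8', '8.8.4.4']
    results = mp.Queue()
    procs = [mp.Process(target=do_work, args=(ip, results)) for ip in ip_list]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    # Drain the queue once every worker has finished.
    while not results.empty():
        print(results.get())

For more than a handful of results it is usually simpler to reach for multiprocessing.Pool (or the ThreadPool shown earlier), which handles the result plumbing for you.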