Question
I am trying to make multiprocessing and socket programming work together, but I am stuck at this point. The problem is that I am getting this error:
File "multiprocesssockserv.py", line 11, in worker
clientsocket = socket.fromfd(clientfileno, socket.AF_INET, socket.SOCK_STREAM)
error: [Errno 9] Bad file descriptor
The complete code causing the error is as follows:
import multiprocessing as mp
import logging
import socket

logger = mp.log_to_stderr(logging.WARN)

def worker(queue):
    while True:
        clientfileno = queue.get()
        print clientfileno
        clientsocket = socket.fromfd(clientfileno, socket.AF_INET, socket.SOCK_STREAM)
        clientsocket.recv()
        clientsocket.send("Hello World")
        clientsocket.close()

if __name__ == '__main__':
    num_workers = 5

    socket_queue = mp.Queue()
    workers = [mp.Process(target=worker, args=(socket_queue,)) for i in range(num_workers)]
    for p in workers:
        p.daemon = True
        p.start()

    serversocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    serversocket.bind(('', 9090))
    serversocket.listen(5)

    while True:
        client, address = serversocket.accept()
        socket_queue.put(client.fileno())
edit: I am using socket.fromfd because I can't put sockets into a queue :) I need a way to access the same sockets from different processes somehow. That is the core of my problem.
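For context on the error itself: the integer returned by fileno() only names a descriptor inside the process that owns it, and the workers here were forked before any client connected, so the number coming out of the queue does not refer to an open socket in the child, and socket.fromfd fails with EBADF. Below is a minimal sketch of transferring the open descriptor itself to another process over an AF_UNIX socket pair. It assumes Python 3.9+ (for socket.send_fds / socket.recv_fds), the default fork start method on Unix, and reuses the port and worker name from the question; it handles a single connection for brevity.

import multiprocessing as mp
import socket

def worker(channel):
    # Receive one descriptor over the UNIX socket pair (sent via SCM_RIGHTS).
    msg, fds, flags, addr = socket.recv_fds(channel, 1024, 1)
    client = socket.socket(fileno=fds[0])  # adopt the received descriptor
    client.sendall(b"Hello World")
    client.close()

if __name__ == '__main__':
    parent_end, child_end = socket.socketpair()  # inherited by the forked worker
    p = mp.Process(target=worker, args=(child_end,))
    p.start()

    serversocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    serversocket.bind(('', 9090))
    serversocket.listen(5)
    client, address = serversocket.accept()
    # Pass the open descriptor itself, not just its number.
    socket.send_fds(parent_end, [b'x'], [client.fileno()])
    client.close()
    p.join()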
Answer 1:
After working on this for a while, I decided to approach this problem from a different angle, and the following method seems to be working for me.
import multiprocessing as mp
import logging
import socket
import time

logger = mp.log_to_stderr(logging.DEBUG)

def worker(socket):
    while True:
        client, address = socket.accept()
        logger.debug("{u} connected".format(u=address))
        client.send("OK")
        client.close()

if __name__ == '__main__':
    num_workers = 5

    serversocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    serversocket.bind(('', 9090))
    serversocket.listen(5)

    workers = [mp.Process(target=worker, args=(serversocket,)) for i in range(num_workers)]
    for p in workers:
        p.daemon = True
        p.start()

    while True:
        try:
            time.sleep(10)
        except:
            break
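To sanity-check this from another terminal, a throwaway client along these lines should get "OK" back (a Python 3 sketch; the host is assumed to be localhost, the port matches the value hard-coded above):

import socket

# Connect to the pre-forked server above and print its one-shot reply.
with socket.create_connection(('127.0.0.1', 9090)) as s:
    print(s.recv(1024))  # expected: b'OK'

The pattern itself is the classic pre-fork design: every worker blocks in accept() on the same listening socket and the kernel hands each new connection to exactly one of them, so nothing has to be passed between processes at all.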
Answer 2:
I'm not an expert, so I can't give the real explanation, but if you want to use queues, you need to reduce the handle and then rebuild it.
In your main:
client, address = serversocket.accept()
client_handle = multiprocessing.reduction.reduce_handle(client.fileno())
socket_queue.put(client_handle)
And in your worker:
client_handle = queue.get()
file_descriptor = multiprocessing.reduction.rebuild_handle(client_handle)
clientsocket = socket.fromfd(file_descriptor, socket.AF_INET, socket.SOCK_STREAM)
Also add the import at the top:
import multiprocessing.reduction
That will work with your original code. However, I am currently having problems with closing sockets in worker processes after they were created as I described.
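A note for Python 3 readers: reduce_handle and rebuild_handle were removed, so the snippet above is Python 2 only. A rough Unix-only equivalent is sketched below, assuming the send_handle/recv_handle helpers that multiprocessing.reduction ships in Python 3 (treat their availability as an assumption to verify for your version); it uses a single worker and a Pipe instead of a Queue, and reuses the port and payload from the question.

import socket
import multiprocessing as mp
from multiprocessing import reduction

def worker(conn):
    while True:
        fd = reduction.recv_handle(conn)   # the descriptor is duplicated into this process
        client = socket.socket(fileno=fd)  # wrap it in a socket object
        client.sendall(b"Hello World")
        client.close()                     # closes this process's copy of the descriptor

if __name__ == '__main__':
    parent_conn, child_conn = mp.Pipe()
    w = mp.Process(target=worker, args=(child_conn,), daemon=True)
    w.start()

    serversocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    serversocket.bind(('', 9090))
    serversocket.listen(5)
    while True:
        client, address = serversocket.accept()
        reduction.send_handle(parent_conn, client.fileno(), w.pid)
        client.close()                     # the worker now holds its own copy

Because the descriptor is duplicated on the way over, the worker can close its copy independently; the connection is fully shut down once the parent has closed its copy as well, which sidesteps the closing problem mentioned above.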
Answer 3:
Here is some working code based on what's mentioned above: https://gist.github.com/sunilmallya/4662837, a multiprocessing.reduction socket server in which the parent process accepts connections and passes them to the workers.
Source: https://stackoverflow.com/questions/8545307/multiprocessing-and-sockets-in-python