Question
It seems that simply importing ipdb, when an HTTP request is made inside a multiprocessing Process, causes the program to exit with no errors or messages.
The following script behaves very strangely:
from multiprocessing import Process
import requests
import ipdb

def spawn():
    print("before")
    r = requests.get("http://wtfismyip.com")
    print("after")

Process(target=spawn).start()
If you run this in a terminal, the output is simply before and you are back at your prompt.
If you comment out import ipdb, everything is fine and the request is made successfully.
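To see how the child actually dies, the parent can keep a reference to the Process, join it, and print its exitcode (a negative value means the child was terminated by that signal). This is only a diagnostic sketch reusing the spawn() above, not part of the original failing script:

from multiprocessing import Process
import requests
import ipdb

def spawn():
    print("before")
    r = requests.get("http://wtfismyip.com")
    print("after")

p = Process(target=spawn)
p.start()
p.join()
# Process.exitcode is 0 on success, positive after sys.exit(n),
# and -N if the child was killed by signal N.
print("child exitcode:", p.exitcode)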
- Storing the Process instance in a variable and calling join() after start() didn't make a difference.
- This happens in both Python 2.7.10 and 3.5.0.
- It does not happen with the traditional pdb.
- Other people here and here have also had this issue. In the former I am not sure if importing ipdb was the cause. In the latter it appeared to be a package/Python version upgrade issue, but I checked that my IPython and ipdb are the current latest (4.0.0 and 0.8.1; a version-check sketch follows this list).
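For completeness, a minimal sketch of how the versions can be confirmed (assuming pkg_resources from setuptools is available; IPython exposes __version__ directly):

import IPython
import pkg_resources

# Print the installed IPython and ipdb versions.
print("IPython:", IPython.__version__)                          # reported 4.0.0
print("ipdb:", pkg_resources.get_distribution("ipdb").version)  # reported 0.8.1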
Can anyone explain why this is happening?
Source: https://stackoverflow.com/questions/33877491/python-multiprocessing-process-is-killed-by-http-request-if-ipdb-is-imported