Question
Using this curl command from Bash, I am able to get the response I am looking for:
curl -v -u z:secret_key --proxy http://proxy.net:80 \
-H "Content-Type: application/json" https://service.com/data.json
I have already seen this other post on proxies with the Requests module, and it helped me formulate my code in Python. But I need to make the request via a proxy, and even while supplying the proper proxies it isn't working. Perhaps I'm just not seeing something?
>>> requests.request('GET', 'https://service.com/data.json',
...                  headers={'Content-Type': 'application/json'},
...                  proxies={'http': 'http://proxy.net:80', 'https': 'http://proxy.net:80'},
...                  auth=('z', 'secret_key'))
Furthermore, at the same Python console I can use urllib to make a request and have it succeed.
>>> import urllib
>>> urllib.urlopen("http://www.httpbin.org").read()
---results---
Even trying requests against a plain non-HTTPS address fails:
>>> requests.get('http://www.httpbin.org')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Python/2.6/site-packages/requests/api.py", line 79, in get
    return request('get', url, **kwargs)
  File "/Library/Python/2.6/site-packages/requests/api.py", line 66, in request
    prefetch=prefetch
  File "/Library/Python/2.6/site-packages/requests/sessions.py", line 191, in request
    r.send(prefetch=prefetch)
  File "/Library/Python/2.6/site-packages/requests/models.py", line 454, in send
    raise ConnectionError(e)
requests.exceptions.ConnectionError: Max retries exceeded for url:
Requests is so elegant and awesome but how could it be failing in this instance?
Answer 1:
The problem actually lies with Python's standard URL libraries - urllib/urllib2/httplib. I can't remember which one is the exact culprit, but for simplicity's sake let's just call it urllib. Unfortunately, urllib doesn't implement the HTTP CONNECT method, which is required for accessing an https site through an http(s) proxy. My efforts to add that functionality on top of urllib have not been successful (it has been a while since I tried). So unfortunately the only option I know to work in this case is to use pycurl.
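For reference, here is a minimal pycurl sketch of the same request as the curl command in the question (the proxy, credentials, URL, and header are taken from the question; the rest is ordinary pycurl usage, so treat it as an illustration rather than a drop-in fix):

import pycurl
from io import BytesIO

buf = BytesIO()
c = pycurl.Curl()
c.setopt(pycurl.URL, 'https://service.com/data.json')
c.setopt(pycurl.PROXY, 'http://proxy.net:80')                    # HTTP proxy; libcurl tunnels https URLs via CONNECT
c.setopt(pycurl.USERPWD, 'z:secret_key')                         # equivalent of curl -u z:secret_key
c.setopt(pycurl.HTTPHEADER, ['Content-Type: application/json'])  # equivalent of curl -H
c.setopt(pycurl.WRITEFUNCTION, buf.write)                        # collect the response body
c.perform()
print(c.getinfo(pycurl.RESPONSE_CODE))
print(buf.getvalue())
c.close()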
However, there is a relatively clean alternative that exposes almost exactly the same API as python-requests but uses a pycurl backend instead of the Python standard libraries.
The library is called human_curl. I've used it myself and have had great results.
Answer 2:
Believing the above answer, we tried human_curl.
human_curl gave vague errors such as "Unknown error", whereas urllib3 gave meaningful errors such as "Request timed out" and "Max retries exceeded with url".
So we went back to urllib3, which is also thread-safe. We are happy with urllib3.
The only problem now is that we still get "Max retries exceeded"; we can't solve it and guess it might have to do with the server/proxy, but we are not sure.
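For completeness, a minimal urllib3 sketch of the same proxied request with an explicit, bounded retry policy (the proxy, credentials, and URL come from the question; the Retry values are illustrative assumptions, and a reasonably recent urllib3 is assumed since its proxy/CONNECT handling has improved over the years):

import urllib3
from urllib3.util.retry import Retry

# Basic-auth header for the target service, plus the JSON content type.
headers = urllib3.util.make_headers(basic_auth='z:secret_key')
headers['Content-Type'] = 'application/json'

# Route requests through the HTTP proxy; https URLs are tunnelled via CONNECT.
proxy = urllib3.ProxyManager('http://proxy.net:80')

# A small, explicit Retry budget makes "Max retries exceeded" surface quickly,
# with the underlying reason attached, instead of retrying for a long time.
resp = proxy.request(
    'GET',
    'https://service.com/data.json',
    headers=headers,
    retries=Retry(total=3, backoff_factor=0.5),
)
print(resp.status)
print(resp.data)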
Source: https://stackoverflow.com/questions/8482896/making-http-requests-via-python-requests-module-not-working-via-proxy-where-curl