Question
I am using Python requests in Celery workers to make a large number of API calls (~10/sec, including GET, POST, PUT, and DELETE). Each request takes around 5-10 s to complete.

I tried running the Celery workers with the eventlet pool at a concurrency of 1000. Since requests is blocking, each concurrent connection ends up waiting on a single request.

How do I make requests asynchronous?
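For reference, a worker configured like this is typically started with something along these lines (proj is a placeholder for the app module name):

    celery -A proj worker --pool=eventlet --concurrency=1000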
Answer 1:
Use eventlet monkey patching to make any pure Python library non-blocking.
Patch a single library:
    # import requests  # instead do this:
    import eventlet
    requests = eventlet.import_patched('requests')
The packages erequests and grequests could essentially be stripped down to these two lines.
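A minimal sketch of how the patched module behaves under concurrency, assuming a GreenPool and using httpbin.org as a stand-in for a slow API:

    import eventlet
    requests = eventlet.import_patched('requests')

    pool = eventlet.GreenPool(1000)

    def fetch(url):
        # the patched socket layer yields to other green threads while waiting on I/O
        return requests.get(url, timeout=30).status_code

    urls = ['https://httpbin.org/delay/2'] * 10  # hypothetical slow endpoint
    for status in pool.imap(fetch, urls):
        print(status)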
Patch everything:
    import eventlet
    eventlet.monkey_patch()  # must execute as early as possible

    # everything is non-blocking now:
    import requests, amqp, memcache, paramiko, redis
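In a Celery setup that would look roughly like the sketch below; the module name, broker URL, and task are placeholders. (Celery's worker typically applies the patch for you when launched with --pool=eventlet, so patching manually mainly matters for code that runs outside the worker.)

    # tasks.py (hypothetical module): patch before any other imports
    import eventlet
    eventlet.monkey_patch()

    from celery import Celery
    import requests  # already non-blocking thanks to the patch above

    app = Celery('proj', broker='amqp://localhost')  # placeholder broker URL

    @app.task
    def call_api(url):
        # while this call waits 5-10 s on the API, other green threads keep running
        return requests.get(url, timeout=30).status_code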
Update: there is a known issue with monkey patching the requests library. If you get ImportError: cannot import name utils, then modify the import line to:

    requests = eventlet.import_patched('requests.__init__')
Answer 2:
From the docs:
there are lots of projects out there that combine Requests with one of Python’s asynchronicity frameworks. Two excellent examples are grequests and requests-futures.
For eventlet specifically, you can use erequests.
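For illustration, the grequests variant (which pairs requests with gevent rather than eventlet) looks roughly like this; the URLs are placeholders:

    import grequests

    urls = ['https://httpbin.org/delay/2'] * 10  # hypothetical slow endpoints
    reqs = (grequests.get(u, timeout=30) for u in urls)
    for resp in grequests.map(reqs):
        # map() returns None for requests that failed
        print(resp.status_code if resp is not None else 'failed')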
Source: https://stackoverflow.com/questions/28315657/celery-eventlet-non-blocking-requests