Processing Simultaneous/Asynchronous Requests with Python BaseHTTPServer


I've set up a threaded HTTP server (using Python threads) by creating a class that inherits from both HTTPServer and ThreadingMixIn:

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    pass
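
For context, a complete, runnable version of that kind of setup typically looks like the sketch below (the handler class, its response body, and the port are assumptions for illustration, not details from the question):

from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer
from SocketServer import ThreadingMixIn

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    pass

class ExampleHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-type", "text/plain")
        self.end_headers()
        self.wfile.write("hello\n")

if __name__ == '__main__':
    # Bind to all interfaces on port 8000 and handle each request in a new thread.
    ThreadedHTTPServer(('', 8000), ExampleHandler).serve_forever()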


        
1 Answer

    class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
        pass
    

    is enough. Your client probably doesn't make concurrent requests. If you make the requests in parallel, the threaded server works as expected. Here's the client:

    #!/usr/bin/env python
    import sys
    import urllib2
    
    from threading import Thread
    
    def make_request(url):
        # Fetch the URL and print the response body; each call runs in its own thread.
        print urllib2.urlopen(url).read()
    
    def main():
        port = int(sys.argv[1]) if len(sys.argv) > 1 else 8000
        # Fire 10 GET requests at the server in parallel.
        for _ in range(10):
            Thread(target=make_request, args=("http://localhost:%d" % port,)).start()
    
    main()
    

    And the corresponding server:

    import time
    from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer, test as _test
    from SocketServer import ThreadingMixIn
    
    
    class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
        pass
    
    # Handler whose GET takes about a second, so concurrency is easy to observe.
    class SlowHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-type", "text/plain")
            self.end_headers()
    
            self.wfile.write("Entered GET request handler")
            time.sleep(1)
            self.wfile.write("Sending response!")
    
    def test(HandlerClass = SlowHandler,
             ServerClass = ThreadedHTTPServer):
        # Delegate to BaseHTTPServer's test() helper to start the server.
        _test(HandlerClass, ServerClass)
    
    
    if __name__ == '__main__':
        test()
    

    All 10 requests finish in about 1 second. If you remove ThreadingMixIn from the server definition, the requests are handled sequentially and all 10 take about 10 seconds to complete.
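
    If you want to verify those numbers yourself, a rough timing harness like the one below measures the total elapsed time for the 10 parallel requests (a sketch that assumes the threaded server above is already listening on the same port):

    #!/usr/bin/env python
    # Rough timing harness (a sketch): runs the same 10 parallel GETs and
    # prints the total elapsed time; assumes the server above is running.
    import sys
    import time
    import urllib2
    
    from threading import Thread
    
    def make_request(url):
        urllib2.urlopen(url).read()
    
    def main():
        port = int(sys.argv[1]) if len(sys.argv) > 1 else 8000
        url = "http://localhost:%d" % port
        threads = [Thread(target=make_request, args=(url,)) for _ in range(10)]
        start = time.time()
        for t in threads:
            t.start()
        for t in threads:
            t.join()  # wait until every request has completed
        # Expect roughly 1 second with ThreadingMixIn and roughly 10 without it.
        print "%d requests took %.1f seconds" % (len(threads), time.time() - start)
    
    main()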
