Does bottle handle requests with no concurrency?

不思量自难忘° 2021-02-06 06:39

At first, I thought Bottle would handle requests concurrently, so I wrote the test code below:

import json
from bottle import Bottle, run, request, response, get, post         
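
A minimal sketch of what such a test might look like, with a deliberately slow endpoint so the lack of overlap is easy to observe (the /sleep route and the 5-second delay are illustrative assumptions, not the original test code):

import time
from bottle import Bottle, run

app = Bottle()

@app.get('/sleep')
def slow():
    # Simulate a slow, IO-bound request. If requests were handled
    # concurrently, two clients hitting /sleep at the same time would
    # both finish in roughly 5 seconds; with a serial server the
    # second one takes about 10.
    time.sleep(5)
    return {'status': 'done'}   # Bottle serializes dicts to JSON

# No server specified, so Bottle falls back to its default adapter.
run(app, host='localhost', port=8080)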


        
1 Answer
  • 2021-02-06 07:12

    Concurrency isn't a function of your web framework -- it's a function of the web server you use to serve it. Since Bottle is WSGI-compliant, you can serve Bottle apps through any WSGI server:

    • wsgiref (reference server in the Python stdlib) will give you no concurrency.
    • CherryPy dispatches through a thread pool (number of simultaneous requests = number of threads it's using).
    • nginx + uwsgi gives you multiprocess dispatch and multiple threads per process.
    • Gevent gives you lightweight coroutines that, in your use case, can easily achieve C10K+ with very little CPU load (on Linux -- on Windows it can only handle 1024 simultaneous open sockets) if your app is mostly IO- or database-bound.

    The latter two can serve massive numbers of simultaneous connections.
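
    Because a Bottle application object is itself a WSGI callable, you can hand it straight to any of these servers. A minimal sketch using gevent's WSGI server (assumes gevent is installed; the route and port are placeholders):

        # Monkey-patch the stdlib first so blocking calls (socket IO,
        # time.sleep) yield to other greenlets instead of blocking the process.
        from gevent import monkey
        monkey.patch_all()

        from gevent.pywsgi import WSGIServer
        from bottle import Bottle

        app = Bottle()   # a plain WSGI callable

        @app.get('/hello')
        def hello():
            return 'hello\n'

        # Any WSGI server can serve the app; gevent handles each
        # request in a lightweight greenlet.
        WSGIServer(('0.0.0.0', 8080), app).serve_forever()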

    According to http://bottlepy.org/docs/dev/api.html, when given no specific instructions, bottle.run uses wsgiref to serve your application, which explains why it only handles one request at a time.
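
    If you stick with bottle.run, the server keyword selects a different adapter instead of the wsgiref default; a short sketch, assuming gevent is installed:

        from gevent import monkey
        monkey.patch_all()   # patch before other imports so IO cooperates

        from bottle import Bottle, run

        app = Bottle()

        @app.get('/hello')
        def hello():
            return 'hello\n'

        # server defaults to 'wsgiref'; 'gevent' selects Bottle's gevent
        # adapter. Other registered names include 'cherrypy', 'paste'
        # and 'gunicorn'.
        run(app, host='localhost', port=8080, server='gevent')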
