Question
I have a CherryPy web server that needs to be able to receive large files over HTTP POST. I have something working at the moment, but it fails once the file being sent gets too big (around 200 MB). I'm using curl to send test POST requests, and when I try to send a file that's too big, curl spits out "The entity sent with the request exceeds the maximum allowed bytes." Searching around, this seems to be an error from CherryPy.
So I'm guessing that the file needs to be sent in chunks? I tried something with mmap, but I couldn't get it to work. Does the method that handles the file upload need to be able to accept the data in chunks too?
Answer 1:
I took DirectToDiskFileUpload as a starting point. The changes it makes to handle big uploads are:
- server.max_request_body_size to 0 (default 100MB),
- server.socket_timeout to 60 (default 10s),
- response.timeout to 3600 (default 300s),
- avoiding a double copy by using tempfile.NamedTemporaryFile.
There are also some useless actions taken to supposedly avoid holding the upload in memory: they disable standard CherryPy body processing and use cgi.FieldStorage manually instead. This is unnecessary, because cherrypy._cpreqbody.Part.maxrambytes already handles it:
The threshold of bytes after which point the Part will store its data in a file instead of a string. Defaults to 1000, just like the cgi module in Python's standard library.
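As an illustration of that knob (my addition, not part of the original answer), the attribute can be tuned globally; the value below is a hypothetical example that keeps parts up to 64 KiB in RAM while anything larger still spills to a temporary file:

import cherrypy

# Parts bigger than this many bytes are spooled to a temporary file
# instead of being held in memory (CherryPy's default is 1000 bytes).
cherrypy._cpreqbody.Part.maxrambytes = 64 * 1024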
I've experimented with the following code (run with Python 2.7.4 and CherryPy 3.6) and a 1.4 GB file. Memory usage (in gnome-system-monitor) never exceeded 10 MiB. According to the number of bytes actually written to disk, write_bytes in cat /proc/PID/io is almost exactly the size of the file. With the standard cherrypy._cpreqbody.Part and shutil.copyfileobj it is obviously doubled.
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
import tempfile

import cherrypy


config = {
    'global': {
        'server.socket_host': '127.0.0.1',
        'server.socket_port': 8080,
        'server.thread_pool': 8,
        # remove any limit on the request body size; cherrypy's default is 100MB
        'server.max_request_body_size': 0,
        # increase server socket timeout to 60s; cherrypy's default is 10s
        'server.socket_timeout': 60
    }
}


class NamedPart(cherrypy._cpreqbody.Part):

    def make_file(self):
        return tempfile.NamedTemporaryFile()

cherrypy._cpreqbody.Entity.part_class = NamedPart


class App:

    @cherrypy.expose
    def index(self):
        return '''<!DOCTYPE html>
            <html>
            <body>
                <form action='upload' method='post' enctype='multipart/form-data'>
                    File: <input type='file' name='videoFile'/> <br/>
                    <input type='submit' value='Upload'/>
                </form>
            </body>
            </html>
        '''

    @cherrypy.config(**{'response.timeout': 3600})  # default is 300s
    @cherrypy.expose()
    def upload(self, videoFile):
        assert isinstance(videoFile, cherrypy._cpreqbody.Part)

        destination = os.path.join('/home/user/', videoFile.filename)
        # Note that the original link will be deleted by tempfile.NamedTemporaryFile
        os.link(videoFile.file.name, destination)

        # Double copy with standard ``cherrypy._cpreqbody.Part``
        #import shutil
        #with open(destination, 'wb') as f:
        #    shutil.copyfileobj(videoFile.file, f)

        return 'Okay'


if __name__ == '__main__':
    cherrypy.quickstart(App(), '/', config)
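For a quick end-to-end test of the upload handler above, here is a client-side sketch (my addition, not part of the answer) using the third-party requests library. Note that plain requests builds the whole multipart body in memory before sending, so for really large files a streaming client such as curl -F is a better fit; this is just for exercising the endpoint. The file path is a placeholder.

import requests  # third-party: pip install requests

# 'videoFile' matches the field name expected by the upload() handler.
with open('/tmp/sample.bin', 'rb') as f:
    resp = requests.post('http://127.0.0.1:8080/upload',
                         files={'videoFile': ('sample.bin', f)})
print(resp.status_code, resp.text)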
Answer 2:
Huge file uploads are always problematic. What would you do when the connection closes in the middle of an upload? Use a chunked file upload method instead.
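This answer doesn't show code, but the idea is roughly: the client splits the file and POSTs each piece separately, and the server appends the pieces, so an interrupted transfer can resume from the last completed chunk instead of starting over. A rough client-side sketch under those assumptions follows; the /upload_chunk endpoint, its parameters, and the chunk size are all hypothetical, not something CherryPy provides out of the box.

import os
import requests  # third-party: pip install requests

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB per request; an arbitrary choice


def upload_in_chunks(path, url='http://127.0.0.1:8080/upload_chunk'):
    """POST *path* piece by piece. The server is assumed to append
    chunks for the same filename in the order of the given offsets."""
    name = os.path.basename(path)
    with open(path, 'rb') as f:
        offset = 0
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            requests.post(url,
                          params={'filename': name, 'offset': offset},
                          data=chunk,
                          headers={'Content-Type': 'application/octet-stream'})
            offset += len(chunk)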
Source: https://stackoverflow.com/questions/13002676/python-sending-and-receiving-large-files-over-post-using-cherrypy