Question
I just need to download a file from a Python script, so the code is simple:
import pysftp
sftp = pysftp.Connection('test_host','test',password='test')
sftp.get('testfile', 'c:\\tmp\\testfile3')
The download starts and proceeds at normal speed for a few seconds, then stalls. Nothing more is downloaded, and after a few minutes the connection is closed by the server. The admin of the server I am connecting to won't disclose any details about it for 'security reasons', but wrote to me that on his end he sees an error like this:
SFTP error sending, too many simultaneous client requests. Client has exceeded the server's internal buffers.
Is there anything I can do about it from the python code?
Answer 1:
This is another manifestation of the MAX_REQUEST_SIZE problem described in this post: Paramiko Fails to download large files >1GB
So I, too, changed, in Paramiko's sftp_file.py:
MAX_REQUEST_SIZE = 32768
to
MAX_REQUEST_SIZE = 1024
and magically, the problem seems to be solved.
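
If you would rather not edit Paramiko's sftp_file.py by hand, the same effect can probably be achieved by overriding the constant at runtime before the transfer starts. This is only a sketch, assuming MAX_REQUEST_SIZE is still a class attribute of paramiko.sftp_file.SFTPFile in your Paramiko version; the host, credentials, and paths are the placeholders from the question:

import paramiko.sftp_file
import pysftp

# Assumption: MAX_REQUEST_SIZE is a class attribute of SFTPFile. Lowering it
# makes each outstanding read request smaller, so the client keeps less data
# in flight and stays within the server's internal buffers.
paramiko.sftp_file.SFTPFile.MAX_REQUEST_SIZE = 1024

sftp = pysftp.Connection('test_host', 'test', password='test')
sftp.get('testfile', 'c:\\tmp\\testfile3')
sftp.close()

This keeps the change local to the script instead of modifying the installed library, so it is not lost on a Paramiko upgrade.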
Source: https://stackoverflow.com/questions/29628133/pysftp-download-fails-because-of-client-exceeded-servers-internal-buffers