pysftp download fails because the client exceeded the server's internal buffers


Question


I just need to download a file from a Python script, so the code is simple:

import pysftp

# Connect with username/password and download a single remote file
sftp = pysftp.Connection('test_host', username='test', password='test')
sftp.get('testfile', 'c:\\tmp\\testfile3')

The download starts and proceeds for a few seconds at normal speed, then stalls. Nothing more is downloaded, and after a few minutes the connection is closed by the server. The admin of the server I am connecting to won't disclose any details about it for 'security reasons', but wrote me that on his end he can see an error like this:

SFTP error sending, too many simultaneous client requests. Client has exceeded the server's internal buffers.

Is there anything I can do about it from the Python code?


Answer 1:


This is another manifestation of the MAX_REQUEST_SIZE problem described in this post: Paramiko Fails to download large files >1GB

So I, too, changed the following in Paramiko's sftp_file.py:

MAX_REQUEST_SIZE = 32768

to

MAX_REQUEST_SIZE = 1024

and magically, the problem seems to be solved.
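If you'd rather not edit the installed library, the same change can be applied as a monkey-patch from your own script before starting the transfer, since MAX_REQUEST_SIZE is a class attribute on Paramiko's SFTPFile (which pysftp uses under the hood). This is only a sketch of that idea, assuming the attribute's name and location haven't changed in your Paramiko version:

import paramiko.sftp_file
import pysftp

# Assumption: MAX_REQUEST_SIZE is still a class attribute of paramiko's SFTPFile
# (pysftp drives Paramiko under the hood). This is the same tweak as editing
# sftp_file.py by hand, just applied at runtime.
paramiko.sftp_file.SFTPFile.MAX_REQUEST_SIZE = 1024

with pysftp.Connection('test_host', username='test', password='test') as sftp:
    sftp.get('testfile', 'c:\\tmp\\testfile3')

The patch has to run before the download starts; because it is a class attribute, it affects every SFTP file Paramiko opens in that process.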



Source: https://stackoverflow.com/questions/29628133/pysftp-download-fails-because-of-client-exceeded-servers-internal-buffers
