pysftp

Recursive download with pysftp

穿精又带淫゛_ submitted on 2021-01-24 07:25:05
Question: I'm trying to fetch from SFTP a directory with the following structure:

    main_dir/
        dir1/
            file1
        dir2/
            file2

I tried to achieve this with the commands below:

    sftp.get_r(main_path + dirpath, local_path)

or

    sftp.get_d(main_path + dirpath, local_path)

The local path is like d:/grabbed_files/target_dir, and the remote path is like /data/some_dir/target_dir. With get_r I get a FileNotFound exception. With get_d I get an empty directory (when the target dir contains files rather than directories, it works fine). I'm totally sure that
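A frequently reported cause on Windows is that pysftp's get_r composes remote paths with the local OS path separator, while SFTP paths always use forward slashes. One workaround is to walk the remote tree manually. A minimal sketch, assuming an open pysftp Connection named sftp (listdir_attr and get are part of pysftp's API; the helper name get_r_portable is hypothetical):

    import os
    import stat

    def get_r_portable(sftp, remotedir, localdir):
        # Walk the remote tree ourselves instead of relying on get_r,
        # joining remote paths with "/" and local paths with os.path.join.
        for entry in sftp.listdir_attr(remotedir):
            remotepath = remotedir + "/" + entry.filename
            localpath = os.path.join(localdir, entry.filename)
            if stat.S_ISDIR(entry.st_mode):
                # Create the local directory if needed, then recurse.
                if not os.path.isdir(localpath):
                    os.makedirs(localpath)
                get_r_portable(sftp, remotepath, localpath)
            else:
                sftp.get(remotepath, localpath)

Called as get_r_portable(sftp, "/data/some_dir/target_dir", "d:/grabbed_files/target_dir"), this downloads the whole tree the same way on any platform.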

Slow upload of many small files with SFTP

家住魔仙堡 submitted on 2020-12-30 03:01:11
Question: When uploading 100 files of 100 bytes each with SFTP, it takes 17 seconds here (measured after the connection is established; I don't even count the initial connection time). That means 17 seconds to transfer only 10 KB, i.e. 0.59 KB/sec! I know that sending SSH commands to open, write, close, etc. probably creates a big overhead, but still, is there a way to speed up the process when sending many small files with SFTP? Or a special mode in paramiko / pysftp to keep all the write operations
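Because each file costs its own open/write/close round trips, one way to hide the per-file latency is to run several uploads in parallel. A minimal sketch, assuming the server allows multiple concurrent connections and that host, username, password, and a files list of (local_path, remote_path) pairs are defined elsewhere (parallel_upload and upload_batch are hypothetical helper names):

    from concurrent.futures import ThreadPoolExecutor
    import pysftp

    def upload_batch(batch, host, username, password):
        # One connection per worker thread: the SSH handshake cost is
        # paid once per worker instead of once per file.
        with pysftp.Connection(host, username=username, password=password) as sftp:
            for local_path, remote_path in batch:
                sftp.put(local_path, remote_path)

    def parallel_upload(files, host, username, password, workers=4):
        # Deal the files round-robin into one batch per worker.
        batches = [files[i::workers] for i in range(workers)]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            for batch in batches:
                pool.submit(upload_batch, batch, host, username, password)

If you have shell access on the server, bundling the files into a single archive, uploading that, and unpacking remotely avoids the per-file round trips entirely.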

“IOError: size mismatch in get!” when retrieving files via SFTP

匆匆过客 submitted on 2020-11-29 03:57:29
Question: I have a script which I use to retrieve specific files via SFTP on a regular basis. On occasion, the script errors out with the following output:

    Traceback (most recent call last):
      File "ETL.py", line 304, in <module>
        get_all_files(startdate, enddate, "vma" + foldernumber + "/logs/", txtype[1] + single_date2 + ".log", txtype[2] + foldernumber + "\\", sftp)
      File "ETL.py", line 283, in get_all_files
        sftp.get(sftp_dir + filename, local_dir + filename)
      File "C:\Python27\lib\site-packages
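Paramiko raises this IOError from SFTPClient.get when the number of bytes actually transferred differs from the size reported by stat, which typically happens when the remote file is still being written while it is downloaded. A minimal mitigation sketch, assuming an open connection named sftp (the helper name get_when_stable and the polling interval are assumptions, not part of any library API):

    import time

    def get_when_stable(sftp, remote_path, local_path, interval=5):
        # Poll the remote size until two consecutive stat() calls agree,
        # then download; this avoids fetching a file mid-write.
        last_size = -1
        while True:
            size = sftp.stat(remote_path).st_size
            if size == last_size:
                break
            last_size = size
            time.sleep(interval)
        sftp.get(remote_path, local_path)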