recursively use scp but excluding some folders

Backend · open · 6 answers · 635 views
Asked by 庸人自扰, 2021-01-30 12:25

Assume there are some folders with the following structure:

/bench1/1cpu/p_0/image/
/bench1/1cpu/p_0/fl_1/
/bench1/1cpu/p_0

I want to copy the image directories recursively with scp while excluding the other folders (such as fl_1). How can I do that?


        
6 Answers
  • 2021-01-30 12:51

    Assuming the simplest option (installing rsync on the remote host) isn't feasible, you can use sshfs to mount the remote locally, and rsync from the mount directory. That way you can use all the options rsync offers, for example --exclude.

    Something like this should do:

    sshfs user@server: sshfsdir
    rsync --recursive --exclude=whatever sshfsdir/path/on/server /where/to/store
    

    Note that rsync's usual efficiency (transferring only changes, not everything) doesn't apply here. For that to work, rsync must read each file's contents to detect what changed, and since rsync is running on only one host, every file it examines must first be transferred in full through the sshfs mount. Excluded files, however, should not be transferred at all.

  • 2021-01-30 12:53

    You can use extended globbing, as in the example below. Note that the pattern inside !( ) is matched against a single path component, so it cannot usefully contain slashes:

    # Enable extglob
    shopt -s extglob

    # Copy everything in the current directory except the excludeme folder
    cp -rv !(excludeme) /var/destination
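    A quick local check of the extglob behavior; the /tmp/extglob_demo, keepme, and excludeme names are made up for this demo:

```shell
# Set up a throwaway tree: one directory to keep, one to exclude
mkdir -p /tmp/extglob_demo/src/keepme /tmp/extglob_demo/src/excludeme /tmp/extglob_demo/dest
touch /tmp/extglob_demo/src/keepme/a.txt /tmp/extglob_demo/src/excludeme/b.jpg

shopt -s extglob                 # enable extended globbing (bash)
cd /tmp/extglob_demo/src
# !(excludeme) expands to every entry in the current directory except excludeme
cp -rv !(excludeme) /tmp/extglob_demo/dest
```

    Only keepme ends up in the destination; excludeme is never expanded, so cp never sees it.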
    
  • 2021-01-30 12:55

    This one works fine for me, as the directory structure is not important in my case.

    scp -r USER@HOSTNAME:~/bench1/?cpu/p_?/image/ .
    

    This assumes bench1 is in the remote user's home directory. Also, change USER and HOSTNAME to the real values.
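    The ? wildcard can be sanity-checked locally by recreating the question's layout (the /tmp/glob_demo prefix is made up); with scp, the same unquoted pattern is expanded by the remote shell:

```shell
# Recreate the question's layout locally
mkdir -p /tmp/glob_demo/bench1/1cpu/p_0/image /tmp/glob_demo/bench1/1cpu/p_0/fl_1 \
         /tmp/glob_demo/bench1/2cpu/p_1/image
cd /tmp/glob_demo
# ? matches exactly one character, so only the image directories match;
# the fl_1 directories never match .../image and are skipped
ls -d bench1/?cpu/p_?/image
```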

  • 2021-01-30 13:05

    If you use a pem file to authenticate, you can use the following command (which excludes files with the something extension):

    rsync -Lavz -e "ssh -i <full-path-to-pem> -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" --exclude "*.something" --progress <path inside local host> <user>@<host>:<path inside remote host>
    

    The -L flag means follow symlinks (copy the target files, not the links). Use the full path to your pem file, not a relative one.

    Using sshfs is not recommended, since it is slow. The combination of find and scp mentioned in other answers is also a bad idea, since it opens an ssh session per file, which is far too expensive.
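    The effect of -L can be verified with a purely local rsync run; the /tmp/rsyncL_demo paths are made up, and rsync is assumed to be installed:

```shell
# A source tree containing a regular file and a symlink to it
mkdir -p /tmp/rsyncL_demo/src /tmp/rsyncL_demo/dest
echo data > /tmp/rsyncL_demo/src/real.txt
ln -sf real.txt /tmp/rsyncL_demo/src/link.txt

# -a alone would reproduce the symlink; adding -L copies the target instead
rsync -aL /tmp/rsyncL_demo/src/ /tmp/rsyncL_demo/dest/
```

    After the transfer, dest/link.txt is a regular file with the same contents as real.txt, not a symlink.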

  • 2021-01-30 13:08

    You can set GLOBIGNORE and use the pattern *:

    GLOBIGNORE='ignore1:ignore2'
    scp -r source/* remoteurl:remoteDir
    unset GLOBIGNORE


    Note that GLOBIGNORE must be set in the shell before the command runs: a one-shot VAR=value prefix only reaches the command's environment, while the glob is expanded by the shell itself before that assignment takes effect. The : character is used as the delimiter between multiple patterns. For general rules you can export GLOBIGNORE from your shell profile; for ad-hoc usage the above will do.
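    A local dry run of the same idea, with cp standing in for scp (the /tmp/globignore_demo names are made up). GLOBIGNORE patterns are matched against each whole pathname the glob produces, so the pattern must include the source/ prefix here:

```shell
mkdir -p /tmp/globignore_demo/source/keep /tmp/globignore_demo/source/ignore1 /tmp/globignore_demo/dest
cd /tmp/globignore_demo

# The glob source/* produces "source/keep" and "source/ignore1";
# GLOBIGNORE drops the latter before cp ever sees it
GLOBIGNORE='source/ignore1'
cp -r source/* /tmp/globignore_demo/dest/
unset GLOBIGNORE
```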

  • 2021-01-30 13:15

    Although scp supports recursive directory copying with the -r option, it does not support filtering of the files. There are several ways to accomplish your task, but I would probably rely on find, xargs, tar, and ssh instead of scp.

    find . -type d -wholename '*bench*/image' \
    | xargs tar cf - \
    | ssh user@remote tar xf - -C /my/dir
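
    The pipeline can be exercised locally by replacing the ssh hop with a plain pipe; everything under /tmp/tar_demo is made up for the demo:

```shell
# Same layout as the question, plus a local stand-in for the remote directory
mkdir -p /tmp/tar_demo/bench1/1cpu/p_0/image /tmp/tar_demo/bench1/1cpu/p_0/fl_1 /tmp/tar_demo/out
echo pic > /tmp/tar_demo/bench1/1cpu/p_0/image/a.ppm

cd /tmp/tar_demo
# find selects only the image directories; tar streams them to stdout,
# and the second tar (which would run on the remote over ssh) unpacks them
find . -type d -wholename '*bench*/image' \
  | xargs tar cf - \
  | tar xf - -C /tmp/tar_demo/out
```

    The image directory arrives with its full path preserved under the destination, while fl_1 is never archived.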
    

    The rsync solution can be made to work, but you are missing some arguments. rsync also needs the -r switch to recurse into subdirectories. Also, if you want the same security as scp, you need to run the transfer over ssh. Something like:

    rsync -avr -e "ssh -l user" --exclude 'fl_*' ./bench* remote:/my/dir
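
    The exclude filter can be tested locally before pointing it at the remote; the /tmp/rsync_excl_demo paths are made up, and rsync is assumed to be installed:

```shell
# Same layout as the question
mkdir -p /tmp/rsync_excl_demo/bench1/1cpu/p_0/image /tmp/rsync_excl_demo/bench1/1cpu/p_0/fl_1 /tmp/rsync_excl_demo/dest
cd /tmp/rsync_excl_demo

# --exclude 'fl_*' skips matching directories anywhere in the tree
rsync -avr --exclude 'fl_*' bench1 /tmp/rsync_excl_demo/dest/
```

    The image directory is copied in full while every fl_* directory is skipped, which is exactly what the question asks for.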
    