Argument list too long error for rm, cp, mv commands

长情又很酷 2020-11-22 04:50

I have several hundred PDFs under a directory in UNIX. The names of the PDFs are really long (approx. 60 chars).

When I try to delete all of them at once with a wildcard (e.g. rm -f *.pdf), the command fails with an "Argument list too long" error.

27 Answers
  • 2020-11-22 05:19

    The error is not a limitation of rm itself: the kernel limits the total size of the argument list that can be passed to a single command (ARG_MAX), and the expanded *.pdf glob exceeds it.

    One workaround is to run rm several times with narrower patterns, for example:

    rm -f A*.pdf
    rm -f B*.pdf
    rm -f C*.pdf
    ...
    rm -f *.pdf
    

    You can also remove them all with the find command:

    find . -name "*.pdf" -exec rm {} \;
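    Running rm once per file via -exec is slow for hundreds of files. As a variant (a sketch, not part of the original answer), you can stream the names to xargs, which packs as many of them as fit under the ARG_MAX limit into each rm invocation:

```shell
# Stream the matching names NUL-separated so spaces and newlines in
# filenames survive; xargs then batches them into as few rm calls as
# possible. -r (GNU xargs) skips running rm when nothing matched.
find . -maxdepth 1 -name '*.pdf' -print0 | xargs -0 -r rm -f
```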
    
  • 2020-11-22 05:19

    If the filenames contain spaces or special characters, use:

    find -maxdepth 1 -name '*.pdf' -exec rm "{}" \;
    

    This command searches all files in the current directory only (-maxdepth 1) with the .pdf extension (-name '*.pdf'), and then deletes each one (-exec rm "{}").

    find substitutes each filename for {} and passes it to rm as a single argument, so names containing spaces or special characters are handled correctly.
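    A small variant worth knowing (not mentioned in the answer above): terminating -exec with + instead of \; makes find itself batch the filenames, so rm runs far fewer times:

```shell
# '+' appends as many pathnames as fit onto one rm command line,
# so rm runs once per large batch instead of once per file.
find . -maxdepth 1 -name '*.pdf' -exec rm -f {} +
```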

  • 2020-11-22 05:19

    A somewhat safer version than using xargs, and also not recursive: ls -p | grep -v '/$' | grep '\.pdf$' | while IFS= read -r file; do rm "$file"; done (IFS= and -r keep leading whitespace and backslashes in names intact; filenames containing newlines will still break it).

    Filtering out directories here is slightly redundant, since rm without -r refuses to delete them anyway, and it can be dropped for simplicity; but why run something that is guaranteed to fail?
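    Since the "Argument list too long" error comes from the exec() call, another way to avoid parsing ls at all (a sketch, assuming a POSIX shell) is to let the shell expand the glob internally and call rm once per file:

```shell
# The glob is expanded inside the shell itself (no exec() involved),
# and each rm call receives exactly one name, so ARG_MAX never applies.
for f in ./*.pdf; do
    [ -e "$f" ] || continue   # skip the literal pattern when nothing matches
    rm -- "$f"                # '--' protects names that begin with '-'
done
```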

  • 2020-11-22 05:20

    To remove only the first 100 files (note: this needs command substitution, not single quotes, and it breaks on filenames containing spaces):

    rm -rf $(ls | head -100)

  • 2020-11-22 05:20

    The option below seems a simple fix for this problem. I got it from another thread, and it helped me.

    for file in /usr/op/data/Software/temp/application/openpages-storage/*; do
        cp "$file" /opt/sw/op-storage/
    done
    

    Just run the loop above and it will do the job.
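    The reason the loop works: the wildcard is expanded by the shell itself, and ARG_MAX only bounds the argument bytes handed to a single exec() call (here, each individual cp). You can check that limit on your system:

```shell
# ARG_MAX is the kernel's ceiling on the combined size of the argument
# list and environment passed to one exec() call.
getconf ARG_MAX
```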

  • 2020-11-22 05:22

    Or you can try:

    find . -name '*.pdf' -exec rm -f {} \;
    