Argument list too long error for rm, cp, mv commands

长情又很酷 2020-11-22 04:50

I have several hundred PDFs under a directory in UNIX. The names of the PDFs are really long (approx. 60 chars).

When I try to delete all of them together with a single rm command (for example, rm -f *.pdf), it fails with an "Argument list too long" error. What is the best way to delete these files?

27 answers
  • 2020-11-22 05:08

    I was facing the same problem while copying from a source directory to a destination directory.

    The source directory had roughly 3 lakh (300,000) files.

    I used cp with the -r option and it worked for me:

    cp -r abc/ def/

    It copies all files from abc to def without raising the "Argument list too long" error.
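    For background, the error comes from the shell expanding *.pdf (or abc/*) into one huge argument list before the command is even started; passing the directory itself, as above, sidesteps that expansion. As a rough sanity check (a sketch, reusing the example abc/ directory), you can compare the kernel's limit against the size of the expanded file list:

    # Kernel limit on the combined size of arguments + environment (bytes)
    getconf ARG_MAX

    # Approximate number of bytes a glob over abc/ would have to pass as arguments
    ls abc/ | wc -c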

  • 2020-11-22 05:10

    You can try this:

    for f in *.pdf
    do
      rm "$f"
    done
    

    EDIT: ThiefMaster's comment suggests I shouldn't disclose such a dangerous practice to young shell Jedis, so I'll add a "safer" version (to preserve things when someone has a file named "-rf . ..pdf"):

    echo "# Whooooo" > /tmp/dummy.sh
    for f in *.pdf
    do
       echo "rm -i \"$f\""
    done >> /tmp/dummy.sh
    

    After running the above, open /tmp/dummy.sh in your favorite editor and check every single line for dangerous filenames, commenting them out if found.

    Then copy the dummy.sh script into your working directory and run it.

    All of this is for safety.
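    A minimal sketch of that review-and-run step (the /tmp/dummy.sh path comes from the snippet above; the grep pattern is only an illustrative check for names starting with a dash):

    # Skim the generated script before executing anything
    less /tmp/dummy.sh

    # Flag entries whose filenames begin with a dash, e.g. "-rf . ..pdf"
    grep -n 'rm -i "-' /tmp/dummy.sh

    # Once reviewed, run it from the directory containing the PDFs
    bash /tmp/dummy.sh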

  • 2020-11-22 05:10

    You could use a bash array:

    files=(*.pdf)                    # the glob is expanded by the shell itself, so ARG_MAX does not apply here
    for ((I = 0; I < ${#files[@]}; I += 1000)); do
        rm -f "${files[@]:I:1000}"   # pass a slice of at most 1000 names to each rm call
    done
    

    This way it deletes the files in batches of 1,000 per rm invocation.
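    For comparison (not part of this answer), a similar batching effect is commonly achieved with xargs, which splits its input into as many rm invocations as needed; printf is a shell builtin, so this glob expansion is also not subject to the argument-length limit:

    # NUL-delimit the names so filenames with spaces or newlines survive intact
    printf '%s\0' *.pdf | xargs -0 rm -f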

  • 2020-11-22 05:13

    You can create a temp folder, move all the files and sub-folders you want to keep into it, then delete the old folder and rename the temp folder to the old name. Try this example until you are confident enough to do it live:

    mkdir testit
    cd testit
    mkdir big_folder tmp_folder
    touch big_folder/file1.pdf
    touch big_folder/file2.pdf
    mv big_folder/file1.pdf tmp_folder/
    rm -r big_folder
    mv tmp_folder big_folder
    

    The rm -r big_folder removes all files in big_folder, no matter how many there are. You just have to be very careful that you have first moved out everything you want to keep; in this example that was file1.pdf.

  • 2020-11-22 05:16

    I faced a similar problem when an application had created millions of useless log files that filled up all the inodes. I resorted to locate, got all the files listed into a text file, and then removed them one by one. It took a while, but it did the job!
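    A minimal sketch of that approach, assuming the stale files match a *.log pattern and that the locate database is current (run updatedb first if needed); the pattern and list path are hypothetical:

    # Collect the matching paths into a list file
    locate '*.log' > /tmp/stale_files.txt

    # Remove them one at a time, so no single command line grows too long
    while IFS= read -r f; do
        rm -f -- "$f"
    done < /tmp/stale_files.txt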

  • 2020-11-22 05:18

    I only know a way around this. The idea is to export the list of PDF files you have into a file, split that file into several parts, and then remove the PDF files listed in each part.

    ls | grep '\.pdf$' > list.txt
    wc -l list.txt
    

    wc -l counts how many lines list.txt contains. Once you have an idea of how long it is, you can decide to split it in half, in quarters, or so on, using the split -l command. For example, split it into chunks of 600 lines each:

    split -l 600 list.txt
    

    This will create a few files named xaa, xab, xac and so on, depending on how you split it. Now, to "import" each of those lists into the rm command, use this:

    rm $(<xaa)
    rm $(<xab)
    rm $(<xac)
    
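    If there are many chunks, a short loop (a sketch that assumes the default xaa, xab, ... names produced by split above) saves typing each rm by hand:

    # Delete the files listed in every chunk, one chunk per rm invocation;
    # like the rm $(<xaa) lines above, this relies on filenames without spaces
    for chunk in x??; do
        rm $(<"$chunk")
    done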

    Sorry for my bad English.
