Argument list too long error for rm, cp, mv commands

长情又很酷 2020-11-22 04:50

I have several hundred PDFs under a directory in UNIX. The names of the PDFs are really long (approx. 60 chars).

When I try to delete all PDFs together with a single command (rm -f *.pdf), I get an "Argument list too long" error.
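
Presumably the shell expands *.pdf into an argument list larger than the kernel will accept. The limit can be inspected with getconf (a diagnostic sketch; the exact value varies by system):

    getconf ARG_MAX   # maximum combined size of arguments and environment, in bytes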

27 Answers
  • 2020-11-22 05:25

    I found that for extremely large lists of files (>1e6), these answers were too slow. Here is a solution using parallel processing in Python. I know, I know, this isn't Linux... but nothing else here worked.

    (This saved me hours)

    # delete files in parallel
    import glob
    import multiprocessing as mp
    import os

    directory = r'your/directory'
    os.chdir(directory)

    # collect the target file names
    file_names = glob.glob('*.pdf')

    # report errors raised inside the pool workers
    def callback_error(result):
        print('error', result)

    # delete a single file; os.remove avoids spawning a shell per file
    # and handles names containing spaces or shell metacharacters
    def delete_file(file_name):
        os.remove(file_name)

    if __name__ == '__main__':
        # create and drain the pool inside the main guard so spawned
        # workers do not re-run the submission loop on import
        pool = mp.Pool(12)  # or mp.Pool(mp.cpu_count())
        for file_name in file_names:
            # print(file_name)  # optional: very chatty for millions of files
            pool.apply_async(delete_file, [file_name],
                             error_callback=callback_error)
        pool.close()
        pool.join()  # wait for all deletions to finish
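    For comparison, the same fan-out can be done in plain shell (a sketch assuming an xargs that supports -P, as GNU and BSD xargs do): find emits NUL-terminated names and xargs batches them into twelve concurrent rm invocations, sidestepping the argument-list limit entirely.

    find . -maxdepth 1 -name '*.pdf' -print0 | xargs -0 -P 12 rm -f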
  • 2020-11-22 05:27

    I'm surprised there are no ulimit answers here. Every time I have this problem I end up searching for the same fix. I understand this solution has limitations, but ulimit -s 65536 often does the trick for me.
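
    A quick sketch of that workaround: on Linux the space allowed for a program's argument list is tied to the stack size limit, so raising the stack limit for the current shell session raises the argument limit with it.

    ulimit -s        # show the current stack size limit, in kilobytes
    ulimit -s 65536  # raise it for this shell session only
    rm -f *.pdf      # the glob now fits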

  • 2020-11-22 05:28

    I had the same problem with a folder full of temporary images that was growing day by day; this command helped me clear the folder:

    find . -name "*.png" -mtime +50 -exec rm {} \;
    

    The difference from the other commands is the -mtime parameter, which selects only files older than X days (50 days in the example).

    By running it multiple times, decreasing the day range on each execution, I was able to remove all the unnecessary files; a batched variant of the same command is sketched below.
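
    As a variant (a sketch; terminating -exec with + is POSIX, while -delete needs GNU or BSD find), the per-file rm processes can be avoided by batching many names into each rm invocation, or by letting find unlink the files itself:

    find . -name "*.png" -mtime +50 -exec rm {} +
    find . -name "*.png" -mtime +50 -delete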
