How to find the largest file in a directory and its subdirectories?

醉梦人生 2020-11-28 18:29

We're just starting a UNIX class and are learning a variety of Bash commands. Our assignment involves performing various commands on a directory that has a number of folders

15 Answers
  • 2020-11-28 19:00

    There is no single built-in command to find the largest files/directories on a Linux/UNIX/BSD filesystem. However, by combining the following three commands (using pipes) you can easily list the largest files:

    # du -a /var | sort -n -r | head -n 10
    

    If you want more human readable output try:

    $ cd /path/to/some/var
    $ du -hsx * | sort -rh | head -10
    

    Where,

    • var is the directory you want to search
    • du command -h option : display sizes in human readable format (e.g., 1K, 234M, 2G).
    • du command -s option : show only a total for each argument (summary).
    • du command -x option : skip directories on different file systems.
    • sort command -r option : reverse the result of comparisons.
    • sort command -h option : compare human readable numbers. This is a GNU sort-specific option.
    • head command -10 OR -n 10 option : show the first 10 lines.
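    A quick way to see the pipeline end to end is to run it against a throwaway directory. This is a minimal sketch using -k (sizes in kilobytes) with a plain numeric sort -n, which also works on systems whose sort lacks the GNU-only -h flag; the demo paths are created just for the example.

    ```shell
    #!/bin/sh
    # Minimal sketch: du | sort | head on a throwaway directory.
    # Uses -k (kilobytes) so a plain numeric sort works even without GNU sort -h.
    demo=$(mktemp -d)
    mkdir -p "$demo/big" "$demo/small"
    dd if=/dev/zero of="$demo/big/file" bs=1024 count=200 2>/dev/null
    dd if=/dev/zero of="$demo/small/file" bs=1024 count=10 2>/dev/null
    du -k -s "$demo"/* | sort -n -r | head -10   # largest entries first
    rm -rf "$demo"
    ```

    The first line of output should name the big/ subdirectory, since it holds the largest total.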
  • 2020-11-28 19:00

    On Solaris I use:

    find . -type f -ls|sort -nr -k7|awk 'NR==1{print $7,$11}' #formatted
    

    or

    find . -type f -ls | sort -nrk7 | head -1 #unformatted
    

    because nothing else posted here worked for me there. This will find the largest file in $PWD and its subdirectories.
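    The -k7 and $7/$11 fields come from the fixed column layout of find -ls output (inode, blocks, mode, links, owner, group, size, date, name): the size in bytes is column 7 and the path is column 11. A small sketch on a throwaway directory, assuming a find that supports -ls:

    ```shell
    #!/bin/sh
    # Sketch: confirm that column 7 of `find -ls` is the size in bytes
    # and column 11 is the path, using a throwaway directory.
    demo=$(mktemp -d)
    printf 'hello' > "$demo/a.txt"                                  # 5 bytes
    dd if=/dev/zero of="$demo/b.bin" bs=1024 count=50 2>/dev/null   # 51200 bytes
    find "$demo" -type f -ls | sort -nr -k7 | awk 'NR==1{print $7, $11}'
    rm -rf "$demo"
    ```

    This should print the size 51200 followed by the path of b.bin, the larger of the two demo files.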

  • 2020-11-28 19:05
    find . -type f | xargs ls -lS | head -n 1
    

    outputs

    -rw-r--r--  1 nneonneo  staff  9274991 Apr 11 02:29 ./devel/misc/test.out
    

    If you just want the filename:

    find . -type f | xargs ls -1S | head -n 1
    

    This avoids using awk and allows you to use whatever flags you want in ls.

    Caveat. Because xargs tries to avoid building overlong command lines, this might fail if you run it on a directory with a lot of files because ls ends up executing more than once. It's not an insurmountable problem (you can collect the head -n 1 output from each ls invocation, and run ls -S again, looping until you have a single file), but it does mar this approach somewhat.
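    If GNU find is available, one way to sidestep the batching issue entirely is to have find print each file's size itself, so the whole list flows through a single sort with no ls invocations at all. A sketch, assuming GNU find's -printf (not portable to BSD/Solaris find):

    ```shell
    #!/bin/sh
    # Sketch: avoid multiple ls invocations by letting find emit "size path"
    # directly. GNU find's -printf is not portable to BSD/Solaris find.
    find . -type f -printf '%s %p\n' | sort -nr | head -n 1
    ```

    Because every file contributes exactly one line to a single stream, the result is correct no matter how many files are under the directory.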
