total size of group of files selected with 'find'

Submitted by 一个人想着一个人 on 2019-12-28 08:07:23

Question


For instance, I have a large filesystem that is filling up faster than I expected. So I look for what's being added:

find /rapidly_shrinking_drive/ -type f -mtime -1 -ls | less

And I find, well, lots of stuff. Thousands of files of six or seven types. I can single out a type and count them:

find /rapidly_shrinking_drive/ -name "*offender1*" -mtime -1 -ls | wc -l

but what I'd really like is to be able to get the total size on disk of these files:

find /rapidly_shrinking_drive/ -name "*offender1*" -mtime -1 | howmuchspace

I'm open to a Perl one-liner for this, if someone's got one, but I'm not going to use any solution that involves a multi-line script, or File::Find.
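
Since a Perl one-liner is explicitly acceptable, here is a minimal sketch of something that could stand in for the hypothetical "howmuchspace" above (it assumes find supports -print0 and sums apparent file sizes in bytes):

# Read NUL-separated file names from find, add up their sizes, print the total.
find /rapidly_shrinking_drive/ -name "*offender1*" -mtime -1 -print0 | perl -0ne 'chomp; $total += -s $_; END { printf "%d bytes\n", $total }'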


Answer 1:


The command du tells you about disk usage. Example usage for your specific case:

find rapidly_shrinking_drive/ -name "offender1" -mtime -1 -print0 | du --files0-from=- -hc | tail -n1

(Previously I wrote du -hs, but on my machine that appears to disregard find's input and instead summarises the size of the cwd.)
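
For an exact byte count rather than a human-readable total, the same pipeline can be adapted with du's apparent-size option (a sketch assuming GNU find and GNU du):

# -b (--bytes) reports apparent size in bytes; -c appends a grand-total line.
find rapidly_shrinking_drive/ -name "offender1" -mtime -1 -print0 | du --files0-from=- -cb | tail -n1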




Answer 2:


Darn, Stephan202 is right. I didn't think about du -s (summarize), so instead I used awk:

find rapidly_shrinking_drive/ -name "offender1" -mtime -1 -print0 | xargs -0 du | awk '{total+=$1} END{print total}'

I like the other answer better though, and it's almost certainly more efficient.




Answer 3:


With GNU find:

 find /path -name "offender" -printf "%s\n" | awk '{t+=$1}END{print t}'
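
If a human-readable total is wanted, one option is to post-process the byte count with numfmt (a sketch assuming GNU coreutils):

# numfmt turns the raw byte total into a figure such as "45M".
find /path -name "offender" -printf "%s\n" | awk '{t+=$1} END{print t}' | numfmt --to=iec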



Answer 4:


I'd like to promote jason's comment above to the status of an answer, because I believe it's the most mnemonic (though not the most generic, if you really need the file list specified by find; a recursive-glob variant is sketched after the examples below):

$ du -hs *.nc
6.1M  foo.nc
280K  foo_region_N2O.nc
8.0K  foo_region_PS.nc
844K  foo_region_xyz.nc
844K  foo_region_z.nc
37M   ETOPO1_Ice_g_gmt4.grd_region_zS.nc
$ du -ch *.nc | tail -n 1
45M total
$ du -cb *.nc | tail -n 1
47033368  total
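
If the files of interest are spread across subdirectories, the same mnemonic approach can be extended with a recursive glob instead of find (a sketch assuming bash 4+ with the globstar option):

# **/*.nc matches .nc files in this directory and any subdirectory.
shopt -s globstar
du -ch **/*.nc | tail -n 1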



Answer 5:


I tried all of these commands but had no luck. Then I found this one, which gives me an answer:

find . -type f -mtime -30 -exec ls -l {} \; | awk '{ s+=$5 } END { print s }'
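
A variant that avoids parsing ls output, and that passes many files per invocation via -exec ... {} +, might look like this (a sketch assuming GNU stat; sizes are apparent sizes in bytes):

# stat -c %s prints each file's size in bytes; awk adds them up.
find . -type f -mtime -30 -exec stat -c %s {} + | awk '{ s += $1 } END { print s }'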



Answer 6:


You could also use ls -l to list the sizes, then awk to extract and total them:

find /rapidly_shrinking_drive/ -name "offender1" -mtime -1 -exec ls -l {} + | awk '{ total += $5 } END { print total }'


Source: https://stackoverflow.com/questions/1134245/total-size-of-group-of-files-selected-with-find
