How can I recursively count files in a Linux directory?
I found this:
find DIR_NAME -type f | wc -l
But when I run this it returns
On my computer, rsync is a little bit faster than the find | wc -l in the accepted answer:
$ rsync --stats --dry-run -ax /path/to/dir /tmp
Number of files: 173076
Number of files transferred: 150481
Total file size: 8414946241 bytes
Total transferred file size: 8414932602 bytes
The second line has the number of files, 150,481 in the above example. As a bonus you get the total size as well (in bytes).
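If you want that count as a bare number for scripting, you can extract it from the stats output with awk. A sketch: the stats text is inlined here for illustration, assuming the output format shown above; in practice you would pipe the rsync command itself into awk.

```shell
# Sample rsync --stats output (inlined; normally you'd pipe rsync into awk).
stats='Number of files: 173076
Number of files transferred: 150481
Total file size: 8414946241 bytes
Total transferred file size: 8414932602 bytes'

# Split each line on ": " and print the value of the transferred-files line.
count=$(printf '%s\n' "$stats" | awk -F': ' '/^Number of files transferred/ {print $2}')
echo "$count"   # 150481
```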
Remarks:
- The --dry-run (or -n for short) option is important: without it, rsync would actually transfer the files!
- The -x option means "don't cross filesystem boundaries", so if you run it on / with external hard disks attached, it will only count the files on the root partition.

If what you need is to count a specific file type recursively, you can do:
find YOUR_PATH -name '*.html' -type f | wc -l
wc -l
counts the lines of find's output — one line per file.
If you need to exclude certain folders, use -not -path
find . -not -path './node_modules/*' -name '*.js' -type f | wc -l
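Note that wc -l literally counts lines, so a filename containing a newline would be counted twice. With GNU find you can sidestep that by printing one character per file and counting characters instead; a sketch (-printf is a GNU extension, and the scratch directory here is only for illustration):

```shell
# Build a scratch tree with a known number of files, purely to demonstrate.
tmp=$(mktemp -d)
touch "$tmp/a" "$tmp/b" "$tmp/c"

# Emit one 'x' per file and count characters, so newlines embedded in
# filenames cannot inflate the count the way `wc -l` would.
find "$tmp" -type f -printf x | wc -c

rm -rf "$tmp"
```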
With bash:
Create an array of entries with ( ) and get its length with ${#array[@]}.
FILES=(./*); echo ${#FILES[@]}
OK, that doesn't recursively count files, but I wanted to show the simple option first. A common use case is creating rollover backups of a file. This will create logfile.1, logfile.2, logfile.3 and so on.
CNT=(./logfile*); mv logfile logfile.${#CNT[@]}
Recursive count with bash 4+ globstar
enabled (as mentioned by @tripleee)
FILES=(**/*); echo ${#FILES[@]}
To get the count of files recursively, we can still use find in the same way:
FILES=($(find . -type f)); echo ${#FILES[@]}
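Beware that this version word-splits find's output, so filenames containing spaces inflate the count. With bash 4.4+ you can read NUL-delimited output into the array instead; a sketch (the scratch directory is only for illustration):

```shell
# Create two files, one with a space in its name, to show the pitfall.
tmp=$(mktemp -d)
touch "$tmp/plain" "$tmp/with space"

# mapfile -d '' splits on NUL bytes, so spaces and even newlines in
# filenames cannot break the count (requires bash 4.4+).
mapfile -d '' files < <(find "$tmp" -type f -print0)
echo "${#files[@]}"   # 2

rm -rf "$tmp"
```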
This alternative approach, filtering by filename pattern, counts all available GRUB kernel modules:
ls -l /boot/grub/*.mod | wc -l
tree $DIR_PATH | tail -1
Sample Output:
5309 directories, 2122 files
find -type f | wc -l
or, if the directory is the current directory:
find . -type f | wc -l