Fast Linux file count for a large number of files
I'm trying to figure out the best way to find the number of files in a particular directory when there is a very large number of files (> 100,000). With that many files, running "ls | wc -l" takes quite a long time to execute, I believe because it returns the names of all the files. I want to generate as little disk I/O as possible. I have experimented with some shell and Perl scripts, to no avail. Any ideas?

By default ls sorts the names, which can take a while if there are a lot of them. Also, there will be no output until all of the names have been read and sorted.
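For example, a minimal sketch that simply turns the sort off (ls -f disables sorting; note that it also implies -a, so the count will include ".", "..", and any dotfiles):

    # count directory entries without sorting; total is off by 2
    # (plus dotfiles) because -f implies -a
    ls -f | wc -l

Since the question mentions Perl, here is a one-liner along the same lines, sketched under the assumption that skipping "." and ".." matters to you; it reads the directory directly with readdir instead of spawning ls:

    perl -e 'opendir my $dh, "." or die $!; my $n = grep { $_ ne "." && $_ ne ".." } readdir $dh; print "$n\n"'

Neither approach stats each entry, so the disk I/O should be limited mostly to reading the directory itself.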