I have some directories containing test data, typically over 200,000 small (~4k) files per directory.
I am using C# code to get the number of files in a directory.
I had a very similar problem with a directory containing (we think) ~300,000 files.
After messing with lots of methods for speeding up access (all unsuccessful) we solved our access problems by reorganising the directory into something more hierarchical.
We did this by creating directories a-z, representing the first letter of the filename, then sub-directories a-z under each of those for the second letter. Then we inserted each file into the matching directory; e.g. `gbp32.dat` went in `g/b/gbp32.dat`, and we re-wrote our file access routines accordingly. This made a massive difference, and it was relatively trivial to do (I think we moved each file using a 10-line Perl script).
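The original Perl move script isn't shown; as a sketch of the same idea, the two-level reorganisation could look like this in Python (the directory layout follows the description above, but the `shard_path` helper and function names are illustrative):

```python
import os
import shutil

def shard_path(root, filename):
    # Map a filename to a two-level directory based on its first two
    # characters, e.g. "gbp32.dat" -> "g/b/gbp32.dat". Assumes the name
    # has at least two characters, as in the example above.
    first, second = filename[0].lower(), filename[1].lower()
    return os.path.join(root, first, second, filename)

def reorganise(src_dir, dest_root):
    # Move every file in the flat source directory into the hierarchy.
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        if not os.path.isfile(src) or len(name) < 2:
            continue  # skip sub-directories and too-short names
        dest = shard_path(dest_root, name)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        shutil.move(src, dest)
```

With ~300,000 files spread over 26×26 buckets, each leaf directory ends up holding only a few hundred entries, which is what makes per-file lookups fast again.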