I am writing a program that should process many small files, say thousands or even millions. I've been testing that part on 500k files, and the first step was just to iterate a directory and enumerate the files.
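For reference, here is a minimal sketch of that enumeration step, assuming Python; `os.scandir` streams directory entries and caches attributes, which avoids a separate `stat()` call per file on large directories (the root path and helper name are hypothetical):

```python
import os

def iterate_files(root):
    """Yield file paths under root, walking subdirectories iteratively.

    os.scandir() is much faster than os.listdir() + os.stat() on
    directories with hundreds of thousands of entries, because each
    DirEntry carries cached file attributes.
    """
    stack = [root]
    while stack:
        path = stack.pop()
        with os.scandir(path) as entries:
            for entry in entries:
                if entry.is_dir(follow_symlinks=False):
                    stack.append(entry.path)
                else:
                    yield entry.path

# Usage: count the entries in a test tree of ~500k files.
# total = sum(1 for _ in iterate_files(r"C:\testdata"))
```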
NTFS is slow with a large number of files, especially when they all sit in the same directory; dividing them into separate directories and subdirectories makes access faster, since each directory's index stays small. I have experience with many files stored by a video camera board (4 cameras), and it was too slow even to see the number of files and the total size (Properties on the root folder). Interestingly, when the disk was FAT32, the same operation was much faster, even though all sources say NTFS is faster... Maybe it is faster for reading a single file, but directory operations are slower.
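Not from the answer itself, but a minimal sketch of the subdirectory-splitting idea it describes, assuming Python: filenames are hashed into nested bucket directories so no single directory grows huge (the function names and the depth/width parameters are illustrative, not from the post):

```python
import hashlib
import os
import shutil

def bucket_path(root, filename, depth=2, width=2):
    """Map a filename to a nested bucket directory, e.g. root/ab/cd/.

    Spreading files across hash-named subdirectories keeps each
    directory index small; depth=2, width=2 yields 256*256 = 65536
    buckets, so 500k files average fewer than 10 per directory.
    """
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    parts = [digest[i * width:(i + 1) * width] for i in range(depth)]
    return os.path.join(root, *parts)

def store(root, src):
    """Move an existing file into its bucket under root."""
    name = os.path.basename(src)
    dest_dir = bucket_path(root, name)
    os.makedirs(dest_dir, exist_ok=True)
    shutil.move(src, os.path.join(dest_dir, name))
```

The hash only determines placement, so lookups stay O(1): to find a file again you recompute `bucket_path(root, name)` from its name.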
Why do you need so many files? I hope the directory indexing service is enabled.