Opening many small files on NTFS is way too slow

Backend · Unresolved · 5 answers · 1296 views
我寻月下人不归 2021-02-14 14:53

I am writing a program that should process many small files, say thousands or even millions. I've been testing that part on 500k files, and the first step was just to iterate a
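The iteration step is cut off above, but a common pitfall when walking huge directories on Windows/NTFS is issuing a separate `stat()` call per file. A minimal sketch (in Python, as an assumption — the question does not state a language) using `os.scandir`, whose `DirEntry` objects carry file type and size from the directory enumeration itself on Windows, avoiding that extra round-trip:

```python
import os

def total_size(root: str) -> int:
    """Sum file sizes in one directory without a per-file stat() syscall.

    On Windows, DirEntry.is_file() and DirEntry.stat() are served from
    data already fetched during directory enumeration, so this is far
    cheaper than os.listdir() followed by os.stat() on each name.
    """
    total = 0
    with os.scandir(root) as entries:
        for entry in entries:
            if entry.is_file(follow_symlinks=False):
                total += entry.stat(follow_symlinks=False).st_size
    return total
```

`os.walk` uses `scandir` internally in modern Python, so a recursive version gets the same benefit for free.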

5 Answers
  •  死守一世寂寞
    2021-02-14 15:38

    NTFS is slow with a large number of files, especially when they are all in the same directory. When they are divided into separate dirs and subdirs, access is faster. I have experience with many files stored by a video camera board (4 cameras), and it was too slow even to see the number of files and total size (Properties on the root folder). Interestingly, when the disk is FAT32, the same operation is much faster. And all sources say that NTFS is faster... Maybe it is faster for reading a single file, but directory operations are slower.

    Why do you need so many files? I hope directory indexing is enabled.
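The answer's advice to split files across subdirectories can be sketched as follows. This is one common scheme, not anything from the original thread: derive a short hash of the filename and use its leading hex digits as nested bucket directories, so no single NTFS directory grows beyond a few thousand entries. The function names and two-level layout are illustrative choices:

```python
import hashlib
import os

def bucketed_path(root: str, name: str, levels: int = 2) -> str:
    """Map a flat filename to a nested path like root/ab/cd/name.

    Bucket components come from an MD5 of the name, so the spread
    across directories is uniform and reproducible: the same name
    always maps to the same bucket, with no index to maintain.
    """
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    parts = [digest[2 * i: 2 * i + 2] for i in range(levels)]
    return os.path.join(root, *parts, name)

def store(root: str, name: str, data: bytes) -> str:
    """Write data under its bucketed path, creating dirs as needed."""
    path = bucketed_path(root, name)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return path
```

With two levels of two hex digits, 500k files land in up to 65,536 leaf directories, roughly 8 files each, instead of one directory with half a million entries.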
