According to this Microsoft article, the lookup time for a directory increases in proportion to the square of the number of entries. (Although that was a bug filed against NT 3.5.)
A similar question was asked on the old Joel on Software forum. One answer was that performance seems to drop somewhere between 1,000 and 3,000 files, and one poster hit a hard limit at 18,000 files. Still another post claims that 300,000 files are possible, but that search times degrade rapidly once all the 8.3 short filenames are used up.
To avoid large directories, create one, two or more levels of subdirectories and hash the files into them. The simplest kind of hash uses the letters of the filename: a file named abc0001.txt would be placed at a\b\c\abc0001.txt, assuming you chose 3 levels of nesting. 3 is probably overkill; using two characters per directory reduces the number of nesting levels, e.g. ab\abc0001.txt. You only need a second level of nesting if you anticipate that any one subdirectory will itself end up with vastly more than ca. 3000 files.
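As a rough sketch of the scheme above, here is one way it could look in Python. The function name `hashed_path` and the parameter choices are my own for illustration; it simply takes the leading characters of the filename as subdirectory names:

```python
import os

def hashed_path(root, filename, chars_per_level=2, levels=1):
    """Build a nested path from the filename's leading characters,
    e.g. abc0001.txt -> root/ab/abc0001.txt (one level, two chars)."""
    name = os.path.basename(filename)
    parts = [name[i * chars_per_level:(i + 1) * chars_per_level].lower()
             for i in range(levels)]
    return os.path.join(root, *parts, name)

# One level of two-character buckets: data/ab/abc0001.txt
path = hashed_path("data", "abc0001.txt")

# Three levels of one-character buckets: data/a/b/c/abc0001.txt
deep = hashed_path("data", "abc0001.txt", chars_per_level=1, levels=3)

# Create the bucket directory before writing the file
os.makedirs(os.path.dirname(path), exist_ok=True)
```

With a roughly even spread of filenames, each level of two-character buckets divides the directory size by up to 36^2 (letters plus digits), so one level is usually enough to stay well under a few thousand files per directory.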