fopen problem - too many open files


Question


I have a multithreaded application running on Win XP. At a certain stage, one of the threads fails to open an existing file using the fopen function. _get_errno returns EMFILE, which means "Too many open files. No more file descriptors are available." FOPEN_MAX for my platform is 20. _getmaxstdio returns 512. I checked this with WinDbg and I see that about 100 files are open:

788 Handles
Type            Count
Event           201
Section         12
File            101
Port            3
Directory       3
Mutant          32
WindowStation   2
Semaphore       351
Key             12
Thread          63
Desktop         1
IoCompletion    6
KeyedEvent      1

What is the reason that fopen fails?
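
A minimal diagnostic sketch (not from the original post; the function name is illustrative) that logs both the CRT stream limit and the process handle count at the moment fopen fails. GetProcessHandleCount assumes Windows XP SP1 or later:

// Diagnostic sketch: log the CRT stream limit and the raw Win32 handle
// count when fopen() fails. Error handling is kept to a minimum.
#include <windows.h>
#include <cstdio>
#include <cerrno>

FILE* OpenWithDiagnostics(const char* path, const char* mode)
{
    FILE* f = std::fopen(path, mode);
    if (f == 0)
    {
        int err = 0;
        _get_errno(&err);

        DWORD handleCount = 0;
        GetProcessHandleCount(GetCurrentProcess(), &handleCount);

        std::fprintf(stderr,
            "fopen('%s') failed: errno=%d, CRT stream limit=%d, "
            "Win32 handles in process=%lu\n",
            path, err, _getmaxstdio(), handleCount);
    }
    return f;
}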


EDIT:

I wrote a simple single-threaded test application. This app can open 510 files. I don't understand why this app can open more files than the multithreaded app. Could it be because of file handle leaks?

#include <cstdio>
#include <cassert>
#include <cerrno>

int main()
{
    int counter(0);

    // Keep opening new files until fopen fails, i.e. until the CRT
    // runs out of stream slots or the OS runs out of handles.
    while (true)
    {
        char buffer[256] = {0};
        sprintf(buffer, "C:\\temp\\abc\\abc%d.txt", counter++);
        FILE* hFile = fopen(buffer, "wb+");
        if (0 == hFile)
        {
            // fopen failed: inspect the error code and the CRT limit
            int err(0);
            errno_t ret = _get_errno(&err);
            assert(0 == ret);
            int maxAllowed = _getmaxstdio();
            (void)maxAllowed; // inspected in the debugger
            assert(hFile);    // fires here; counter - 1 files were opened
            break;            // avoid an infinite loop in release builds
        }
    }
    return 0;
}
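
If the suspicion about file handle leaks is right, one way to rule them out is to make fclose automatic. A minimal RAII sketch, assuming C++11; the names are illustrative and this is not part of the original question:

// RAII sketch: a FILE* owner that always calls fclose, so a stream
// cannot leak on early returns or exceptions.
#include <cstdio>
#include <memory>

struct FileCloser
{
    void operator()(FILE* f) const { if (f) std::fclose(f); }
};

typedef std::unique_ptr<FILE, FileCloser> FilePtr;

FilePtr OpenFile(const char* path, const char* mode)
{
    return FilePtr(std::fopen(path, mode));
}

// Usage: the stream is closed automatically when 'file' goes out of scope.
// FilePtr file = OpenFile("C:\\temp\\abc\\abc0.txt", "wb+");
// if (!file) { /* handle EMFILE or other errors */ }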

Answer 1:


On Win32, all of the CRT functions ultimately end up calling the Win32 API underneath, so in this case fopen is most probably using CreateFile/OpenFile. Now, the CreateFile/OpenFile API is not meant only for files; it also handles directories, communication ports, pipes, mailslots, drive volumes, and so on. So in a real application, depending on how many of these resources are in use, your maximum number of open files may vary. Since you have not described much about the application, this is my first guess. If time permits, go through this: http://blogs.technet.com/b/markrussinovich/archive/2009/09/29/3283844.aspx
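
To make this point concrete, the sketch below (MSVC-specific, not from the original answer) shows that every CRT FILE* is backed by a Win32 HANDLE, obtained here via _fileno and _get_osfhandle; the path is only an example:

// Sketch: map a CRT stream to its underlying OS handle. _fileno gives the
// CRT file descriptor, _get_osfhandle gives the Win32 HANDLE behind it.
#include <windows.h>
#include <cstdio>
#include <io.h>

int main()
{
    FILE* f = std::fopen("C:\\temp\\example.txt", "wb+");
    if (f)
    {
        HANDLE h = reinterpret_cast<HANDLE>(_get_osfhandle(_fileno(f)));
        std::printf("FILE* %p is backed by Win32 handle %p\n",
                    static_cast<void*>(f), static_cast<void*>(h));
        std::fclose(f);
    }
    return 0;
}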




Answer 2:


I guess this is a limitation of your operating system. It can depend on many things: the way the file descriptors are represented, the memory they consume, and so on.

And I suppose there isn't much you can do about it, although perhaps there is some parameter you can tweak to raise that limit.
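
There is indeed such a parameter on the MSVC CRT: _setmaxstdio raises the limit on simultaneously open stdio streams. A small sketch; note that it only affects FILE* streams, not the Win32 handle limit, and will not help if handles are being leaked:

// Sketch: query and raise the MSVC CRT limit on simultaneously open
// FILE* streams.
#include <cstdio>

int RaiseStreamLimit(int desired)
{
    int current = _getmaxstdio();           // 512 by default on this CRT
    if (desired > current)
    {
        // _setmaxstdio returns the new maximum, or -1 on failure.
        int result = _setmaxstdio(desired); // up to 2048 on older CRTs
        if (result != -1)
            return result;
    }
    return current;
}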

The real question is: do you really need to open that many files simultaneously? Even if you have 100+ threads trying to read 100+ different files, they probably won't be able to read them all at the same time, and you probably won't get any better result than with, say, 50 threads.

It's difficult to be more precise since we don't know what you are trying to achieve.
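
If the application really does have many threads each opening its own file, one way to follow the suggestion above is to cap how many files are open at once with a counting semaphore. A minimal sketch, assuming C++11; the class name and the cap of 50 are illustrative, not part of either answer:

// Throttle sketch: a counting semaphore built from a mutex and condition
// variable limits how many files the process keeps open at the same time.
#include <condition_variable>
#include <cstdio>
#include <mutex>

class OpenFileThrottle
{
public:
    explicit OpenFileThrottle(int maxOpen) : slots_(maxOpen) {}

    FILE* Open(const char* path, const char* mode)
    {
        std::unique_lock<std::mutex> lock(m_);
        cv_.wait(lock, [this] { return slots_ > 0; });
        --slots_;
        lock.unlock();

        FILE* f = std::fopen(path, mode);
        if (!f)
            Release();   // the open failed, give the slot back immediately
        return f;
    }

    void Close(FILE* f)
    {
        std::fclose(f);
        Release();
    }

private:
    void Release()
    {
        std::lock_guard<std::mutex> lock(m_);
        ++slots_;
        cv_.notify_one();
    }

    std::mutex m_;
    std::condition_variable cv_;
    int slots_;
};

// Usage (illustrative): a shared throttle capping the process at 50 open files.
// static OpenFileThrottle throttle(50);
// FILE* f = throttle.Open("data.bin", "rb");
// if (f) { /* read */ throttle.Close(f); }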



Source: https://stackoverflow.com/questions/3184345/fopen-problem-too-many-open-files
