Question
I've got a code base with lots of this:
byte[] contents = FileUtils.FileToByteArray(FileOfGartantuanProportions);
I don't control my IIS server, so I can't see the system log or do any instrumentation; I just get to see my request fail to return (the white page of death), and sometimes a YSOD with an Out of Memory error.
Does anyone have a rule of thumb for how much data you can load into memory before IIS5 or IIS6 will kill the worker process, or just keel over and die?
Or better yet, is there an API call I can make, something like:
if (!IsEnoughMemoryFor(FileOfGartantuanProportion.Length)) throw new SomeException();
On my XP Pro workstation I can get an ASP.NET page to successfully handle a very large byte array in memory, but these results obviously aren't applicable to a real shared server.
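There is, in fact, an API close to that hypothetical check: since .NET 2.0, System.Runtime.MemoryFailPoint lets you probe for memory headroom before attempting a large allocation, throwing InsufficientMemoryException up front instead of OutOfMemoryException partway through. A minimal sketch (SafeFileLoader and LoadWholeFile are illustrative names, not part of the question's FileUtils helper):

```csharp
using System;
using System.IO;
using System.Runtime;

public static class SafeFileLoader
{
    public static byte[] LoadWholeFile(string path)
    {
        long length = new FileInfo(path).Length;
        // Round the file size up to whole megabytes for the probe.
        int megabytes = (int)(length / (1024 * 1024)) + 1;

        // MemoryFailPoint throws InsufficientMemoryException here if the
        // CLR judges the allocation is likely to fail; the gate is released
        // when disposed, so wrap only the allocation itself.
        using (new MemoryFailPoint(megabytes))
        {
            return File.ReadAllBytes(path);
        }
    }
}
```

Note that MemoryFailPoint is an estimate, not a reservation: it reduces the chance of an OutOfMemoryException mid-request but does not eliminate it.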
Answer 1:
According to Tess Ferrandez's talk at TechEd, you can start seeing Out Of Memory exceptions on a 32bit server when you have about 800MB in Private Bytes or 1.4GB in Virtual Bytes.
She also had a good post about why this is here:
A restaurant analogy
Other points she made included thinking about what you're serialising into session state: serialising a 1 MB DataSet, for example, can consume 15-20 MB of memory on the server on every page request as the data is serialised and de-serialised.
Answer 2:
With IIS6 in native mode you can configure the limits for each Application Pool.
With IIS5 it's configured via the memoryLimit attribute on the &lt;processModel&gt; element in machine.config, as a percentage of total system memory; the default is 60%.
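For reference, a minimal machine.config fragment for the ASP.NET 1.x / IIS5 process model (the value shown is the default, as a percentage of physical memory):

```xml
<!-- machine.config: aspnet_wp.exe is recycled once its memory use
     exceeds memoryLimit percent of physical RAM. -->
<system.web>
  <processModel enable="true" memoryLimit="60" />
</system.web>
```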
Answer 3:
For IIS 6, you'll likely run into the memory recycling limits PeriodicRestartPrivateMemory and PeriodicRestartMemory. I think on XP it's 60% of physical memory; at least that's what I remember from ASP.NET 1.1, and I'm not sure about 2.0.
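Those two properties live on the application pool in the IIS6 metabase (MetaBase.xml). A sketch of what the relevant entry can look like; both limits are in kilobytes, and the values and pool name here are illustrative, not recommendations:

```xml
<!-- IIS6 metabase fragment: the pool recycles its worker process when
     either threshold is crossed. PeriodicRestartMemory tracks virtual
     memory; PeriodicRestartPrivateMemory tracks private bytes. -->
<IIsApplicationPool Location="/LM/W3SVC/AppPools/MyAppPool"
    PeriodicRestartMemory="1500000"
    PeriodicRestartPrivateMemory="800000" />
```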
The YSOD is probably best handled with a try/catch around the large allocations.
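That suggestion might look like the following, using the FileUtils.FileToByteArray helper from the question (the fallback behaviour is an assumption; adapt it to your application):

```csharp
byte[] contents = null;
try
{
    contents = FileUtils.FileToByteArray(FileOfGartantuanProportions);
}
catch (OutOfMemoryException)
{
    // Degrade gracefully instead of the white page of death: log the
    // failure and return an error response. Catching OOM only really
    // helps when it came from one known large allocation like this;
    // a better long-term fix is to stream the file in chunks rather
    // than buffering it all at once.
    Response.StatusCode = 503;
    Response.End();
}
```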
Source: https://stackoverflow.com/questions/844050/rule-of-thumb-for-amount-of-usage-memory-it-takes-to-make-a-worker-process-recyc