If I just browse some pages on the app, it sits at around 500MB. Many of these pages access the database but at this point in time, I only have roughly a couple of rows each for
I suggest trying the Ionic.Zip library. I use it on one of our sites that has a requirement to download multiple files into one unit.
I recently tested it with a group of files where one of the files was as large as 600MB.
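For illustration, a minimal sketch of how that can look in an MVC action with Ionic.Zip (DotNetZip). The controller name, file path and archive name are placeholders, not from the original site:

```csharp
using System.Web.Mvc;
using Ionic.Zip;

public class DownloadController : Controller
{
    public void DownloadAll()
    {
        Response.Clear();
        Response.ContentType = "application/zip";
        Response.AddHeader("Content-Disposition", "attachment; filename=files.zip");
        Response.BufferOutput = false; // stream the archive instead of buffering it

        using (var zip = new ZipFile())
        {
            // Entries are read and compressed while the archive is being written,
            // so even a 600MB file is not held in memory all at once.
            zip.AddFile(Server.MapPath("~/App_Data/large-file.bin"), "");
            zip.Save(Response.OutputStream);
        }
    }
}
```

The key point is that `ZipFile.Save` writes directly to the response stream, so the archive never has to exist as one large in-memory object.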
You may need to read the data in chunks and write it to the output stream. Take a look at SqlDataReader.GetBytes: http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqldatareader.getbytes(v=vs.110).aspx
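A sketch of that chunked-read pattern, assuming an illustrative `Files` table with a `Content` varbinary column (the names and connection string are not from the question):

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

static void StreamFileToOutput(string connectionString, int fileId, Stream output)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "SELECT Content FROM Files WHERE Id = @id", connection))
    {
        command.Parameters.AddWithValue("@id", fileId);
        connection.Open();

        // SequentialAccess tells ADO.NET not to buffer the whole row in memory.
        using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            if (!reader.Read()) return;

            var buffer = new byte[81920]; // stays below the 85K LOH threshold
            long offset = 0;
            long bytesRead;
            while ((bytesRead = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, (int)bytesRead);
                offset += bytesRead;
            }
        }
    }
}
```

Because only one small buffer is alive at a time, the file never materializes as a single large `byte[]` on the Large Object Heap.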
Because the file is large, it is allocated on the Large Object Heap, which is only collected with a gen2 collection (you can see this in your profile: the purple blocks are the large object heap, and you can see it collected after 10 seconds).
On your production server, you most likely have much more memory than on your local machine. Because there is less memory pressure, the collections won't occur as frequently, which explains why it would add up to a higher number - there are several files on the LOH before it gets collected.
I wouldn't be surprised at all if, across the different buffers in MVC and EF, some data also gets copied around in unsafe blocks, which would explain the unmanaged memory growth (the thin spike for EF, the wide plateau for MVC).
Finally, a 500MB baseline is not completely surprising for a large project (madness! but true!).
So a quite probable answer to your question of why it uses so much memory is "because it can". In other words, there is no memory pressure to force a gen2 collection, so the downloaded files sit unused on your large object heap until a collection evicts them, and memory is abundant on your production server.
This is probably not even a real problem: if there were more memory pressure, there would be more collection, and you'd see lower memory usage.
As for what to do about it, I'm afraid you're out of luck with the Entity Framework: as far as I know, it has no streaming API. Web API does allow streaming the response, by the way, but that won't help you much if the whole large object is sitting in memory anyway (though it might help somewhat with the unmanaged memory in the parts of MVC I haven't explored).
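For reference, response streaming in Web API can be done with PushStreamContent. This is only a sketch, and it only pays off if the data source can itself be read incrementally (e.g. via SqlDataReader rather than an EF-materialized byte array); the `WriteFileInChunks` helper is a placeholder:

```csharp
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Http;

public class FilesController : ApiController
{
    public HttpResponseMessage GetFile(int id)
    {
        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            // The delegate runs as the response is sent, writing directly
            // to the network stream instead of buffering the body.
            Content = new PushStreamContent((outputStream, content, context) =>
            {
                WriteFileInChunks(id, outputStream); // placeholder: incremental read
                outputStream.Close(); // closing the stream ends the response
            })
        };
        response.Content.Headers.ContentType =
            new MediaTypeHeaderValue("application/octet-stream");
        return response;
    }

    private static void WriteFileInChunks(int id, Stream output) { /* ... */ }
}
```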
This could be one of a few things:
As your file is rather large, is stored in your database, and you are getting it via Entity Framework, you are caching this data in a few places. Each EF request caches that data until your context is disposed. When you return the file from the action, the data is then loaded again and streamed to the client. All of this happens in ASP.NET, as explained already.
A solution to this issue is not to stream large files directly from the database with EF and ASP.NET. A better solution is to use a background process to cache large files locally to the website and then have the client download them with a direct URL. This allows IIS to manage the streaming, saves your website a request and saves a lot of memory.
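A minimal sketch of the "cache locally, then hand off to IIS" idea. The cache folder, file naming and the existence of a background job that populates it are all assumptions for illustration:

```csharp
using System.Web.Mvc;

public class FileController : Controller
{
    public ActionResult Download(int id)
    {
        // A background process is assumed to have written the file here already.
        var cachedPath = Server.MapPath("~/FileCache/" + id + ".bin");

        if (!System.IO.File.Exists(cachedPath))
            return HttpNotFound(); // or queue the caching job and tell the client to retry

        // Redirect to a static URL so IIS streams the file itself; this request
        // never touches EF and never buffers the bytes in managed memory.
        return Redirect(Url.Content("~/FileCache/" + id + ".bin"));
    }
}
```

In practice you would also need cache invalidation and access control on the static folder, which this sketch deliberately leaves out.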
OR (less likely)
Seeing that you are using Visual Studio 2013, this sounds awfully like a Page Inspector issue. What happens is that when you run your website with IIS Express from Visual Studio, Page Inspector caches all of the response data - including that of your file - causing a lot of memory to be used. Try adding:
<appSettings>
<add key="PageInspector:ServerCodeMappingSupport" value="Disabled" />
</appSettings>
to your web.config to disable Page Inspector, and see if that helps.
TL;DR
Cache the large file locally and let the client download the file directly. Let IIS handle the hard work for you.
Apparently, that consists of System.Web and all its children taking up around 200MB. This is quoted as the absolute minimum for your application pool.
Our web application using EF 6, with a model consisting of 220+ entities in .Net 4.0 starts up at around 480MB idle. We perform some AutoMapper operations at startup. Memory consumption peaks and then returns to around 500MB in daily use. We've just accepted this as the norm.
Now, for your file download spikes. The issue under web forms when using an ashx handler or the like was explored in this question: ASP.net memory usage during download
I don't know how that relates to the FileActionResult in MVC, but you can see that the buffer size needed to be controlled manually to minimise the memory spike. Try to apply the principles behind the answer from that question by:
Response.BufferOutput = false;
var stream = new MemoryStream(file);
stream.Position = 0;
return new FileStreamResult(stream, type); // Or just pass the "file" parameter as a stream
After applying this change, what does the memory behaviour look like?
See 'Debugging memory problems (MSDN)' for more details.
Add a GC.Collect() to the Dispose method for testing purposes. If the leak stays, it is a real leak. If it vanishes, it was just delayed GC.
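A sketch of that diagnostic, assuming it goes in the controller's Dispose override. This is for testing only; forcing full collections in production hurts performance:

```csharp
using System;
using System.Web.Mvc;

public class DownloadController : Controller
{
    protected override void Dispose(bool disposing)
    {
        base.Dispose(disposing);

        // Diagnostic only: force a full, blocking collection (including the LOH)
        // so that memory held merely by delayed GC disappears immediately.
        GC.Collect();
        GC.WaitForPendingFinalizers(); // let finalizers release their resources
        GC.Collect();                  // collect anything the finalizers freed
    }
}
```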
You did that and said:
@usr Memory usage now hardly reaches 600MB. So really just delayed?
Clearly, there is no memory leak if GC.Collect removes the memory that you were worried about. If you want to make really sure, run your test 10 times. Memory usage should be stable.
Processing such big files as a single chunk can multiply memory usage as the file travels through the different components and frameworks. Switching to a streaming approach can be a good idea.
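The difference can be illustrated with a plain chunked copy, assuming the source can be opened as a Stream:

```csharp
using System.IO;

static void CopyInChunks(Stream source, Stream destination)
{
    // Only one small buffer is alive at a time, instead of the whole file
    // existing as a byte[] on the Large Object Heap in each component.
    var buffer = new byte[81920]; // below the 85K LOH threshold
    int read;
    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        destination.Write(buffer, 0, read);
}
```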