How to deliver big files in ASP.NET Response?

Asked by 慢半拍i on 2020-11-28 03:28

I am not looking for any alternative to streaming file contents from the database; rather, I am looking for the root of the problem. This was running fine till IIS

4 Answers
  • 2020-11-28 03:32

    This piece of code works for me. It starts streaming data to the client immediately, so the browser shows progress during the download. It doesn't violate HTTP: the Content-Length header is specified and chunked transfer encoding is not used.

    protected void PrepareResponseStream(string clientFileName, HttpContext context, long sourceStreamLength)
    {
        context.Response.ClearHeaders();
        context.Response.Clear();
    
        context.Response.ContentType = "application/pdf";
        context.Response.AddHeader("Content-Disposition", string.Format("filename=\"{0}\"", clientFileName));
    
        //set cachebility to private to allow IE to download it via HTTPS. Otherwise it might refuse it
        //see reason for HttpCacheability.Private at http://support.microsoft.com/kb/812935
        context.Response.Cache.SetCacheability(HttpCacheability.Private);
        context.Response.Buffer = false;
        context.Response.BufferOutput = false;
        context.Response.AddHeader("Content-Length", sourceStreamLength.ToString    (System.Globalization.CultureInfo.InvariantCulture));
    }
    
    protected void WriteDataToOutputStream(Stream sourceStream, long sourceStreamLength, string clientFileName, HttpContext context)
    {
        PrepareResponseStream(clientFileName, context, sourceStreamLength);
        const int BlockSize = 4 * 1024 * 1024;
        byte[] buffer = new byte[BlockSize];
        int bytesRead;
        Stream outStream = context.Response.OutputStream;
        while ((bytesRead = sourceStream.Read(buffer, 0, BlockSize)) > 0)
        {
            outStream.Write(buffer, 0, bytesRead);
        }
        outStream.Flush();
    }
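
    For context, here is a minimal sketch of how these two methods might be hosted and called from an IHttpHandler. The DownloadHandler class, the file path, and the use of a FileStream as the source are assumptions for illustration; in the original scenario the source stream may come from a database instead.

        using System.IO;
        using System.Web;

        // Hypothetical handler hosting the two methods above.
        public class DownloadHandler : IHttpHandler
        {
            public bool IsReusable { get { return false; } }

            public void ProcessRequest(HttpContext context)
            {
                // Placeholder path; swap in whatever produces your source stream.
                string path = context.Server.MapPath("~/App_Data/report.pdf");
                using (FileStream source = File.OpenRead(path))
                {
                    WriteDataToOutputStream(source, source.Length, "report.pdf", context);
                }
            }

            // PrepareResponseStream and WriteDataToOutputStream from above go here.
        }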
    
  • 2020-11-28 03:36

    What I would do is use the not-so-well-known ASP.NET Response.TransmitFile method, as it's very fast (it possibly uses the IIS kernel cache) and takes care of the header work. It is based on the unmanaged Windows TransmitFile API.

    To be able to use this API, though, you need a physical file to transfer. So here is pseudo C# code that explains how to do this with a fictional myCacheFilePath physical file path. It also supports client-side caching. Of course, if you already have a file at hand, you don't need to create that cache:

        if (!File.Exists(myCacheFilePath))
        {
            LoadMyCache(...); // saves the file to disk. don't do this if your source is already a physical file (not stored in a db for example).
        }
    
        // we suppose user-agent (browser) cache is enabled
        // check appropriate If-Modified-Since header
        DateTime ifModifiedSince = DateTime.MaxValue;
        string ifm = context.Request.Headers["If-Modified-Since"];
        if (!string.IsNullOrEmpty(ifm))
        {
            try
            {
                ifModifiedSince = DateTime.Parse(ifm, DateTimeFormatInfo.InvariantInfo);
            }
            catch
            {
                // do nothing
            }
    
            // file has not changed, just send this information but truncate milliseconds
            if (ifModifiedSince == TruncateMilliseconds(File.GetLastWriteTime(myCacheFilePath)))
            {
                ResponseWriteNotModified(...); // HTTP 304
                return;
            }
        }
    
        Response.ContentType = contentType; // set your file content type here
        Response.AddHeader("Last-Modified", File.GetLastWriteTimeUtc(myCacheFilePath).ToString("r", DateTimeFormatInfo.InvariantInfo)); // tell the client to cache that file
    
        // TransmitFile uses lower Windows levels directly, so sending a file is not memory/CPU intensive; it also caches files in the kernel.
        Response.TransmitFile(myCacheFilePath);
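
    The helpers referenced in the pseudo code (TruncateMilliseconds and ResponseWriteNotModified) are not defined in the answer; here is a minimal sketch of what they might look like, with assumed signatures:

        // Hypothetical implementations of the helpers used in the pseudo code above.
        private static DateTime TruncateMilliseconds(DateTime value)
        {
            // If-Modified-Since has one-second resolution, so drop sub-second ticks.
            return value.AddTicks(-(value.Ticks % TimeSpan.TicksPerSecond));
        }

        private static void ResponseWriteNotModified(HttpContext context)
        {
            // Tell the client its cached copy is still valid; a 304 carries no body.
            context.Response.StatusCode = 304;
            context.Response.StatusDescription = "Not Modified";
            context.Response.SuppressContent = true;
        }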
    
  • 2020-11-28 03:42

    The correct way to deliver big files from IIS involves the following options:

    1. Set MinBytesPerSecond to zero in WebLimits (this certainly helps performance, because IIS otherwise chooses to close keep-alive connections from clients whose transfers are slow or small).
    2. Allocate more worker processes to the application pool; I have set it to 8. Do this only if your server mainly distributes large files: it will make other sites on the box slower, but it ensures better delivery. We set it to 8 because this server hosts only one website and it just delivers huge files.
    3. Turn off app pool recycling.
    4. Turn off sessions.
    5. Leave buffering on.
    6. Before each of the following steps, check whether Response.IsClientConnected is true; if not, give up and don't send anything.
    7. Set Content-Length before sending the file.
    8. Flush the response.
    9. Write to the output stream and flush at regular intervals (steps 6 through 9 are sketched in the code after this list).
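
    A minimal sketch of steps 6 through 9 as a single method; the SendFile name, the application/octet-stream content type, and the 256 KB chunk size are assumptions, not something the answer prescribes (requires System.IO and System.Web):

        // Hypothetical helper: check the client, set Content-Length, flush,
        // then write the file and flush at regular intervals.
        // Buffering is left on (step 5) by default.
        private static void SendFile(HttpContext context, string path)
        {
            var info = new FileInfo(path);
            context.Response.ContentType = "application/octet-stream";

            if (!context.Response.IsClientConnected) return;                               // step 6
            context.Response.AddHeader("Content-Length",
                info.Length.ToString(System.Globalization.CultureInfo.InvariantCulture));  // step 7
            context.Response.Flush();                                                      // step 8

            byte[] buffer = new byte[256 * 1024]; // assumed chunk size
            using (FileStream source = info.OpenRead())
            {
                int read;
                while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                {
                    if (!context.Response.IsClientConnected) return;                       // step 6 again
                    context.Response.OutputStream.Write(buffer, 0, read);
                    context.Response.Flush();                                              // step 9
                }
            }
        }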
  • 2020-11-28 03:46

    When you set the Content-Length with BufferOutput set to false, a possible reason for the failures is that IIS tries to gzip the file you send; because Content-Length is already set, IIS cannot change it to match the compressed size, and the errors start (*).

    So keep BufferOutput set to false, and also disable IIS gzip compression for the files you send; or disable IIS gzip for all files and handle the gzip part programmatically, keeping the files you send out of the gzip path (a sketch follows).
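
    A minimal sketch of the "handle gzip programmatically" idea, assuming a Global.asax event handler and System.IO.Compression; the /download prefix used to exclude the big-file handler is a placeholder:

        // In Global.asax: compress responses ourselves, except for the big-file
        // downloads, so IIS gzip can stay disabled and their Content-Length stays valid.
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            HttpContext context = HttpContext.Current;

            // Placeholder: skip the handler that streams big files.
            if (context.Request.Path.StartsWith("/download", StringComparison.OrdinalIgnoreCase))
                return;

            string acceptEncoding = context.Request.Headers["Accept-Encoding"] ?? string.Empty;
            if (acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) >= 0)
            {
                context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
                context.Response.AppendHeader("Content-Encoding", "gzip");
            }
        }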

    Some similar questions with the same cause:

    ASP.NET site sometimes freezing up and/or showing odd text at top of the page while loading, on load balanced servers

    HTTP Compression: Some external scripts/CSS not decompressing properly some of the time

    (*) Why not change it again? Because once you have set a header you cannot take it back, unless you have enabled that option in IIS and the headers have not already been sent to the browser.

    Follow-up

    If gzip is not the issue, the next thing that comes to mind is that the file is being sent but the connection gets delayed for some reason, hits a timeout, and is closed. So you get the "Remote Host Closed The Connection" error.

    This can be solved depending on the cause:

    1. The client really closed the connection.
    2. The timeout comes from the page itself, if you use a handler (although in that case the message would probably be "Page Timed Out").
    3. The timeout comes from idle waiting: the page takes longer than the execution timeout, times out, and the connection is closed. Here too the message would probably be "Page Timed Out".
    4. The application pool recycles at the moment you are sending the file. Disable all pool recycling! This is the most likely cause I can think of right now.

    If it is coming from IIS, go to the web site properties and make sure you set the largest possible "Connection Timeout" and enable "HTTP Keep-Alives".

    The page timeout can be changed in web.config (and you can scope the change to one specific page):

    <httpRuntime executionTimeout="43200" />
    

    Also have a look at: http://weblogs.asp.net/aghausman/archive/2009/02/20/prevent-request-timeout-in-asp-net.aspx

    Session lock

    One more thing to check: do not use session state in the handler that sends the file, because the session lock serializes requests until each one finishes; if one user takes a long time to download a file, a second request from the same session may time out. (A sketch of a session-free handler follows.)
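
    For illustration, a minimal sketch of a session-free handler; the class name is made up. A handler only gets session access if it implements IRequiresSessionState, so leaving that interface off (or using IReadOnlySessionState) avoids the exclusive session lock:

        // Hypothetical download handler that avoids the exclusive session lock.
        // It does NOT implement IRequiresSessionState, so ASP.NET will not serialize
        // concurrent requests from the same session against it.
        public class BigFileHandler : IHttpHandler // add IReadOnlySessionState for read-only session access
        {
            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                // ... stream the file here, as shown in the other answers ...
            }
        }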

    Some related questions:

    call aspx page to return an image randomly slow

    Replacing ASP.Net's session entirely

    Response.WriteFile function fails and gives 504 gateway time-out
