ASP.NET (MVC) Outputcache and concurrent requests


Well, it depends upon how you have IIS configured. If you have less than 100 worker threads (let's say, 50), then the "heavy stuff" is done 50 times, crippling your server, and then the remaining 50 requests will be served from cache.

But no, there is no "locking mechanism" on a cached action result; that would be counterproductive, for the most part.

Edit: I believe this to be true, but Nick's tests say otherwise, and I don't have time to test now. Try it yourself! The rest of the answer is not dependent on the above, though, and I think it's more important.

Generally speaking, however, no web request, cached or otherwise, should take 10 seconds to return. If I were in your shoes, I would look at somehow pre-computing the hard part of the request. You can still cache the action result if you want to cache the HTML, but it sounds like your problem is somewhat bigger than that.
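
For example (just a hedged sketch of that idea; ReportCache, Refresh, and BuildReport are made-up names, and how you trigger Refresh - a timer, a scheduled job, application start - is up to you):

using System;

// The expensive work is done ahead of time; the action only reads the last
// pre-computed value, so no web request ever pays the 10-second cost.
public static class ReportCache
{
    private static volatile string _report = "not ready yet";

    public static string Current
    {
        get { return _report; }
    }

    // Call this from a timer or background job, not from the request path.
    public static void Refresh()
    {
        _report = BuildReport();
    }

    private static string BuildReport()
    {
        System.Threading.Thread.Sleep(10000); // stand-in for the heavy work
        return "computed at " + DateTime.Now;
    }
}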

You might also want to consider asynchronous controllers. Finally, note that although IIS and ASP.NET MVC will not lock on this heavy computation, you could. If you use asynchronous controllers combined with a lock on the computation, then you would get effectively the behavior you're asking for. I can't really say if that's the best solution without knowing more about what you're doing.
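
Here is a minimal sketch of the lock idea (kept synchronous for brevity; the controller name and fields are invented, and the Sleep stands in for the heavy work):

using System;
using System.Web.Mvc;

public class HeavyController : Controller
{
    private static readonly object _computeLock = new object();
    private static string _result;
    private static DateTime _computedAt = DateTime.MinValue;

    [OutputCache(Duration = 20, VaryByParam = "*")]
    public ActionResult Index()
    {
        lock (_computeLock)
        {
            // Only the first cache-miss request inside the lock redoes the
            // heavy work; any requests queued behind it reuse the result.
            if (DateTime.Now - _computedAt > TimeSpan.FromSeconds(20))
            {
                System.Threading.Thread.Sleep(10000); // stand-in for the heavy work
                _result = DateTime.Now.ToString();
                _computedAt = DateTime.Now;
            }
            return Content(_result);
        }
    }
}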

It seems to lock here, doing a simple test:

<%-- In the .aspx page: cache the rendered output for 10 seconds --%>
<%@ OutputCache Duration="10" VaryByParam="*" %>

// In the code-behind: simulate 1-30 seconds of slow work
protected void Page_Load(object sender, EventArgs e)
{
    System.Threading.Thread.Sleep(new Random().Next(1000, 30000));
}

The first request hits a breakpoint there and is left sleeping; no other request hits a breakpoint in the Page_Load method. They all wait for the first one to complete, and that result is returned to everyone who requested that page.

Note: this was simpler to test in a webforms scenario, but given this is a shared aspect of the frameworks, you can do the same test in MVC with the same result.

Here's an alternative way to test:

<asp:Literal ID="litCount" runat="server" />

public static int Count = 0;

protected void Page_Load(object sender, EventArgs e)
{
  litCount.Text = Count++.ToString();
  System.Threading.Thread.Sleep(10000);
}

All pages queued up while the first request goes to sleep will have the same count output.

Old question, but I ran into this problem and did some investigation.

Example code:

public static int Count;

[OutputCache(Duration = 20, VaryByParam = "*")]
public ActionResult Test()
{
    // Simulate slow work, then return (and increment) a counter so you can
    // see whether the response was generated fresh or served from the cache.
    System.Threading.Thread.Sleep(4000);
    return Content((Count++).ToString());   // Content() needs a string
}

Run it in one browser, and it seems to lock and wait.

Run it in different browsers (I tested in IE and Firefox) and the requests are not put on hold.

So the "correct" behaviour has more to do with which browser you are using than with what IIS does.

Edit: To clarify - there is no lock. The server is hit by every request that manages to get in before the first result is cached, which can mean a hard hit on the server for heavy requests. (Or, if you call an external system, that system could be brought down if your server passes on many requests...)

I made a small test that might help. I believe what I've discovered is that the uncached requests do not block, and each request that comes in while the cache is expired and before the task is completed also triggers that task.

For example, the code below takes about 6-9 seconds on my system using Cassini. If you send two requests, approximately 2 seconds apart (i.e. two browser tabs), both will receive unique results. The last request to finish is also the response that gets cached for subsequent requests.

// CachedController.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;

namespace HttpCacheTest.Controllers
{
    public class CachedController : Controller
    {
        //
        // GET: /Cached/

        [OutputCache(Duration=20, VaryByParam="*")]
        public ActionResult Index()
        {
            var start = DateTime.Now;

            // Busy-wait for a few seconds to stand in for heavy work
            var i = Int32.MaxValue;
            while (i > 0)
            {
                i--;
            }
            var end = DateTime.Now;

            return Content( end.Subtract(start).ToString() );
        }

    }
}

You should check this information here: "You have a single client making multiple concurrent requests to the server. The default behavior is that these requests will be serialized;"

So, if concurrent requests from a single client are serialized, the subsequent requests will use the cache. That explains some of the behavior seen in the answers above (@mats-nilsson and @nick-craver).

The context you showed us is multiple users hitting your server at the same time; the server stays busy until it has completed at least one request and created the output cache, which it then uses for the following requests. So if you want to serialize multiple users requesting the same resource, you first need to understand how serialized requests work for a single user. Is that what you want?
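
If the per-client serialization comes from session state, one thing you can try (a hedged sketch assuming MVC 3 or later; the controller below is made up) is opting the controller out of session, so concurrent requests from the same browser are no longer queued behind each other:

using System.Web.Mvc;
using System.Web.SessionState;

// With session state disabled for this controller, two requests from the same
// browser session are not serialized by the session lock, so you can observe
// the raw OutputCache behavior described in the other answers.
[SessionState(SessionStateBehavior.Disabled)]
public class NoSessionCachedController : Controller
{
    [OutputCache(Duration = 20, VaryByParam = "*")]
    public ActionResult Index()
    {
        System.Threading.Thread.Sleep(4000); // stand-in for the heavy work
        return Content(System.DateTime.Now.ToString());
    }
}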
