ASP.NET (MVC) Outputcache and concurrent requests

Submitted by ぐ巨炮叔叔 on 2019-12-05 18:53:26

Question


Let's say that, theoretically, I have a page / controller action on my website that does some very heavy work. It takes about 10 seconds to complete its operation.

Now, I use .NET's OutputCache mechanism to cache it for 15 minutes (for example, I use [OutputCache(Duration = 900)]). What happens if, after 15 minutes, the cache has expired and 100 users request the page again within the 10 seconds it takes to do the heavy processing?

  1. The heavy stuff is done only the first time, and there is some locking mechanism so that the other 99 users get the cached result
  2. The heavy stuff is done 100 times (and the server is crippled, as this can take up to 100 * 10 seconds)

Easy question maybe, but I'm not 100% sure. I hope it is number one, though :-)

Thanks!


Answer 1:


Well, it depends on how you have IIS configured. If you have fewer than 100 worker threads (say, 50), then the "heavy stuff" is done 50 times, crippling your server, and the remaining 50 requests will be served from the cache.

But no, there is no "locking mechanism" on a cached action result; that would be counterproductive, for the most part.

Edit: I believe this to be true, but Nick's tests say otherwise, and I don't have time to test now. Try it yourself! The rest of the answer is not dependent on the above, though, and I think it's more important.

Generally speaking, however, no web request, cached or otherwise, should take 10 seconds to return. If I were in your shoes, I would look at somehow pre-computing the hard part of the request. You can still cache the action result if you want to cache the HTML, but it sounds like your problem is somewhat bigger than that.

You might also want to consider asynchronous controllers. Finally, note that although IIS and ASP.NET MVC will not lock on this heavy computation, you could. If you use asynchronous controllers combined with a lock on the computation, then you would get effectively the behavior you're asking for. I can't really say if that's the best solution without knowing more about what you're doing.
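
For illustration, here is a minimal sketch of the "lock on the computation" idea, using a plain lock and a double-checked refresh instead of an asynchronous controller for brevity. The controller and method names are invented for the example; only the first request after expiry does the heavy work, the other requests wait on the lock and reuse the fresh value, and [OutputCache] serves everyone else for the next 15 minutes.

using System;
using System.Web.Mvc;

public class HeavyController : Controller
{
    private static readonly object ComputeLock = new object();
    private static string _result;
    private static DateTime _computedAt;

    [OutputCache(Duration = 900)]
    public ActionResult Index()
    {
        // Double-checked refresh: after expiry, only one request recomputes;
        // the others wait on the lock and then reuse the fresh value.
        if (_result == null || DateTime.UtcNow - _computedAt > TimeSpan.FromMinutes(15))
        {
            lock (ComputeLock)
            {
                if (_result == null || DateTime.UtcNow - _computedAt > TimeSpan.FromMinutes(15))
                {
                    _result = ComputeReport();          // the ~10 second operation
                    _computedAt = DateTime.UtcNow;
                }
            }
        }
        return Content(_result);
    }

    private static string ComputeReport()
    {
        System.Threading.Thread.Sleep(10000);           // stand-in for the heavy work
        return DateTime.UtcNow.ToString("o");
    }
}

The lock keeps only one worker busy with the recomputation, at the cost of briefly blocking the waiters; that blocking is exactly what the asynchronous-controller half of the suggestion would relieve.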




Answer 2:


It seems to lock here, doing a simple test:

<%@ OutputCache Duration="10" VaryByParam="*" %>

protected void Page_Load(object sender, EventArgs e)
{
    // Simulate a slow request with a random delay of 1-30 seconds.
    System.Threading.Thread.Sleep(new Random().Next(1000, 30000));
}

The first request hits a breakpoint there, even though it's left sleeping... no other request hits a breakpoint in the Page_Load method... they wait for the first one to complete, and that result is returned to everyone who requested the page.

Note: this was simpler to test in a WebForms scenario, but since output caching is a shared aspect of both frameworks, you can run the same test in MVC with the same result.

Here's an alternative way to test:

<asp:Literal ID="litCount" runat="server" />

public static int Count = 0;

protected void Page_Load(object sender, EventArgs e)
{
  // Every request that actually executes Page_Load gets a new count;
  // requests served from the output cache see the same value.
  litCount.Text = Count++.ToString();
  System.Threading.Thread.Sleep(10000);
}

All pages queued up while the first request goes to sleep will have the same count output.




Answer 3:


Old question, but I ran into this problem and did some investigation.

Example code:

public static int Count;

[OutputCache(Duration = 20, VaryByParam = "*")]
public ActionResult Test()
{
    // Simulate 4 seconds of work, then return (and increment) the hit counter.
    System.Threading.Thread.Sleep(4000);
    return Content((Count++).ToString());
}

Run it in one browser, and it seems to lock and wait.

Run it in different browsers (I tested in IE and Firefox) and the requests are not put on hold.

So the "correct" behaviour has more to do with which browser you are using than with anything IIS is doing.

Edit: To clarify - no lock. The server is hit by every request that manages to get in before the first result is cached, which can mean a hard hit on the server for heavy requests. (Or, if you call an external system, that system could be brought down if your server passes on many requests...)
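
If you want to reproduce the "different browsers" case from a single machine, a rough (hypothetical) console probe like the one below fires several requests in parallel without sharing cookies, so they are not treated as one client; the URL is a placeholder for wherever the Test() action above is hosted.

using System;
using System.Linq;
using System.Net;
using System.Threading.Tasks;

class ConcurrencyProbe
{
    static void Main()
    {
        // Placeholder URL - point this at the Test() action above.
        var url = "http://localhost:12345/Home/Test";

        // Each WebClient has no cookie container, so there is no shared session
        // cookie to serialize the requests; this mimics "different browsers".
        var tasks = Enumerable.Range(0, 5)
            .Select(_ => Task.Run(() =>
            {
                using (var client = new WebClient())
                {
                    return client.DownloadString(url);
                }
            }))
            .ToArray();

        Task.WaitAll(tasks);

        // Different outputs mean each request executed the action;
        // identical outputs mean they shared one cached result.
        foreach (var t in tasks)
            Console.WriteLine(t.Result);
    }
}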




Answer 4:


I made a small test that might help. I believe what I've discovered is that the uncached requests do not block, and each request that comes in while the cache is expired and before the task has completed ALSO triggers that task.

For example, the code below takes about 6-9 seconds on my system using Cassini. If you send two requests approximately 2 seconds apart (i.e., in two browser tabs), both will receive unique results. The last request to finish also provides the response that gets cached for subsequent requests.

// CachedController.cs
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;

namespace HttpCacheTest.Controllers
{
    public class CachedController : Controller
    {
        //
        // GET: /Cached/

        [OutputCache(Duration=20, VaryByParam="*")]
        public ActionResult Index()
        {
            var start = DateTime.Now;

            // Busy-wait to simulate several seconds of CPU-bound work.
            var i = Int32.MaxValue;
            while (i > 0)
            {
                i--;
            }
            var end = DateTime.Now;

            return Content( end.Subtract(start).ToString() );
        }

    }
}



Answer 5:


You should check this information here: "You have a single client making multiple concurrent requests to the server. The default behavior is that these requests will be serialized;"

So, if the concurrent requests from a single client are serialized, the subsequent requests will use the cache. That explains some of the behavior seen in the answers above (@mats-nilsson and @nick-craver).

The context you showed us is multiple users hitting your server at the same time; your server will be busy until it has completed at least one request and created the output cache, which is then used for the subsequent requests. So if you want to serialize multiple users requesting the same resource, you first need to understand how serialized requests work for a single user. Is that what you want?
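
As a hedged illustration of that per-client serialization: in ASP.NET it normally comes from the exclusive session-state lock, so if the cached action never writes to Session, marking the controller's session state as read-only (or disabled) lets concurrent requests from the same client run in parallel instead of queuing. The controller name below is made up for the example.

using System;
using System.Web.Mvc;
using System.Web.SessionState;

// Illustrative controller name; the SessionState attribute is the point here.
[SessionState(SessionStateBehavior.ReadOnly)]   // or SessionStateBehavior.Disabled
public class CachedReportController : Controller
{
    [OutputCache(Duration = 900, VaryByParam = "*")]
    public ActionResult Index()
    {
        System.Threading.Thread.Sleep(10000);   // stand-in for the heavy work
        return Content(DateTime.UtcNow.ToString("o"));
    }
}

This only helps the single-client case the quote is about; it does not add any locking around the heavy work itself.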



Source: https://stackoverflow.com/questions/2162433/asp-net-mvc-outputcache-and-concurrent-requests
