Cache object with ObjectCache in .Net with expiry time

死守一世寂寞 2021-02-20 05:25

I am stuck in a scenario. My code is like below:

Update: This is not about how to use the data cache; I am already using it and it is working. It is about extending it so

4 Answers
  • 2021-02-20 05:48

    Use Double-checked locking pattern:

    var cachedItem = (string)this.GetDataFromCache(cache, cacheKey);
    if (String.IsNullOrEmpty(cachedItem)) { // if no cache yet, or it is expired
       lock (_lock) { // we lock only in this case
          // check once more: another thread might have put the item in the cache already
          cachedItem = (string)this.GetDataFromCache(cache, cacheKey);
          if (String.IsNullOrEmpty(cachedItem)) {
              cachedItem = GetData(); // placeholder for the real call; takes ~100ms
              SetDataIntoCache(cache, cacheKey, cachedItem, DateTime.Now.AddMilliseconds(500));
          }
       }
    }
    

    This way, while there is an item in your cache (so, not expired yet), all requests will be completed without locking. But if there is no cache entry yet, or it expired - only one thread will get data and put it into the cache. Make sure you understand that pattern, because there are some caveats while implementing it in .NET.

    As noted in the comments, it is not necessary to use one "global" lock object to protect every cache access. Suppose you have two methods in your code, and each of those methods caches an object under its own cache key (but still in the same cache). Then you should use two separate lock objects, because with a single "global" lock object, calls to one method would unnecessarily block calls to the other, even though they never work with the same cache keys.
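    As a concrete sketch of that point (the KeyedLocks helper below is an assumption for illustration, not part of the original code): a small registry can hand out one lock object per cache key, so misses on unrelated keys never serialize each other.

    ```csharp
    using System.Collections.Concurrent;

    // Hypothetical helper: one lock object per cache key.
    public static class KeyedLocks
    {
        private static readonly ConcurrentDictionary<string, object> locks =
            new ConcurrentDictionary<string, object>();

        // Always returns the same lock object for the same key.
        public static object For(string key)
        {
            return locks.GetOrAdd(key, _ => new object());
        }
    }
    ```

    You would then write `lock (KeyedLocks.For(cacheKey)) { ... }` instead of locking a single shared `_lock` across all keys.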

  • 2021-02-20 05:50

    I have adapted the solution from Micro Caching in .NET for use with the System.Runtime.Caching.ObjectCache for MvcSiteMapProvider. The full implementation has an ICacheProvider interface that allows swapping between System.Runtime.Caching and System.Web.Caching, but this is a cut-down version that should meet your needs.

    The most compelling feature of this pattern is that it uses a lightweight version of a lazy lock to ensure that the data is loaded from the data source only once after the cache expires, no matter how many concurrent threads are attempting to load it.

    using System;
    using System.Runtime.Caching;
    using System.Threading;
    
    public interface IMicroCache<T>
    {
        bool Contains(string key);
        T GetOrAdd(string key, Func<T> loadFunction, Func<CacheItemPolicy> getCacheItemPolicyFunction);
        void Remove(string key);
    }
    
    public class MicroCache<T> : IMicroCache<T>
    {
        public MicroCache(ObjectCache objectCache)
        {
            if (objectCache == null)
                throw new ArgumentNullException("objectCache");
    
            this.cache = objectCache;
        }
        private readonly ObjectCache cache;
        private ReaderWriterLockSlim synclock = new ReaderWriterLockSlim(LockRecursionPolicy.NoRecursion);
    
        public bool Contains(string key)
        {
            synclock.EnterReadLock();
            try
            {
                return this.cache.Contains(key);
            }
            finally
            {
                synclock.ExitReadLock();
            }
        }
    
        public T GetOrAdd(string key, Func<T> loadFunction, Func<CacheItemPolicy> getCacheItemPolicyFunction)
        {
            LazyLock<T> lazy;
            bool success;
    
            synclock.EnterReadLock();
            try
            {
                success = this.TryGetValue(key, out lazy);
            }
            finally
            {
                synclock.ExitReadLock();
            }
    
            if (!success)
            {
                synclock.EnterWriteLock();
                try
                {
                    if (!this.TryGetValue(key, out lazy))
                    {
                        lazy = new LazyLock<T>();
                        var policy = getCacheItemPolicyFunction();
                        this.cache.Add(key, lazy, policy);
                    }
                }
                finally
                {
                    synclock.ExitWriteLock();
                }
            }
    
            return lazy.Get(loadFunction);
        }
    
        public void Remove(string key)
        {
            synclock.EnterWriteLock();
            try
            {
                this.cache.Remove(key);
            }
            finally
            {
                synclock.ExitWriteLock();
            }
        }
    
    
        private bool TryGetValue(string key, out LazyLock<T> value)
        {
            value = (LazyLock<T>)this.cache.Get(key);
            if (value != null)
            {
                return true;
            }
            return false;
        }
    
        private sealed class LazyLock<T>
        {
            private volatile bool got;
            private T value;
    
            public T Get(Func<T> activator)
            {
                if (!got)
                {
                    if (activator == null)
                    {
                        return default(T);
                    }
    
                    lock (this)
                    {
                        if (!got)
                        {
                            value = activator();
    
                            got = true;
                        }
                    }
                }
    
                return value;
            }
        }
    }
    

    Usage

    // Load the cache as a static singleton so all of the threads
    // use the same instance.
    private static IMicroCache<string> stringCache = 
        new MicroCache<string>(System.Runtime.Caching.MemoryCache.Default);
    
    public string GetData(string key)
    {
        return stringCache.GetOrAdd(
            key,
            () => LoadData(key),
            () => LoadCacheItemPolicy(key));
    }
    
    private string LoadData(string key)
    {
        // Load data from persistent source here
    
        return "some loaded string";
    }
    
    private CacheItemPolicy LoadCacheItemPolicy(string key)
    {
        var policy = new CacheItemPolicy();
    
        // NotRemovable prevents the entry from being evicted under
        // memory pressure; note that an in-memory cache still does
        // not survive application pool restarts in ASP.NET/MVC
        policy.Priority = CacheItemPriority.NotRemovable;
    
        policy.AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(1);
    
        // Load Dependencies
        // policy.ChangeMonitors.Add(new HostFileChangeMonitor(new string[] { fileName }));
    
        return policy;
    }
    

    NOTE: As was previously mentioned, you probably gain nothing by caching a value that takes 100ms to retrieve for only 500ms. You should most likely choose a longer period to hold items in the cache. Is the data in the source really so volatile that it could change that quickly? If so, maybe you should look at using a ChangeMonitor to invalidate stale data, so you don't spend so much CPU time reloading the cache. Then you can measure the cache time in minutes instead of milliseconds.
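    A sketch of that suggestion, assuming a minutes-scale lifetime is acceptable (the BuildPolicy helper and the file path are hypothetical, not from the original answer):

    ```csharp
    using System;
    using System.Runtime.Caching;

    // Build a policy with a longer lifetime; optionally let a change
    // monitor invalidate the entry when the source changes, instead of
    // relying on a very short timer.
    static CacheItemPolicy BuildPolicy(int minutes)
    {
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(minutes)
        };
        // To invalidate on change rather than by timer (the monitored
        // file must exist, and the path here is only an example):
        // policy.ChangeMonitors.Add(new HostFileChangeMonitor(new[] { @"C:\data\source.json" }));
        return policy;
    }
    ```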

  • 2021-02-20 06:09

    By the way, 500 milliseconds is too short a time to cache; you will burn a lot of CPU cycles just adding and removing cache entries, and entries will expire before any other request can benefit from them. You should profile your code to see whether caching actually helps.

    Remember, a cache involves a lot of code for locking, hashing, and moving data around, which costs a good number of CPU cycles; and although each cycle is cheap, on a multi-threaded, multi-connection server the CPU has plenty of other things to do.

    Original Answer https://stackoverflow.com/a/16446943/85597

    private string GetDataFromCache(
                ObjectCache cache, 
                string key, 
                Func<string> valueFactory)
    {
        var newValue = new Lazy<string>(valueFactory);            
    
        //The line below returns existing item or adds 
        // the new value if it doesn't exist
        var value = cache.AddOrGetExisting(key, newValue, DateTimeOffset.Now.AddMilliseconds(500)) as Lazy<string>;
        // Lazy<T> handles the locking itself
        return (value ?? newValue).Value;
    }
    
    
    // usage...
    // (note: "object" is a reserved keyword in C#, so the result
    // needs a valid variable name)

    var cachedData = this.GetDataFromCache(cache, cacheKey, () => {

          // get the data...

          // this factory will be called only once;
          // Lazy<T> automatically does the necessary locking
          return data;
    });
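    One caveat worth knowing with this approach (an addition, not from the original answer): with the default LazyThreadSafetyMode.ExecutionAndPublication used by `new Lazy<T>(valueFactory)`, an exception thrown by the factory is cached too, so a transient failure is rethrown to every caller until the cache entry expires:

    ```csharp
    using System;

    int calls = 0;
    var lazy = new Lazy<int>(() =>
    {
        calls++;
        throw new InvalidOperationException("transient failure");
    });

    // Both accesses throw, but the factory only ran the first time;
    // the cached exception is rethrown on the second access.
    try { _ = lazy.Value; } catch (InvalidOperationException) { }
    try { _ = lazy.Value; } catch (InvalidOperationException) { }

    Console.WriteLine(calls); // prints 1
    ```

    If that matters for your data source, consider evicting the cache entry when the factory fails, so the next request retries.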
    
  • 2021-02-20 06:10

    You will have to use locking to make sure a request is not sent while the cache has expired and another thread is already fetching the data from the remote/slow service. It will look something like this (there are better implementations out there that are easier to use, but they require separate classes):

    private static readonly object _Lock = new object();
    
    ...
    
    var cachedItem = (string)this.GetDataFromCache(cache, cacheKey);
    
    if (String.IsNullOrEmpty(cachedItem))
    {
       lock (_Lock)
       {
            cachedItem = (string)this.GetDataFromCache(cache, cacheKey);
            if (String.IsNullOrEmpty(cachedItem))
            {
               cachedItem = GetData(); // placeholder for the real call; takes ~100ms
               SetDataIntoCache(cache, cacheKey, cachedItem, DateTime.Now.AddMilliseconds(500));
            }
       }
    }
    
    return cachedItem;
    

    Also, make sure your service doesn't return null, because the code above treats null as "no cache entry" and will hit the data source on every request. That is why more advanced implementations typically use something like a CacheObject wrapper, which supports storing null values.
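    A minimal sketch of that wrapper idea (only the CacheObject name comes from the answer; this implementation is an assumption): storing a non-null wrapper lets the cache distinguish "the source returned null" (wrapper present) from "nothing cached yet" (no entry).

    ```csharp
    // Hypothetical wrapper: the wrapper itself is never null, so it can
    // always be stored in the cache, even when the wrapped value is null.
    public sealed class CacheObject<T>
    {
        public CacheObject(T value) { Value = value; }
        public T Value { get; }
    }
    ```

    With this, `cache.Get(key)` returning null means "not cached", while a non-null `CacheObject<string>` with a null `Value` means "cached, and the source really returned null".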
