I have created an async cache that uses .NET MemoryCache underneath.
This is the signature of the method in question:
public async Task<T> GetAsync<T>(string key, Func<Task<T>> populator, TimeSpan expire, object parameters)
A simple solution would be to use SemaphoreSlim.WaitAsync() instead of a lock; that way you get around the issue of awaiting inside a lock. All other methods of MemoryCache are thread-safe, though.
private readonly SemaphoreSlim semaphoreSlim = new SemaphoreSlim(1);

public async Task<T> GetAsync<T>(
    string key, Func<Task<T>> populator, TimeSpan expire, object parameters)
{
    if (parameters != null)
        key += JsonConvert.SerializeObject(parameters);

    if (!_cache.Contains(key))
    {
        // Note: the single semaphore serializes the misses for *all* keys,
        // so one slow populator delays every other caller that misses the cache.
        await semaphoreSlim.WaitAsync();
        try
        {
            if (!_cache.Contains(key))
            {
                var data = await populator();
                _cache.Add(key, data, DateTimeOffset.Now.Add(expire));
            }
        }
        finally
        {
            semaphoreSlim.Release();
        }
    }

    return (T)_cache.Get(key);
}
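For illustration, a call site could look like the sketch below; LoadUserAsync (returning a Task<User>) and the anonymous parameters object are made up and not part of the original answer:

// Concurrent callers for the same key wait on the semaphore while the first one
// runs the populator; afterwards everyone reads the value from the cache.
var user = await GetAsync("user", () => LoadUserAsync(42), TimeSpan.FromMinutes(5), new { id = 42 });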
Although there is already an accepted answer, I'll post a new one using a Lazy<T> approach. The idea is to minimize the duration of the lock block: if the key doesn't exist in the cache, put a Lazy<T> into the cache. That way, all threads requesting the same key at the same time will be awaiting the same Lazy<T>'s value.
public Task<T> GetAsync<T>(string key, Func<Task<T>> populator, TimeSpan expire, object parameters)
{
    if (parameters != null)
        key += JsonConvert.SerializeObject(parameters);

    lock (_cache)
    {
        if (!_cache.Contains(key))
        {
            var lazy = new Lazy<Task<T>>(populator, true);
            _cache.Add(key, lazy, DateTimeOffset.Now.Add(expire));
        }
    }

    return ((Lazy<Task<T>>)_cache.Get(key)).Value;
}
Version 2

public Task<T> GetAsync<T>(string key, Func<Task<T>> populator, TimeSpan expire, object parameters)
{
    if (parameters != null)
        key += JsonConvert.SerializeObject(parameters);

    var lazy = (Lazy<Task<T>>)_cache.Get(key);
    if (lazy != null) return lazy.Value;

    lock (_cache)
    {
        if (!_cache.Contains(key))
        {
            lazy = new Lazy<Task<T>>(populator, true);
            _cache.Add(key, lazy, DateTimeOffset.Now.Add(expire));
            return lazy.Value;
        }
        return ((Lazy<Task<T>>)_cache.Get(key)).Value;
    }
}
Version 3

public Task<T> GetAsync<T>(string key, Func<Task<T>> populator, TimeSpan expire, object parameters)
{
    if (parameters != null)
        key += JsonConvert.SerializeObject(parameters);

    var task = (Task<T>)_cache.Get(key);
    if (task != null) return task;

    var value = populator();
    return (Task<T>)_cache.AddOrGetExisting(key, value, DateTimeOffset.Now.Add(expire)) ?? value;
}
This is an attempted improvement on Eser's answer (Version 2). The Lazy<T> class is thread-safe by default, so the lock can be removed. It is possible that multiple Lazy<T> objects will be created for a given key, but only one of them will have its Value property queried, starting the heavy Task. The others will remain unused, fall out of scope, and be garbage-collected soon.
The first overload is the flexible and generic one, and accepts a Func<CacheItemPolicy>
argument. I included two more overloads for the most common cases of absolute and sliding expiration. Many more overloads could be added for convenience.
using System.Runtime.Caching;

static partial class MemoryCacheExtensions
{
    public static Task<T> GetOrCreateLazyAsync<T>(this MemoryCache cache, string key,
        Func<Task<T>> valueFactory, Func<CacheItemPolicy> cacheItemPolicyFactory)
    {
        var lazyTask = (Lazy<Task<T>>)cache.Get(key);
        if (lazyTask != null) return lazyTask.Value.ToAsyncConditional();

        lazyTask = new Lazy<Task<T>>(valueFactory);
        var cacheItem = new CacheItem(key, lazyTask);
        var cacheItemPolicy = cacheItemPolicyFactory?.Invoke();
        var existingCacheItem = cache.AddOrGetExisting(cacheItem, cacheItemPolicy);
        return ((Lazy<Task<T>>)(existingCacheItem?.Value ?? cacheItem.Value)).Value
            .ToAsyncConditional();
    }

    public static Task<T> GetOrCreateLazyAsync<T>(this MemoryCache cache, string key,
        Func<Task<T>> valueFactory, DateTimeOffset absoluteExpiration)
    {
        return cache.GetOrCreateLazyAsync(key, valueFactory, () => new CacheItemPolicy()
        {
            AbsoluteExpiration = absoluteExpiration,
        });
    }

    public static Task<T> GetOrCreateLazyAsync<T>(this MemoryCache cache, string key,
        Func<Task<T>> valueFactory, TimeSpan slidingExpiration)
    {
        return cache.GetOrCreateLazyAsync(key, valueFactory, () => new CacheItemPolicy()
        {
            SlidingExpiration = slidingExpiration,
        });
    }

    private static Task<TResult> ToAsyncConditional<TResult>(this Task<TResult> task)
    {
        if (task.IsCompleted) return task;
        return task.ContinueWith(async t => await t,
            default, TaskContinuationOptions.RunContinuationsAsynchronously,
            TaskScheduler.Default).Unwrap();
    }
}
Usage example:
string html = await MemoryCache.Default.GetOrCreateLazyAsync("MyKey", async () =>
{
    return await new WebClient().DownloadStringTaskAsync("https://stackoverflow.com");
}, DateTimeOffset.Now.AddMinutes(10));
The HTML of this site is downloaded and cached for 10 minutes. Multiple concurrent requests will await
the same task to complete.
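As a sketch that is not in the original answer (FetchAsync stands for any Func<Task<string>>), two concurrent callers observe the result of a single download:

// Both calls end up evaluating the same cached Lazy<Task<string>>,
// so the download is started only once.
var t1 = MemoryCache.Default.GetOrCreateLazyAsync<string>("MyKey", FetchAsync, DateTimeOffset.Now.AddMinutes(10));
var t2 = MemoryCache.Default.GetOrCreateLazyAsync<string>("MyKey", FetchAsync, DateTimeOffset.Now.AddMinutes(10));
string[] results = await Task.WhenAll(t1, t2);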
The System.Runtime.Caching.MemoryCache class is easy to use, but has limited support for prioritizing the cache entries. Basically there are only two options, Default and NotRemovable, meaning it's hardly adequate for advanced scenarios. The newer Microsoft.Extensions.Caching.Memory.MemoryCache class (from this package) offers more options regarding cache priorities (Low, Normal, High and NeverRemove), but otherwise it is less intuitive and more cumbersome to use. It offers async capabilities, but not lazy ones. So here are the LazyAsync equivalent extensions for this class (a short priority-setting sketch follows the usage example further below):
using Microsoft.Extensions.Caching.Memory;

static partial class MemoryCacheExtensions
{
    public static Task<T> GetOrCreateLazyAsync<T>(this MemoryCache cache, object key,
        Func<ICacheEntry, Task<T>> factory)
    {
        return cache.GetOrCreate(key, e =>
        {
            return new Lazy<Task<T>>(() => factory(e));
        }).Value.ToAsyncConditional();
    }

    public static Task<T> GetOrCreateLazyAsync<T>(this MemoryCache cache, object key,
        Func<Task<T>> valueFactory, DateTimeOffset absoluteExpiration)
    {
        return cache.GetOrCreateLazyAsync(key, e =>
        {
            e.AbsoluteExpiration = absoluteExpiration;
            return valueFactory();
        });
    }

    public static Task<T> GetOrCreateLazyAsync<T>(this MemoryCache cache, object key,
        Func<Task<T>> valueFactory, TimeSpan slidingExpiration)
    {
        return cache.GetOrCreateLazyAsync(key, e =>
        {
            e.SlidingExpiration = slidingExpiration;
            return valueFactory();
        });
    }
}
Usage example:
var cache = new MemoryCache(new MemoryCacheOptions());

string html = await cache.GetOrCreateLazyAsync("MyKey", async () =>
{
    return await new WebClient().DownloadStringTaskAsync("https://stackoverflow.com");
}, DateTimeOffset.Now.AddMinutes(10));
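As a side note on the priorities mentioned earlier, here is a minimal sketch (not part of the original answer) of pinning an entry with the newer cache; the key and value are illustrative, while MemoryCacheEntryOptions and CacheItemPriority are the actual types:

// Pin an entry so it is not evicted under memory pressure (illustrative key/value).
cache.Set("ImportantKey", "some value", new MemoryCacheEntryOptions
{
    Priority = CacheItemPriority.NeverRemove,
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
});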
Update: I just became aware of a peculiar feature of the async/await mechanism. When an incomplete Task is awaited multiple times concurrently, the continuations will run synchronously (on the same thread), one after the other (assuming that there is no synchronization context). This can be an issue for the above implementations of GetOrCreateLazyAsync, because it is possible for blocking code to exist immediately after an awaited call to GetOrCreateLazyAsync, in which case the other awaiters will be affected (delayed, or even deadlocked). A possible solution to this problem is to return an asynchronous continuation of the lazily created Task instead of the task itself, but only if the task is incomplete. This is the reason for introducing the ToAsyncConditional method above.
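To illustrate the issue described above (this snippet is not from the original answer; FetchHtmlAsync and DoCpuBoundWork are made-up names), consider blocking work placed right after the await. Without ToAsyncConditional it would run synchronously on the thread that completed the shared task, delaying the continuations of all other awaiters:

string html = await cache.GetOrCreateLazyAsync<string>("MyKey", FetchHtmlAsync, TimeSpan.FromMinutes(10));
DoCpuBoundWork(html); // without ToAsyncConditional, the other awaiters of the same
                      // (previously incomplete) task would have to wait for this call too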
The current answers use the somewhat outdated System.Runtime.Caching.MemoryCache. They also contain subtle race conditions (see comments). Finally, not all of them allow the timeout to be dependent on the value to be cached.
Here's my attempt using the new Microsoft.Extensions.Caching.Memory (used by ASP.NET Core):
//Add NuGet package: Microsoft.Extensions.Caching.Memory

using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Primitives;

MemoryCache _cache = new MemoryCache(new MemoryCacheOptions());

public Task<T> GetOrAddAsync<T>(
    string key, Func<Task<T>> factory, Func<T, TimeSpan> expirationCalculator)
{
    return _cache.GetOrCreateAsync(key, async cacheEntry =>
    {
        var cts = new CancellationTokenSource();
        cacheEntry.AddExpirationToken(new CancellationChangeToken(cts.Token));
        var value = await factory().ConfigureAwait(false);
        cts.CancelAfter(expirationCalculator(value));
        return value;
    });
}
Sample usage:
await GetOrAddAsync("foo", () => Task.Run(() => 42), i => TimeSpan.FromMilliseconds(i));
Note that it is not guaranteed that the factory method will be called only once (see https://github.com/aspnet/Caching/issues/240).
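If at most one concurrent factory invocation per key is required, one option (not part of the original answer, borrowing the SemaphoreSlim idea from the first answer) is to serialize the cache misses. GetOrAddAsyncOnce and _mutex are made-up names, and note that this single semaphore serializes the misses for all keys:

private static readonly SemaphoreSlim _mutex = new SemaphoreSlim(1, 1);

public async Task<T> GetOrAddAsyncOnce<T>(
    string key, Func<Task<T>> factory, Func<T, TimeSpan> expirationCalculator)
{
    if (_cache.TryGetValue(key, out T cached))
        return cached; // fast path: no locking for cache hits

    await _mutex.WaitAsync().ConfigureAwait(false);
    try
    {
        // Inside the semaphore, GetOrCreateAsync sees the value stored by a previous
        // caller, so the factory runs at most once per miss.
        return await GetOrAddAsync(key, factory, expirationCalculator).ConfigureAwait(false);
    }
    finally
    {
        _mutex.Release();
    }
}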