Is there any LRU implementation of IDictionary?

走了就别回头了 2020-11-29 21:20

I would like to implement a simple in-memory LRU cache system, and I was thinking about a solution based on an IDictionary implementation which could handle a hashed LRU mechanism.

10 answers
  • 2020-11-29 21:32

    I implemented a thread-safe pseudo-LRU designed for concurrent workloads. Performance is very close to ConcurrentDictionary, ~10x faster than MemoryCache, and the hit rate is better than a conventional LRU. A full analysis is provided in the GitHub link below.

    Usage looks like this:

    int capacity = 666;
    var lru = new ConcurrentLru<int, SomeItem>(capacity);
    
    var value = lru.GetOrAdd(1, (k) => new SomeItem(k));
    

    GitHub: https://github.com/bitfaster/BitFaster.Caching

    Install-Package BitFaster.Caching
    
  • 2020-11-29 21:33

    I've recently released a class called LurchTable to address the need for a C# variant of the LinkedHashMap. A brief discussion of the LurchTable can be found here.

    Basic features (a usage sketch follows the links below):

    • Linked Concurrent Dictionary by Insertion, Modification, or Access
    • Dictionary/ConcurrentDictionary interface support
    • Peek/TryDequeue/Dequeue access to 'oldest' entry
    • Allows hard-limit on items enforced at insertion
    • Exposes events for add, update, and remove

    Source Code: http://csharptest.net/browse/src/Library/Collections/LurchTable.cs

    GitHub: https://github.com/csharptest/CSharpTest.Net.Collections

    HTML Help: http://help.csharptest.net/

    PM> Install-Package CSharpTest.Net.Collections
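
    A rough usage sketch of the features above. This is hedged: I'm going from memory of the library's API, so check the linked source for the exact constructor and member signatures (in particular whether the (LurchTableOrder, limit) constructor and LurchTableOrder.Access match your package version):

    using CSharpTest.Net.Collections;

    // Keep at most 1000 entries, linked (and evicted) by access order.
    var cache = new LurchTable<string, int>(LurchTableOrder.Access, 1000);

    // Plain IDictionary<TKey, TValue> usage.
    cache["answer"] = 42;

    int value;
    if (cache.TryGetValue("answer", out value))
    {
        // Reading the entry refreshes its position in the access ordering.
    }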

  • 2020-11-29 21:35

    I don't believe so. I've certainly seen hand-rolled ones implemented several times in various unrelated projects (which more or less confirms this; if there were one, surely at least one of those projects would have used it).

    It's pretty simple to implement, and it's usually done by creating a class which contains both a Dictionary and a List.

    The keys go in the list (in order) and the items go in the dictionary.
    When you Add a new item to the collection, the method checks the length of the list and, if it is too long, pulls out the least recently used key and evicts that key and its value from the dictionary to match. On a Get, you also move the key to the most recently used end of the list so the ordering reflects actual use. Not much more to it, really; a minimal sketch follows.
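
    A minimal, non-thread-safe sketch of that approach (my own illustration, not code from this thread), keeping the least recently used key at the front of the list:

    using System.Collections.Generic;

    public class SimpleLruCache<TKey, TValue>
    {
        private readonly int capacity;
        private readonly Dictionary<TKey, TValue> items = new Dictionary<TKey, TValue>();
        private readonly List<TKey> keys = new List<TKey>();

        public SimpleLruCache(int capacity)
        {
            this.capacity = capacity;
        }

        public bool TryGetValue(TKey key, out TValue value)
        {
            if (!this.items.TryGetValue(key, out value))
            {
                return false;
            }

            // Move the key to the end to mark it as most recently used.
            // List.Remove is O(n), which is why the fuller implementations in
            // this thread use a LinkedList of nodes instead.
            this.keys.Remove(key);
            this.keys.Add(key);
            return true;
        }

        public void Add(TKey key, TValue value)
        {
            if (this.keys.Count >= this.capacity)
            {
                // Evict the least recently used entry from both collections.
                TKey oldest = this.keys[0];
                this.keys.RemoveAt(0);
                this.items.Remove(oldest);
            }

            this.keys.Add(key);
            this.items[key] = value;
        }
    }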

  • 2020-11-29 21:37

    I like Lawrence's implementation. Hashtable + LinkedList is a good solution.

    Regarding threading, I would not lock this with [MethodImpl(MethodImplOptions.Synchronized)], but rather use a ReaderWriterLockSlim or a spin lock (since contention is usually brief) instead.

    In the Get function, I would first check whether the entry is already in the most recently used position, rather than always removing and re-adding it. That makes it possible to keep the common case within a reader lock that does not block other readers; a sketch of that approach follows.
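
    A minimal, self-contained sketch of that locking strategy. This is my own illustration using the same Dictionary + LinkedList layout as the implementation elsewhere in this thread (most recently used node at the tail of the list); the class and member names are not from any answer here:

    using System.Collections.Generic;
    using System.Threading;

    public class RwLockLruCache<K, V>
    {
        private readonly int capacity;
        private readonly ReaderWriterLockSlim cacheLock = new ReaderWriterLockSlim();
        private readonly Dictionary<K, LinkedListNode<KeyValuePair<K, V>>> cacheMap =
            new Dictionary<K, LinkedListNode<KeyValuePair<K, V>>>();
        private readonly LinkedList<KeyValuePair<K, V>> lruList =
            new LinkedList<KeyValuePair<K, V>>();

        public RwLockLruCache(int capacity)
        {
            this.capacity = capacity;
        }

        public bool TryGet(K key, out V value)
        {
            cacheLock.EnterReadLock();
            try
            {
                LinkedListNode<KeyValuePair<K, V>> node;
                if (!cacheMap.TryGetValue(key, out node))
                {
                    value = default(V);
                    return false;
                }

                value = node.Value.Value;
                if (node == lruList.Last)
                {
                    // Fast path: already the most recently used entry, so nothing
                    // needs to move and other readers are never blocked.
                    return true;
                }
            }
            finally
            {
                cacheLock.ExitReadLock();
            }

            // Slow path: promote the entry under the write lock, re-reading it
            // because the list may have changed after the read lock was released.
            cacheLock.EnterWriteLock();
            try
            {
                LinkedListNode<KeyValuePair<K, V>> node;
                if (!cacheMap.TryGetValue(key, out node))
                {
                    value = default(V);
                    return false;
                }

                value = node.Value.Value;
                lruList.Remove(node);
                lruList.AddLast(node);
                return true;
            }
            finally
            {
                cacheLock.ExitWriteLock();
            }
        }

        // Like the other implementations in this thread, Add assumes the key is
        // not already present (Dictionary.Add throws on duplicate keys).
        public void Add(K key, V val)
        {
            cacheLock.EnterWriteLock();
            try
            {
                if (cacheMap.Count >= capacity)
                {
                    // Evict the least recently used entry (head of the list).
                    LinkedListNode<KeyValuePair<K, V>> oldest = lruList.First;
                    lruList.RemoveFirst();
                    cacheMap.Remove(oldest.Value.Key);
                }

                var node = new LinkedListNode<KeyValuePair<K, V>>(
                    new KeyValuePair<K, V>(key, val));
                lruList.AddLast(node);
                cacheMap.Add(key, node);
            }
            finally
            {
                cacheLock.ExitWriteLock();
            }
        }
    }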

  • 2020-11-29 21:38

    This is a very simple and fast implementation that we developed for a web site we own.

    We tried to improve the code as much as possible while keeping it thread safe. I think the code is very simple and clear, but if you need some explanation or a guide on how to use it, don't hesitate to ask (a short usage sketch follows the code).

    namespace LRUCache
    {
        using System.Collections.Generic;
        using System.Runtime.CompilerServices;

        public class LRUCache<K,V>
        {
            private int capacity;
            private Dictionary<K, LinkedListNode<LRUCacheItem<K, V>>> cacheMap = new Dictionary<K, LinkedListNode<LRUCacheItem<K, V>>>();
            private LinkedList<LRUCacheItem<K, V>> lruList = new LinkedList<LRUCacheItem<K, V>>();
    
            public LRUCache(int capacity)
            {
                this.capacity = capacity;
            }
    
            // Returns the cached value and marks it as most recently used, or
            // default(V) if the key is not present.
            [MethodImpl(MethodImplOptions.Synchronized)]
            public V get(K key)
            {
                LinkedListNode<LRUCacheItem<K, V>> node;
                if (cacheMap.TryGetValue(key, out node))
                {
                    V value = node.Value.value;
                    lruList.Remove(node);
                    lruList.AddLast(node);
                    return value;
                }
                return default(V);
            }
    
            // Adds a new entry, evicting the least recently used entry when the
            // cache is at capacity. Assumes the key is not already present
            // (Dictionary.Add throws on duplicate keys).
            [MethodImpl(MethodImplOptions.Synchronized)]
            public void add(K key, V val)
            {
                if (cacheMap.Count >= capacity)
                {
                    RemoveFirst();
                }
    
                LRUCacheItem<K, V> cacheItem = new LRUCacheItem<K, V>(key, val);
                LinkedListNode<LRUCacheItem<K, V>> node = new LinkedListNode<LRUCacheItem<K, V>>(cacheItem);
                lruList.AddLast(node);
                cacheMap.Add(key, node);
            }
    
            private void RemoveFirst()
            {
                // Remove from LRUPriority
                LinkedListNode<LRUCacheItem<K,V>> node = lruList.First;
                lruList.RemoveFirst();
    
                // Remove from cache
                cacheMap.Remove(node.Value.key);
            }
        }
    
        class LRUCacheItem<K,V>
        {
            public LRUCacheItem(K k, V v)
            {
                key = k;
                value = v;
            }
            public K key;
            public V value;
        }
    }
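
    For illustration, a short usage sketch of the class above (LoadValue is a hypothetical placeholder for whatever produces your values):

    var cache = new LRUCache.LRUCache<string, string>(100);

    if (cache.get("answer") == null)             // miss: get returns default(V)
    {
        cache.add("answer", LoadValue("answer"));
    }

    string value = cache.get("answer");          // hit: also marks it most recently used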
    
  • 2020-11-29 21:41

    This takes Martin's code with Mr T's suggestions and makes it StyleCop friendly. Oh, it also allows for disposal of values as they cycle out of the cache (a short usage sketch follows the code).

    namespace LruCache
    {
        using System;
        using System.Collections.Generic;
    
        /// <summary>
        /// A least-recently-used cache stored like a dictionary.
        /// </summary>
        /// <typeparam name="TKey">
        /// The type of the key to the cached item
        /// </typeparam>
        /// <typeparam name="TValue">
        /// The type of the cached item.
        /// </typeparam>
        /// <remarks>
        /// Derived from https://stackoverflow.com/a/3719378/240845
        /// </remarks>
        public class LruCache<TKey, TValue>
        {
            private readonly Dictionary<TKey, LinkedListNode<LruCacheItem>> cacheMap =
                new Dictionary<TKey, LinkedListNode<LruCacheItem>>();
    
            private readonly LinkedList<LruCacheItem> lruList =
                new LinkedList<LruCacheItem>();
    
            private readonly Action<TValue> dispose;
    
            /// <summary>
            /// Initializes a new instance of the <see cref="LruCache{TKey, TValue}"/>
            /// class.
            /// </summary>
            /// <param name="capacity">
            /// Maximum number of elements to cache.
            /// </param>
            /// <param name="dispose">
            /// When elements cycle out of the cache, disposes them. May be null.
            /// </param>
            public LruCache(int capacity, Action<TValue> dispose = null)
            {
                this.Capacity = capacity;
                this.dispose = dispose;
            }
    
            /// <summary>
            /// Gets the capacity of the cache.
            /// </summary>
            public int Capacity { get; }
    
            /// <summary>Gets the value associated with the specified key.</summary>
            /// <param name="key">
            /// The key of the value to get.
            /// </param>
            /// <param name="value">
            /// When this method returns, contains the value associated with the specified
            /// key, if the key is found; otherwise, the default value for the type of the 
            /// <paramref name="value" /> parameter. This parameter is passed
            /// uninitialized.
            /// </param>
            /// <returns>
            /// true if the <see cref="T:System.Collections.Generic.Dictionary`2" /> 
            /// contains an element with the specified key; otherwise, false.
            /// </returns>
            public bool TryGetValue(TKey key, out TValue value)
            {
                lock (this.cacheMap)
                {
                    LinkedListNode<LruCacheItem> node;
                    if (this.cacheMap.TryGetValue(key, out node))
                    {
                        value = node.Value.Value;
                        this.lruList.Remove(node);
                        this.lruList.AddLast(node);
                        return true;
                    }
    
                    value = default(TValue);
                    return false;
                }
            }
    
            /// <summary>
            /// Looks for a value for the matching <paramref name="key"/>. If not found, 
            /// calls <paramref name="valueGenerator"/> to retrieve the value and add it to
            /// the cache.
            /// </summary>
            /// <param name="key">
            /// The key of the value to look up.
            /// </param>
            /// <param name="valueGenerator">
            /// Generates a value if one isn't found.
            /// </param>
            /// <returns>
            /// The requested value.
            /// </returns>
            public TValue Get(TKey key, Func<TValue> valueGenerator)
            {
                lock (this.cacheMap)
                {
                    LinkedListNode<LruCacheItem> node;
                    TValue value;
                    if (this.cacheMap.TryGetValue(key, out node))
                    {
                        value = node.Value.Value;
                        this.lruList.Remove(node);
                        this.lruList.AddLast(node);
                    }
                    else
                    {
                        value = valueGenerator();
                        if (this.cacheMap.Count >= this.Capacity)
                        {
                            this.RemoveFirst();
                        }
    
                        LruCacheItem cacheItem = new LruCacheItem(key, value);
                        node = new LinkedListNode<LruCacheItem>(cacheItem);
                        this.lruList.AddLast(node);
                        this.cacheMap.Add(key, node);
                    }
    
                    return value;
                }
            }
    
            /// <summary>
            /// Adds the specified key and value to the dictionary.
            /// </summary>
            /// <param name="key">
            /// The key of the element to add.
            /// </param>
            /// <param name="value">
            /// The value of the element to add. The value can be null for reference types.
            /// </param>
            public void Add(TKey key, TValue value)
            {
                lock (this.cacheMap)
                {
                    if (this.cacheMap.Count >= this.Capacity)
                    {
                        this.RemoveFirst();
                    }
    
                    LruCacheItem cacheItem = new LruCacheItem(key, value);
                    LinkedListNode<LruCacheItem> node = 
                        new LinkedListNode<LruCacheItem>(cacheItem);
                    this.lruList.AddLast(node);
                    this.cacheMap.Add(key, node);
                }
            }
    
            private void RemoveFirst()
            {
                // Remove from LRUPriority
                LinkedListNode<LruCacheItem> node = this.lruList.First;
                this.lruList.RemoveFirst();
    
                // Remove from cache
                this.cacheMap.Remove(node.Value.Key);
    
                // dispose
                this.dispose?.Invoke(node.Value.Value);
            }
    
            private class LruCacheItem
            {
                public LruCacheItem(TKey k, TValue v)
                {
                    this.Key = k;
                    this.Value = v;
                }
    
                public TKey Key { get; }
    
                public TValue Value { get; }
            }
        }
    }
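
    For illustration, a short usage sketch showing the dispose callback firing as entries cycle out of the cache (the FileStream values and file name are just an example):

    var cache = new LruCache.LruCache<string, System.IO.FileStream>(
        capacity: 16,
        dispose: stream => stream.Dispose());

    System.IO.FileStream log = cache.Get(
        "app.log",
        () => System.IO.File.OpenRead("app.log"));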
    