I have been using a LinkedHashMap with accessOrder set to true, capped at a maximum of 500 entries at any time, as the LRU cache for my data. But due to scalability concerns I am looking for a more concurrency-friendly alternative.
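Roughly this kind of setup (sketched from memory; the 500-entry cap is enforced via removeEldestEntry):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// accessOrder = true gives least-recently-used iteration order, and
// removeEldestEntry caps the map at 500 entries.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private static final int MAX_ENTRIES = 500;

    public LruCache() {
        super(16, 0.75f, true); // third argument = accessOrder
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > MAX_ENTRIES;
    }
}
```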
I did something similar recently with ConcurrentHashMap, where a CacheEntry wraps the actual item and adds cache eviction statistics: expiration time, insertion time (for FIFO/LIFO eviction), last used time (for LRU/MRU eviction), number of hits (for LFU/MFU eviction), and so on. The actual eviction is synchronized: it copies the entries into an ArrayList and runs Collections.sort() on it using the appropriate Comparator for the eviction strategy. Since that pass is expensive, each eviction lops off the bottom 5% of the CacheEntries rather than a single entry. I'm sure performance tuning would help, though.
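A rough sketch of that shape, assuming an LRU comparator and omitting the expiration-time field for brevity; the class, field, and method names here are mine, not the original code:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Wrapper carrying the per-entry eviction statistics described above.
class CacheEntry<V> {
    final V value;
    final long insertedAt = System.nanoTime(); // for FIFO/LIFO eviction
    volatile long lastUsedAt = insertedAt;     // for LRU/MRU eviction
    volatile long hits;                        // for LFU/MFU eviction (approximate)

    CacheEntry(V value) { this.value = value; }
}

class StatsCache<K, V> {
    private final ConcurrentHashMap<K, CacheEntry<V>> map = new ConcurrentHashMap<>();
    private final int maxEntries;

    StatsCache(int maxEntries) { this.maxEntries = maxEntries; }

    V get(K key) {
        CacheEntry<V> entry = map.get(key);
        if (entry == null) return null;
        entry.lastUsedAt = System.nanoTime();
        entry.hits++; // not atomic, but rough counts are enough for eviction
        return entry.value;
    }

    void put(K key, V value) {
        map.put(key, new CacheEntry<>(value));
        if (map.size() > maxEntries) evict();
    }

    // The expensive part: sort all entries by the chosen strategy (LRU here)
    // and drop the bottom 5% in one pass, under a lock.
    private synchronized void evict() {
        List<Map.Entry<K, CacheEntry<V>>> entries = new ArrayList<>(map.entrySet());
        Collections.sort(entries,
                Comparator.comparingLong((Map.Entry<K, CacheEntry<V>> e) -> e.getValue().lastUsedAt));
        int toRemove = Math.max(1, entries.size() / 20); // bottom 5%
        for (int i = 0; i < toRemove; i++) {
            map.remove(entries.get(i).getKey());
        }
    }
}
```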
In your case, since you're doing FIFO, you could keep a separate ConcurrentLinkedQueue. When you add an object to the ConcurrentHashMap, do a ConcurrentLinkedQueue.add() of that object. When you want to evict an entry, do a ConcurrentLinkedQueue.poll() to remove the oldest object, then remove it from the ConcurrentHashMap as well.
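Something like this sketch; I've queued the keys rather than the objects themselves so the map removal is straightforward:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

// FIFO eviction: the queue remembers insertion order, the map holds the data.
class FifoCache<K, V> {
    private final ConcurrentHashMap<K, V> map = new ConcurrentHashMap<>();
    private final ConcurrentLinkedQueue<K> insertionOrder = new ConcurrentLinkedQueue<>();
    private final int maxEntries;

    FifoCache(int maxEntries) { this.maxEntries = maxEntries; }

    V get(K key) {
        return map.get(key);
    }

    void put(K key, V value) {
        map.put(key, value);
        insertionOrder.add(key);               // record insertion order
        while (map.size() > maxEntries) {
            K oldest = insertionOrder.poll();  // oldest key first (FIFO)
            if (oldest == null) break;
            map.remove(oldest);
        }
    }
}
```

Note that this sketch doesn't deduplicate keys in the queue, so re-inserting an existing key shortens its effective lifetime; whether that matters depends on your usage pattern.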
Update: Other possibilities in this area include the Java Collections synchronization wrapper (Collections.synchronizedMap()) and the Java 1.6 ConcurrentSkipListMap.
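For the synchronization wrapper, the simplest option would be keeping your existing accessOrder LinkedHashMap and wrapping it; a sketch, with the types and the 500-entry cap just carried over from your question:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

class SynchronizedLru {
    // Every operation goes through one lock, so this is thread-safe but won't
    // scale as well under contention as the ConcurrentHashMap approaches above.
    static final Map<String, Object> CACHE = Collections.synchronizedMap(
            new LinkedHashMap<String, Object>(16, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<String, Object> eldest) {
                    return size() > 500;
                }
            });
}
```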