I'm looking to implement a simple cache without doing too much work (naturally). It seems to me that one of the standard Java collections ought to suffice, with a little extra work.
If you use SoftReference-based values, the VM will bias (strongly) towards keeping recently accessed objects. However, it would be quite difficult to pin down the caching semantics: the only guarantee a SoftReference gives you (over a WeakReference) is that it will be cleared before an OutOfMemoryError is thrown. It would be perfectly legal for a JVM implementation to treat them identically to WeakReferences, at which point you could end up with a cache that doesn't cache anything.
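To illustrate the point, here's a minimal sketch (the class name is mine, not from any library) of a map whose values are held through SoftReferences; note that every lookup has to cope with the value having been silently cleared by the GC:

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

// Rough sketch of a soft-reference cache. Values may be cleared by the GC at
// any time, so get() can return null even for keys that were recently put().
public class SoftCache<K, V> {
    private final Map<K, SoftReference<V>> map = new HashMap<>();

    public synchronized void put(K key, V value) {
        map.put(key, new SoftReference<>(value));
    }

    public synchronized V get(K key) {
        SoftReference<V> ref = map.get(key);
        if (ref == null) {
            return null;             // never cached
        }
        V value = ref.get();
        if (value == null) {
            map.remove(key);         // the GC cleared it; drop the stale entry
            return null;             // caller must treat this as a cache miss
        }
        return value;
    }
}
```

How often those misses happen is entirely up to the collector's soft-reference policy, which is exactly the unpredictability described below.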
I don't know how things work on Android, but with Sun's recent JVMs one can tweak the SoftReference behaviour with the -XX:SoftRefLRUPolicyMSPerMB command-line option, which determines the number of milliseconds that a softly-reachable object will be retained for, per MB of free memory in the heap. As you can see, this makes it exceptionally difficult to get any predictable lifespan behaviour, with the added pain that the setting is global for all soft references in the VM and can't be tweaked separately for individual classes' use of SoftReferences (chances are each use will want different parameters).
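For what it's worth, the option is passed like any other -XX flag when launching the VM; the value below is arbitrary and myapp.jar is just a placeholder (the HotSpot default is 1000 ms per MB of free heap):

```
# keep softly-reachable objects for ~2500 ms per MB of free heap
java -XX:SoftRefLRUPolicyMSPerMB=2500 -jar myapp.jar
```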
The simplest way to make an LRU cache is by extending LinkedHashMap as described here. Since you need thread safety, the simplest option initially is to wrap an instance of this custom class with Collections.synchronizedMap to ensure safe concurrent access.
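A minimal sketch of that approach (the class name and capacity here are illustrative, not taken from the linked example) might look like this:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

// LinkedHashMap with accessOrder=true iterates least-recently-accessed first,
// and overriding removeEldestEntry evicts the oldest entry once the cache is full.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int maxEntries) {
        super(16, 0.75f, true);   // true = access order, which gives LRU behaviour
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }

    // Coarse-grained thread safety via a synchronized wrapper, as described above.
    public static <K, V> Map<K, V> newSynchronizedCache(int maxEntries) {
        return Collections.synchronizedMap(new LruCache<>(maxEntries));
    }
}
```

Bear in mind that iterating over the synchronized wrapper still requires manual synchronization on the map, as documented for Collections.synchronizedMap.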
Beware premature optimisation - unless you need very high throughput, the theoretical overhead of the coarse synchronisation is not likely to be an issue. And the good news - if profiling shows that you are performing too slowly due to heavy lock contention, you'll have enough information about the runtime use of your cache to come up with a suitable lockless alternative (probably based on ConcurrentHashMap with some manual LRU treatment) rather than having to guess at its load profile.