Per-key blocking Map in Java

孤城傲影 2021-02-07 23:02

I'm dealing with some third-party library code that involves creating expensive objects and caching them in a Map. The existing implementation is something like this (a rough sketch; Key, Foo, and createFooExpensively stand in for the real types):
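    // rough sketch of the existing globally synchronized cache
    Map<Key, Foo> cache = new HashMap<>();

    synchronized Foo get(Key key) {
      Foo result = cache.get(key);
      if (result == null) {
        result = createFooExpensively(key);  // expensive, and blocks callers for every other key too
        cache.put(key, result);
      }
      return result;
    }

Because of the single lock, a slow createFooExpensively() call for one key stalls lookups for all keys. What I'd really like is a map that blocks per key instead.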

3 Answers
  •  小鲜肉
     2021-02-07 23:34

    Creating a lock per key sounds tempting, but it may not be what you want, especially when the number of keys is large.

    Since you would need a dedicated (read-write) lock object for each key, it has an impact on your memory usage, and those lock objects typically live as long as the cache itself. Also, if concurrency is truly high, that fine a granularity can hit a point of diminishing returns given a finite number of cores.
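    To make that concrete, here is a minimal sketch of the per-key-lock approach (hypothetical code; note that the lock objects accumulate and are never reclaimed):

    ConcurrentMap<Key, Object> locks = new ConcurrentHashMap<>();
    ConcurrentMap<Key, Foo> cache = new ConcurrentHashMap<>();

    Foo get(Key key) {
      // one lock object per key, kept for the life of the cache
      Object lock = locks.computeIfAbsent(key, k -> new Object());
      synchronized (lock) {  // only threads asking for the same key contend
        Foo result = cache.get(key);
        if (result == null) {
          result = createFooExpensively(key);
          cache.put(key, result);
        }
        return result;
      }
    }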

    ConcurrentHashMap is oftentimes a good enough solution in a situation like this. It normally provides full reader concurrency (readers do not block), and updates can be concurrent up to the desired concurrency level. This gives you pretty good scalability. The above code may be expressed with ConcurrentHashMap like the following:

    ConcurrentMap<Key, Foo> cache = new ConcurrentHashMap<>();
    ...
    Foo result = cache.get(key);
    if (result == null) {
      result = createFooExpensively(key);
      // putIfAbsent is atomic: if another thread won the race, keep its value
      Foo old = cache.putIfAbsent(key, result);
      if (old != null) {
        result = old;
      }
    }
    

    The straightforward use of ConcurrentHashMap does have one drawback: multiple threads may simultaneously find that the key is not cached, and each may then invoke createFooExpensively(). As a result, some threads do throw-away work. To avoid this, you would want to use the memoizer pattern described in "Java Concurrency in Practice", sketched below.
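    Roughly, the memoizer caches a Future<Foo> instead of a Foo: the FutureTask is published atomically via putIfAbsent, so at most one thread runs the computation for a given key while the others block on the same Future. A minimal sketch, adapted from the JCiP idea (using java.util.function.Function as the loader):

    import java.util.concurrent.*;
    import java.util.function.Function;

    class Memoizer<K, V> {
      private final ConcurrentMap<K, Future<V>> cache = new ConcurrentHashMap<>();
      private final Function<K, V> compute;

      Memoizer(Function<K, V> compute) {
        this.compute = compute;
      }

      V get(K key) throws InterruptedException, ExecutionException {
        while (true) {
          Future<V> f = cache.get(key);
          if (f == null) {
            FutureTask<V> task = new FutureTask<>(() -> compute.apply(key));
            f = cache.putIfAbsent(key, task);
            if (f == null) {   // we won the race: run the computation on this thread
              f = task;
              task.run();
            }
          }
          try {
            return f.get();    // other threads block here until the value is ready
          } catch (CancellationException e) {
            cache.remove(key, f);  // computation was cancelled; retry with a fresh task
          }
        }
      }
    }

    On Java 8 and later, ConcurrentHashMap.computeIfAbsent(key, k -> createFooExpensively(k)) gets you much the same per-key blocking in a single call, with the caveat that the mapping function runs while an internal lock is held, so it should be short and must not touch the map itself.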

    But then again, the nice folks at Google have already solved these problems for you in the form of Guava's CacheBuilder:

    LoadingCache<Key, Foo> cache = CacheBuilder.newBuilder()
        .concurrencyLevel(32)
        .build(new CacheLoader<Key, Foo>() {
          @Override
          public Foo load(Key key) {
            return createFooExpensively(key);
          }
        });

    ...
    Foo result = cache.get(key);  // throws ExecutionException if the loader fails
    
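    Note that LoadingCache.get() declares the checked ExecutionException; if createFooExpensively() throws only unchecked exceptions, getUnchecked() spares you the try/catch:

    Foo result = cache.getUnchecked(key);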
