Recursive ConcurrentHashMap.computeIfAbsent() call never terminates. Bug or “feature”?

隐瞒了意图╮ 2020-11-28 05:51

Some time ago, I've blogged about a Java 8 functional way of calculating fibonacci numbers recursively, with a ConcurrentHashMap cache and the new, useful computeIfAbsent() method.
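The cache from the blog post presumably looks roughly like this (a reconstruction from the question's description; the class and method names are my assumption, not the poster's exact code):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class Fibonacci {

    static Map<Integer, Integer> cache = new ConcurrentHashMap<>();

    static int fibonacci(int i) {
        if (i == 0)
            return i;
        if (i == 1)
            return 1;

        // The recursive calls run INSIDE the mapping function, i.e. while
        // computeIfAbsent() is already updating the map. This violates the
        // documented contract and, for larger inputs, can livelock.
        return cache.computeIfAbsent(i, key ->
            fibonacci(i - 2) + fibonacci(i - 1));
    }
}
```

Small inputs whose recursive calls bottom out in the base cases (and thus never re-enter the map) still return, which is what makes the hang so easy to miss in testing.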

3 Answers
  • 2020-11-28 06:10

    This is fixed in JDK-8062841.

    In the 2011 proposal, I identified this issue during the code review. The JavaDoc was updated and a temporary fix was added, but it was removed in a later rewrite due to performance issues.

    In the 2014 discussion, we explored ways to better detect and fail. Note that some of the discussion was taken offline to private email for considering the low-level changes. While not every case can be covered, the common cases will not livelock. These fixes are in Doug's repository but have not made it into a JDK release.

  • 2020-11-28 06:18

    This is of course a "feature". The ConcurrentHashMap.computeIfAbsent() Javadoc reads:

    If the specified key is not already associated with a value, attempts to compute its value using the given mapping function and enters it into this map unless null. The entire method invocation is performed atomically, so the function is applied at most once per key. Some attempted update operations on this map by other threads may be blocked while computation is in progress, so the computation should be short and simple, and must not attempt to update any other mappings of this map.

    The "must not" wording is a clear contract, which my algorithm violated, although not for the same concurrency reasons.

    What's still interesting is that there is no ConcurrentModificationException. Instead, the program simply never halts, which is still a rather dangerous bug in my opinion (infinite loops; or: anything that can possibly go wrong, does).

    Note:

    The HashMap.computeIfAbsent() and Map.computeIfAbsent() Javadocs don't forbid such recursive computation, which is of course ridiculous, as the declared type of the cache is Map<Integer, Integer>, not ConcurrentHashMap<Integer, Integer>. It is very dangerous for subtypes to drastically redefine super type contracts (compare Set vs. SortedSet). Such recursion should thus be forbidden in the super types as well.

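A contract-respecting rewrite (a sketch of mine, not the poster's code) keeps the recursion outside of computeIfAbsent() entirely: compute the value first, then publish it with putIfAbsent():

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SafeFibonacci {

    static Map<Integer, Integer> cache = new ConcurrentHashMap<>();

    static int fibonacci(int i) {
        if (i <= 1)
            return i;

        Integer cached = cache.get(i);
        if (cached != null)
            return cached;

        // Recurse OUTSIDE of any map update, so no mapping function is
        // running while the map is being read or written.
        int result = fibonacci(i - 2) + fibonacci(i - 1);

        // Benign race: two threads may compute the same value, but
        // putIfAbsent() keeps the map consistent either way.
        cache.putIfAbsent(i, result);
        return result;
    }
}
```

The trade-off is that the computation is no longer atomic per key, so concurrent callers may duplicate work; for a pure function like fibonacci that is harmless.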
  • 2020-11-28 06:30

    This is very similar to that bug, because if you create your cache with capacity 32, your program will work up to 49. And interestingly, the internal parameter sizeCtl = 32 + (32 >>> 1) + 1 = 49! Maybe resizing is the reason?

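The 49 in this answer falls out of the initial-capacity sizing arithmetic in the ConcurrentHashMap(int) constructor. A minimal sketch of that calculation (the helper name is mine, for illustration only):

```java
public class SizeCtlDemo {

    // Mirrors the sizing expression applied to the requested initial
    // capacity in the ConcurrentHashMap(int) constructor:
    // initialCapacity + (initialCapacity >>> 1) + 1
    static int threshold(int initialCapacity) {
        return initialCapacity + (initialCapacity >>> 1) + 1;
    }
}
```

For a requested capacity of 32 this yields 32 + 16 + 1 = 49, matching the observed cutoff.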