Java HashMap.get(Object) infinite loop

温柔的废话 2021-02-03 23:23

A few answers on SO mention that the get method in a HashMap can fall into an infinite loop (e.g. this one or this one) if it is not synchronized properly (and usually the bottom line is: don't share an unsynchronized HashMap across threads).

3 Answers
  • 2021-02-03 23:52

    Your link is for the HashMap in Java 6. It was rewritten in Java 8. Prior to this rewrite, an infinite loop on get(Object) was possible if there were two writing threads. I am not aware of a way the infinite loop on get can occur with a single writer.

    Specifically, the infinite loop occurs when there are two simultaneous calls to resize(int) which calls transfer:

    void transfer(Entry[] newTable, boolean rehash) {
        int newCapacity = newTable.length;
        for (Entry<K,V> e : table) {
            while (null != e) {
                Entry<K,V> next = e.next;
                if (rehash) {
                    e.hash = null == e.key ? 0 : hash(e.key);
                }
                int i = indexFor(e.hash, newCapacity);
                e.next = newTable[i]; // head insertion: reverses the bucket's order
                newTable[i] = e;
                e = next;
            }
        }
    }
    

    This logic reverses the ordering of the nodes in the hash bucket. Two simultaneous reversals can make a loop.

    Look at:

                 e.next = newTable[i];
                 newTable[i] = e;
    

    If two threads are processing the same node e, the first thread executes normally, but the second thread sets e.next = e, because newTable[i] has already been set to e by the first thread. The node e now points to itself, and when get(Object) traverses that bucket it enters an infinite loop.
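    The interleaving can be replayed deterministically in a single thread. The sketch below uses a hypothetical Node class as a stand-in for HashMap.Entry and performs the two head-insertion reversals in the order described above; it is an illustration of the race, not JDK code:

    ```java
    public class ResizeRaceDemo {
        // Stand-in for HashMap.Entry; names here are illustrative.
        static class Node {
            final String key;
            Node next;
            Node(String key, Node next) { this.key = key; this.next = next; }
        }

        /** Replays the two-thread interleaving; returns true if a cycle formed. */
        static boolean demonstrateCycle() {
            // Old bucket: A -> B -> null; both nodes hash to the same new index.
            Node b = new Node("B", null);
            Node a = new Node("A", b);

            // Thread 1 begins its transfer: it reads e = A and next = B, then is suspended.
            Node e1 = a;
            Node next1 = e1.next;

            // Thread 2 runs its entire transfer. Head insertion reverses the bucket,
            // leaving B -> A -> null. It mutates the shared nodes' next pointers.
            Node head2 = null;
            for (Node e = a; e != null; ) {
                Node next = e.next;
                e.next = head2;
                head2 = e;
                e = next;
            }

            // Thread 1 resumes with its stale e1/next1 and builds its own bucket head.
            Node head1 = null;
            Node e = e1, next = next1;
            while (e != null) {
                e.next = head1;
                head1 = e;
                e = next;
                next = (e == null) ? null : e.next; // re-reads links Thread 2 already flipped
            }

            // A and B now point at each other: get() on this bucket would never terminate.
            return a.next == b && b.next == a;
        }

        public static void main(String[] args) {
            System.out.println("cycle formed: " + demonstrateCycle());
        }
    }
    ```

    Note that the cycle here spans two nodes (A and B point at each other); a single node pointing at itself is just the smallest case of the same corruption.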

    In Java 8, the resize maintains the node ordering so a loop cannot occur in this fashion. You can lose data though.

    The iterators of the LinkedHashMap class can get stuck in an infinite loop when there are multiple readers and no writers, if access ordering is being maintained. With access order, every read removes the accessed node from a doubly linked list of nodes and reinserts it at the tail. Multiple readers can lead to the same node being reinserted more than once into the list, causing a loop. Again, the class was rewritten for Java 8, and I do not know whether this issue still exists.
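    The access-order mutation is easy to observe in a single thread. This sketch (class and method names are mine) shows that with accessOrder = true, a plain get() restructures the internal linked list, which is why concurrent readers are dangerous:

    ```java
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class AccessOrderDemo {
        static String orderAfterGet() {
            // Third constructor argument accessOrder = true: every get()
            // unlinks the entry and reinserts it at the tail of the list.
            Map<String, Integer> map = new LinkedHashMap<>(16, 0.75f, true);
            map.put("a", 1);
            map.put("b", 2);
            map.put("c", 3);
            map.get("a"); // a read that structurally modifies the map
            return String.join(",", map.keySet());
        }

        public static void main(String[] args) {
            System.out.println(orderAfterGet()); // prints "b,c,a"
        }
    }
    ```

    A plain get() moving "a" to the end is exactly the remove-and-reinsert step described above; two unsynchronized readers doing it at once can corrupt the list.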

  • 2021-02-03 23:52

    Situation:

    The default capacity of HashMap is 16 and the default load factor is 0.75, which means the HashMap will double its capacity when the 12th key-value pair enters the map (16 * 0.75 = 12).
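    As a quick check of that arithmetic (the constants 16 and 0.75f match java.util.HashMap's defaults; the helper method is mine):

    ```java
    public class ThresholdDemo {
        // Number of entries allowed before the table doubles in size.
        static int threshold(int capacity, float loadFactor) {
            return (int) (capacity * loadFactor);
        }

        public static void main(String[] args) {
            System.out.println(threshold(16, 0.75f)); // prints 12
        }
    }
    ```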

    When two threads try to access the HashMap simultaneously, you may encounter an infinite loop. Suppose Thread 1 and Thread 2 both try to put the 12th key-value pair.

    Thread 1 gets a chance to execute:

    1. Thread 1 tries to put the 12th key-value pair.
    2. Thread 1 finds that the threshold has been reached, so it creates new buckets of increased capacity; the map's capacity grows from 16 to 32.
    3. Thread 1 starts transferring all existing key-value pairs to the new buckets.
    4. Thread 1 points to the first key-value pair and the next (second) key-value pair to begin the transfer.

    After pointing to these key-value pairs, but before starting the transfer, Thread 1 loses control and Thread 2 gets a chance to execute.

    Thread 2 gets a chance to execute:

    1. Thread 2 tries to put the 12th key-value pair.
    2. Thread 2 finds that the threshold has been reached, so it creates new buckets of increased capacity; the map's capacity grows from 16 to 32.
    3. Thread 2 starts transferring all existing key-value pairs to the new buckets.
    4. Thread 2 points to the first key-value pair and the next (second) key-value pair to begin the transfer.
    5. While transferring key-value pairs from the old buckets to the new buckets, the pairs are reversed in the new buckets, because HashMap inserts each pair at the head of the bucket rather than the tail. Head insertion avoids traversing the linked list on every insert, keeping insertion constant-time.
    6. Thread 2 transfers all key-value pairs from the old buckets to the new buckets, and then Thread 1 gets a chance to execute again.

    Thread 1 gets a chance to execute again:

    1. Before losing control, Thread 1 was still pointing to the first element and the next element of the old bucket.
    2. Now when Thread 1 resumes moving key-value pairs from the old bucket to the new bucket, it successfully puts (90, val) and (1, val) into the new bucket.
    3. When it tries to add the next element of (1, val), which is (90, val) again because Thread 2 reversed the links, it creates a loop, and any later call to get on that bucket spins forever.

    Solution:

    To solve this, use either Collections.synchronizedMap or ConcurrentHashMap.

    ConcurrentHashMap is thread-safe without locking the whole map: multiple threads can read and write it concurrently, because writes use fine-grained locking internally rather than a single map-wide lock.

    A HashMap can be synchronized using the Collections.synchronizedMap(hashMap) method. This returns a Map wrapper that is roughly equivalent to a Hashtable: every modification of the Map locks the whole wrapper object.
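    A minimal sketch of the ConcurrentHashMap fix (the class and method names here are illustrative, not from the original answer): two unsynchronized writers on a plain HashMap could corrupt a bucket during a resize, whereas ConcurrentHashMap coordinates its resizes internally:

    ```java
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class SafeMapDemo {
        /** Two writers racing on the same keys; resizes are coordinated internally. */
        static int concurrentFill() throws InterruptedException {
            Map<Integer, Integer> map = new ConcurrentHashMap<>();
            Runnable writer = () -> {
                for (int i = 0; i < 10_000; i++) {
                    map.put(i, i); // safe even while the table is being resized
                }
            };
            Thread t1 = new Thread(writer);
            Thread t2 = new Thread(writer);
            t1.start();
            t2.start();
            t1.join();
            t2.join();
            return map.size(); // always 10_000: no lost entries, no cyclic buckets
        }

        public static void main(String[] args) throws InterruptedException {
            System.out.println("entries: " + concurrentFill());
        }
    }
    ```

    With a plain HashMap in place of the ConcurrentHashMap, the same program may lose entries or, on the pre-Java-8 implementation, hang in an infinite loop.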

  • 2021-02-04 00:03

    Given that the only possibility I see for an infinite loop would be e.next = e within the get method:

    for (Entry<K,V> e = table[indexFor(hash, table.length)]; e != null; e = e.next)
    

    And that could only happen in the transfer method during a resizing:

     do {
         Entry<K,V> next = e.next;
         int i = indexFor(e.hash, newCapacity);
         e.next = newTable[i]; //here e.next could point on e if the table is modified by another thread
         newTable[i] = e;
         e = next;
     } while (e != null);
    

    If only one thread is modifying the Map, I believe an infinite loop is quite impossible. It was more obvious with the old implementation of get, before JDK 6 (or 5):

    public Object get(Object key) {
        Object k = maskNull(key);
        int hash = hash(k);
        int i = indexFor(hash, table.length);
        Entry e = table[i];
        while (true) {
            if (e == null)
                return e;
            if (e.hash == hash && eq(k, e.key))
                return e.value;
            e = e.next;
        }
    }
    

    Even then, the case still seems pretty improbable unless there are a lot of collisions.
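    For what it's worth, the corrupted state is an ordinary linked-list cycle, so it can be detected with Floyd's tortoise-and-hare algorithm. This helper is my own sketch, not part of HashMap:

    ```java
    public class CycleCheck {
        // Minimal stand-in for a bucket node; only the next link matters here.
        static class Node {
            Node next;
        }

        // Floyd's tortoise-and-hare: true if the list starting at head loops.
        static boolean hasCycle(Node head) {
            Node slow = head, fast = head;
            while (fast != null && fast.next != null) {
                slow = slow.next;        // advances one node per step
                fast = fast.next.next;   // advances two nodes per step
                if (slow == fast) return true; // they can only meet inside a cycle
            }
            return false; // fast fell off the end: the list terminates
        }

        static boolean demo() {
            Node a = new Node();
            Node b = new Node();
            a.next = b;
            b.next = a; // the corrupted two-node bucket described above
            return hasCycle(a) && !hasCycle(new Node());
        }

        public static void main(String[] args) {
            System.out.println("cycle detected: " + demo());
        }
    }
    ```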

    P.S: I'd love to be proven wrong though!
