Java code to prevent duplicate pairs in HashMap/Hashtable

梦谈多话 2021-01-17 02:32

I have a HashMap as below (assume it has 100,000 elements):

HashMap hm = new HashMap();
hm.put(
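The original snippet is cut off, but the core behavior it relies on can be sketched. A `HashMap` already keeps keys unique: a second `put` with an equal key overwrites the old value instead of creating a duplicate entry, and `putIfAbsent` keeps the first value instead. A minimal sketch (the `String`/`Integer` key and value types here are assumptions, since the original snippet is truncated):

```java
import java.util.HashMap;

public class NoDuplicateKeys {
    public static void main(String[] args) {
        HashMap<String, Integer> hm = new HashMap<>();
        hm.put("apple", 1);
        hm.put("apple", 2);      // equal key: overwrites, no duplicate entry

        System.out.println(hm.size());        // prints 1
        System.out.println(hm.get("apple"));  // prints 2

        // putIfAbsent keeps the existing mapping instead of overwriting it
        hm.putIfAbsent("apple", 3);
        System.out.println(hm.get("apple"));  // still prints 2
    }
}
```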

7 Answers
  •  隐瞒了意图╮
    2021-01-17 03:00

    This may be an old question, but I thought I'd share my experience with it. As others pointed out, you can't have duplicate keys in a HashMap. By default a HashMap will not allow this, but there are cases where you can end up with two or more keys that look almost identical to you yet are accepted by the HashMap as distinct. For example, the following code defines a HashMap that uses an array of integers as a key, then adds the same array three times:

    HashMap<int[], Integer> map1 = new HashMap<>();
    int[] arr = new int[]{1, 2, 3};
    map1.put(arr, 4);
    map1.put(arr, 4);
    map1.put(arr, 4);
    
    

    Here the HashMap does not create duplicate entries, and map1.size() returns 1. However, if you add the elements without creating the array first, things are different:

    HashMap<int[], Integer> map2 = new HashMap<>();
    map2.put(new int[]{4, 5, 6}, 6);
    map2.put(new int[]{4, 5, 6}, 6);
    map2.put(new int[]{4, 5, 6}, 6);
    
    

    This time the HashMap adds all three entries, so map2.size() returns 3, not 1 as you might expect.

    The explanation: with the first map I created the object arr once and put that same object three times. Since it is the same key each time, only the last put's value is kept, so the map holds one entry. With the second map, however, every put creates a new array object. Arrays do not override equals and hashCode, so they are compared by identity; the three arrays hold the same data but are distinct objects, which is why the HashMap treated them as three different keys.
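    If you want keys compared by content rather than identity, one common approach (a sketch under the assumption that your keys can be modeled as lists of integers) is to use a List, whose equals and hashCode are content-based:

    ```java
    import java.util.HashMap;
    import java.util.List;

    public class ValueBasedKeys {
        public static void main(String[] args) {
            // Arrays compare by identity: equal contents still make distinct keys
            HashMap<int[], Integer> byArray = new HashMap<>();
            byArray.put(new int[]{4, 5, 6}, 6);
            byArray.put(new int[]{4, 5, 6}, 6);
            System.out.println(byArray.size());  // prints 2

            // Lists compare by content: equal contents collapse to one key
            HashMap<List<Integer>, Integer> byList = new HashMap<>();
            byList.put(List.of(4, 5, 6), 6);
            byList.put(List.of(4, 5, 6), 6);
            System.out.println(byList.size());   // prints 1
        }
    }
    ```

    The same idea applies to any custom key class: override equals and hashCode consistently, and the HashMap will deduplicate keys by their contents.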

    Bottom line: you don't need to prevent a HashMap from adding duplicate keys, because it won't by design. You do, however, have to watch how you define those keys, because the fault may be on your side.
