I love HashSet<>() and use it eagerly, initializing it with the default constructor:
Set users = new HashSet<>();
If you look at the docs, there is a clue:
Iterating over this set requires time proportional to the sum of the HashSet instance's size (the number of elements) plus the "capacity" of the backing HashMap instance (the number of buckets). Thus, it's very important not to set the initial capacity too high (or the load factor too low) if iteration performance is important.
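As a rough illustration of that point, here is a small sketch (the class name and the capacity value are chosen just for this example) comparing iteration over two near-empty sets, one with the default capacity and one with a deliberately huge initial capacity:

import java.util.HashSet;
import java.util.Set;

public class IterationCostDemo {
    public static void main(String[] args) {
        Set<Integer> small = new HashSet<>();        // default capacity
        Set<Integer> huge = new HashSet<>(1 << 20);  // ~1 million buckets, almost all empty
        small.add(1);
        huge.add(1);

        long t0 = System.nanoTime();
        for (int ignored : small) { }
        long t1 = System.nanoTime();
        for (int ignored : huge) { }
        long t2 = System.nanoTime();

        // Iteration walks every bucket, so the huge set is far slower
        // even though both sets contain a single element.
        System.out.println("default capacity: " + (t1 - t0) + " ns");
        System.out.println("huge capacity:    " + (t2 - t1) + " ns");
    }
}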
The default initial capacity is 16, so by passing in 0 you may save a few bytes of memory if you end up not putting anything in the set.
Other than that there is no real advantage; when you pass 0 the set is created with a capacity of 1 and as soon as you add something it will have to be resized.
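One way to see that in practice is to peek at the backing table with reflection. This is only a sketch: map and table are private implementation details of HashSet/HashMap, and on JDK 16+ the reflective access needs --add-opens java.base/java.util=ALL-UNNAMED:

import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Set;

public class CapacityPeek {
    // Reads the length of the backing HashMap's bucket array via reflection.
    // "map" (in HashSet) and "table" (in HashMap) are private implementation details.
    static int capacityOf(Set<?> set) throws Exception {
        Field mapField = HashSet.class.getDeclaredField("map");
        mapField.setAccessible(true);
        HashMap<?, ?> map = (HashMap<?, ?>) mapField.get(set);
        Field tableField = HashMap.class.getDeclaredField("table");
        tableField.setAccessible(true);
        Object[] table = (Object[]) tableField.get(map);
        return table == null ? 0 : table.length;
    }

    public static void main(String[] args) throws Exception {
        Set<String> users = new HashSet<>(0);
        System.out.println(capacityOf(users)); // 0 or 1, depending on JDK version
        users.add("alice");                    // "alice" is just a sample value
        System.out.println(capacityOf(users)); // 2 - the table was resized on the first add
    }
}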
HashSet uses a HashMap to store its data:
public HashSet(int initialCapacity) {
    map = new HashMap<E,Object>(initialCapacity);
}
When initialCapacity is 0,
public HashMap(int initialCapacity, float loadFactor) {
    ....
    // Find a power of 2 >= initialCapacity
    int capacity = 1;
    while (capacity < initialCapacity)
        capacity <<= 1;
}
so the HashMap capacity is 1.
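For illustration, that rounding loop can be run standalone (the class and method names below are made up for this sketch) to see what various initialCapacity values end up as:

public class PowerOfTwoRounding {
    // Same loop as in the old HashMap constructor above, pulled out so it can
    // be run on its own.
    static int roundUp(int initialCapacity) {
        int capacity = 1;
        while (capacity < initialCapacity)
            capacity <<= 1;
        return capacity;
    }

    public static void main(String[] args) {
        for (int n : new int[] {0, 1, 3, 16, 17}) {
            System.out.println(n + " -> " + roundUp(n)); // 0->1, 1->1, 3->4, 16->16, 17->32
        }
    }
}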
But if you use the default constructor:
public HashMap() {
    this.loadFactor = DEFAULT_LOAD_FACTOR;
    threshold = (int)(DEFAULT_INITIAL_CAPACITY * DEFAULT_LOAD_FACTOR);
    table = new Entry[DEFAULT_INITIAL_CAPACITY];
    init();
}
the HashMap capacity is 16, with a resize threshold of 16 * 0.75 = 12.
So new HashSet<>(0) saves some memory at initialization.
This will set the initial capacity to the minimum.
Most likely this is used to silence code analysers, which can complain if you haven't set an initial capacity for collections. By setting it to 0 you just set it to the minimum.
It is not much of an optimisation, because as soon as you add an entry the load factor of 0.75 will make the capacity 2, recreating the Map.Entry[] in the process.
The initial capacity of a HashMap is 16 and the default load factor is 0.75. When the HashMap holds 12 entries, which is 75% of its initial capacity, it increases its size.
So here we just set the initial capacity to 0 by passing it to the constructor.
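A minimal sketch of that arithmetic, using the default constants named in the HashMap source:

public class ResizeThresholdDemo {
    public static void main(String[] args) {
        int defaultInitialCapacity = 16;  // DEFAULT_INITIAL_CAPACITY
        float defaultLoadFactor = 0.75f;  // DEFAULT_LOAD_FACTOR

        // The map grows once its size exceeds capacity * load factor
        int threshold = (int) (defaultInitialCapacity * defaultLoadFactor);
        System.out.println(threshold);    // 12
    }
}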