Am I missing something painfully obvious? Or does just nobody in the world actually use java.util.BitSet?
The following test fails:
@Test
public voi
// Abhay Dandekar
import java.util.BitSet;

public class TestBitSet {
    public static void main(String[] args) {
        BitSet bitSet = new BitSet();
        System.out.println("State 0 : " + bitSet.size() + " : " + bitSet.length());
        bitSet.set(0, true);
        bitSet.set(1, true);
        System.out.println("State 1 : " + bitSet.size() + " : " + bitSet.length());
        bitSet.set(2, false);
        bitSet.set(3, false);
        System.out.println("State 2 : " + bitSet.size() + " : " + bitSet.length());
        bitSet.set(4, true);
        System.out.println("State 3 : " + bitSet.size() + " : " + bitSet.length());
    }
}
A simple Java program to show what happens inside. Some points to note:
BitSet is backed by an array of longs, so size() grows in 64-bit increments
All bits default to false
length() returns the index of the highest set ("true") bit, plus one
The output below (printed as size : length) should explain itself:
State 0 : 64 : 0
State 1 : 64 : 2
State 2 : 64 : 2
State 3 : 64 : 5
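To make the size/length distinction concrete, here is a minimal sketch (my own example, not from the original post) showing that size() jumps by a whole 64-bit word once a bit beyond index 63 is set, while length() tracks the highest set bit and cardinality() counts the set bits:

```java
import java.util.BitSet;

public class BitSetGrowth {
    public static void main(String[] args) {
        BitSet bits = new BitSet();
        bits.set(0);
        bits.set(4);
        // One 64-bit word is enough for bits 0..63
        System.out.println(bits.size());        // 64
        bits.set(64);
        // Setting bit 64 forces a second word, so size() doubles
        System.out.println(bits.size());        // 128
        // length() = highest set bit index + 1
        System.out.println(bits.length());      // 65
        // cardinality() = number of bits that are true
        System.out.println(bits.cardinality()); // 3
    }
}
```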
So, points to conclude:
Do not use length() to count the number of modified bits; use cardinality() to count the bits that are set
BitSet works well in scenarios like Bloom filters. More on Bloom filters can be googled .. ;)
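For the curious, here is a toy Bloom filter sketch on top of BitSet (my own illustration; the class name, the three-seed hashing, and the sizes are all made up for the example, and a real filter would use stronger, independent hash functions):

```java
import java.util.BitSet;

public class BloomSketch {
    private final BitSet bits;
    private final int size;

    public BloomSketch(int size) {
        this.size = size;
        this.bits = new BitSet(size);
    }

    // Simple seeded hash for illustration only
    private int index(String key, int seed) {
        int h = key.hashCode() * 31 + seed;
        return Math.floorMod(h, size);
    }

    public void add(String key) {
        // Set one bit per hash seed
        for (int seed = 0; seed < 3; seed++) {
            bits.set(index(key, seed));
        }
    }

    public boolean mightContain(String key) {
        for (int seed = 0; seed < 3; seed++) {
            if (!bits.get(index(key, seed))) {
                return false; // definitely absent
            }
        }
        return true; // possibly present (false positives are possible)
    }

    public static void main(String[] args) {
        BloomSketch filter = new BloomSketch(1024);
        filter.add("alice");
        filter.add("bob");
        System.out.println(filter.mightContain("alice")); // true
        System.out.println(filter.mightContain("bob"));   // true
    }
}
```

Note that a Bloom filter can say "maybe present" for a key that was never added, but never "absent" for one that was.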
Hope this helps
Regards,
Abhay Dandekar