Why does the bitwise operation (~0) print -1? In binary, NOT 0 should be 1. Why?
In standard binary encoding, 0 is all 0s, and ~ is bitwise NOT, so ~0 is all 1s. In two's complement, which is how signed integers are (most often) represented, all 1s is -1. So for a signed byte type:
0xFF = -1 // 1111 1111
0xFE = -2 // 1111 1110
...
0x80 = -128 // 1000 0000
0x7F = 127 // 0111 1111
0x7E = 126 // 0111 1110
...
0x01 = 1 // 0000 0001
0x00 = 0 // 0000 0000
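For illustration, here is a minimal C sketch (assuming a two's-complement platform, which is effectively all modern hardware) showing ~0 and the general identity ~n == -n - 1:

    #include <stdio.h>

    int main(void) {
        int x = ~0;               /* flips every bit of 0 -> all 1s */
        printf("%d\n", x);        /* prints -1 on two's-complement machines */
        printf("%d\n", ~5);       /* ~n == -n - 1, so this prints -6 */

        signed char b = ~0;       /* 0xFF stored in a signed byte */
        printf("%d\n", b);        /* also prints -1 */
        return 0;
    }

The identity ~n == -n - 1 follows directly from two's complement, where -n is defined as ~n + 1.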