For some reason, I am simply not understanding (or seeing) why this works:
UInt32 a = 0x000000FF; a &= ~(UInt32)0x00000001;
but this does not: UInt16 b = 0x00FF; b &= ~(UInt16)0x0001; (it fails to compile)
The bitwise negation promotes its operand to an int, despite the cast, so ~(UInt16)0x0001 produces an int (value -2) that cannot be implicitly converted back to UInt16. You can overcome this by AND-ing the result of the negation with the lower 16 bits, like this: ~0x0001 & 0xFFFF. That constant (0xFFFE) fits in a UInt16, so the compound assignment compiles.
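A minimal sketch of both cases, assuming C#; the variable names and the printed output are illustrative, not taken from the original post:

using System;

class Program
{
    static void Main()
    {
        UInt32 a = 0x000000FF;
        a &= ~(UInt32)0x00000001;   // fine: ~ applied to a uint yields a uint
        Console.WriteLine(a);        // prints 254

        UInt16 b = 0x00FF;
        // b &= ~(UInt16)0x0001;     // compile error: the operand is promoted to int,
        //                           // and the int result (-2) will not implicitly
        //                           // narrow back to UInt16
        b &= ~0x0001 & 0xFFFF;       // ok: the constant 0xFFFE fits in a UInt16
        Console.WriteLine(b);        // prints 254
    }
}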