Error when trying to perform a bitwise not (~) on a UInt16 in C#

醉酒成梦 · asked 2021-01-26 05:52

For some reason, I am simply not understanding (or seeing) why this works:

UInt32 a = 0x000000FF;
a &= ~(UInt32)0x00000001;

but this does not:

UInt16 a = 0x00FF;
a &= ~(UInt16)0x0001;   // compile error: cannot implicitly convert type 'int' to 'ushort'

3 Answers
  •  执笔经年 · 2021-01-26 06:08

    C# defines no ~ operator for UInt16, so the operand is promoted to int and the result of the negation is an int, despite the cast. You can work around this by masking the negated value down to the low 16 bits, like this: ~0x0001 & 0xFFFF. That constant (0xFFFE) fits in a UInt16, so the compound assignment compiles.
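
    A minimal, runnable sketch of the failing line and two working alternatives (the Program/Main scaffolding is only there to make the snippet self-contained; variable names are illustrative):

    using System;

    class Program
    {
        static void Main()
        {
            UInt16 a = 0x00FF;

            // a &= ~(UInt16)0x0001;     // compile error: ~ yields an int (-2), which has
            //                           // no implicit conversion back to ushort

            a &= ~0x0001 & 0xFFFF;       // OK: the masked constant 0xFFFE fits in a UInt16
            Console.WriteLine(a.ToString("X4"));   // 00FE

            a = (UInt16)(a & ~0x0001);   // equivalent: cast the whole expression back to UInt16
            Console.WriteLine(a.ToString("X4"));   // 00FE
        }
    }

    The cast in the last statement is the other common fix: do the bitwise work in int and convert the final result back to UInt16 yourself.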
