Inconsistent behaviour of implicit conversion between unsigned and bigger signed types

Asked by 情话喂你 on 2020-12-31 08:29

Consider the following example:

#include <stdio.h>

int main(void)
{
    unsigned char a  = 15; /* one byte   */
    unsigned short b = 15; /* two bytes  */
    unsigned int c   = 15; /* four bytes */

    long x = -a, y = -b, z = -c; /* eight bytes each */
    printf("%ld %ld %ld\n", x, y, z); /* typically prints: -15 -15 4294967281 */
    return 0;
}

Why do x and y come out as -15, while z ends up as a huge positive number?
5 Answers

说谎 (OP) · 2020-12-31 08:45
    int is special: everything with a rank smaller than int is promoted to int (the "integer promotions") before arithmetic is performed.

    Thus -a and -b apply unary minus to int values of 15, which simply produces the int value -15. That value is then converted to long, preserving -15.
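
    A quick way to see the promotions in action is C11's _Generic, which selects a branch based on the type of its controlling expression. This is a minimal sketch, not part of the original answer:

        #include <stdio.h>

        int main(void)
        {
            unsigned char a  = 15;
            unsigned short b = 15;

            /* After the integer promotions, -a and -b both have type int,
               so unary minus yields an ordinary int value of -15. */
            printf("-a has type %s\n",
                   _Generic(-a, int: "int", unsigned int: "unsigned int"));
            printf("-b has type %s\n",
                   _Generic(-b, int: "int", unsigned int: "unsigned int"));
            printf("-a = %d, -b = %d\n", -a, -b);
            return 0;
        }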

    -c is different. c is not promoted, because unsigned int is not smaller than int. Unary minus applied to an unsigned int value k yields another unsigned int, computed modulo 2^N as 2^N - k (where N is the width of unsigned int in bits). For k = 15 and N = 32, that is 4294967281.

    That large unsigned int value is then converted to long in the usual way; since long is wider here, the value 4294967281 is preserved.
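
    To make the modular arithmetic concrete, here is a sketch assuming 32-bit unsigned int and 64-bit long (typical on 64-bit Linux; not part of the original answer):

        #include <limits.h>
        #include <stdio.h>

        int main(void)
        {
            unsigned int c = 15;

            /* Unary minus on unsigned int wraps modulo 2^N:
               -c == UINT_MAX - c + 1 == 2^32 - 15 == 4294967281 when N == 32. */
            unsigned int negated = -c;
            printf("-c as unsigned int: %u\n", negated);

            /* Converting to a wider signed type preserves the value,
               so z holds 4294967281 rather than -15. */
            long z = negated;
            printf("z = %ld\n", z);
            return 0;
        }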
