Why can't you add an int and a char in some cases?

Front End · unresolved · 1 answer · 1859 views

Asked by 渐次进展 on 2021-02-01 12:46

Why does

char ch = '4';
ch = '4' + 2;

work, but

char ch = '4';
ch = ch + 2;

doesn't?

1 Answer

    傲寒 · 2021-02-01 13:29

    To understand this, let's consider what the compiler does at each step in both cases. Let's start with:

    ch = '4' + 2;
    

    The compiler promotes the char literal '4' to an int; its value is 52. So the statement becomes

    ch = 52 + 2;
    

    Which the compiler then turns into

    ch = 54;
    

    ch is a char. Because '4' + 2 is a compile-time constant expression whose value, 54, fits in the range of char, the compiler is allowed to narrow it implicitly: it can prove there is no loss in the conversion (JLS §5.2 permits this narrowing in assignment contexts for constant expressions).
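
    To see that rule in isolation, here is a minimal sketch (the class name ConstantNarrowingDemo is purely illustrative): a constant expression that fits in char is narrowed silently, while one that does not fit is rejected outright.

    class ConstantNarrowingDemo {
        public static void main(String[] args) {
            char ok = '4' + 2;       // compiles: constant expression, value 54 fits in char
            System.out.println(ok);  // prints 6, since 54 is the code point of '6'
            // char bad = 65536;     // would NOT compile: constant exceeds char's range (0..65535)
        }
    }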

    Now let's consider the second version:

    ch = ch  + 2;
    

    The value of ch is not known at compile time, so this effectively becomes

    ch = ((int) ch) + 2;
    

    Now the compiler cannot prove that the result, an int, fits in the range of a char. So it will not narrow automatically, and it reports a compile-time error.
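
    The usual ways around this, sketched below with an illustrative class name: either cast explicitly, which tells the compiler you accept possible truncation, or use compound assignment, which the language defines as including an implicit cast back to the variable's type (JLS §15.26.2).

    class NarrowingFixDemo {
        public static void main(String[] args) {
            char ch = '4';
            ch = (char) (ch + 2);    // explicit cast: you take responsibility for any truncation
            System.out.println(ch);  // prints 6

            ch = '4';
            ch += 2;                 // also compiles: ch += 2 means ch = (char) (ch + 2)
            System.out.println(ch);  // prints 6
        }
    }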

    EDIT1:

    If the compiler can prove that the variable will never change and that its value can be inlined, then the second form can be turned into the first. Subir pointed out that adding final makes this possible: a final char initialized with a constant is itself treated as a constant expression. A compiler performing change analysis would, in principle, be capable of figuring this out without the final keyword, but final makes it easier both for the compiler and for readers of the code.
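
    A minimal sketch of that point (the class name is illustrative, and a second variable is used since a final char cannot be reassigned): once the variable is final and initialized with a constant, Java treats it as a constant variable, so expressions built from it are constant expressions and may be narrowed implicitly.

    class FinalConstantDemo {
        public static void main(String[] args) {
            final char base = '4';   // constant variable: final and initialized with a constant
            char ch = base + 2;      // compiles: base + 2 is a constant expression, value 54 fits in char
            System.out.println(ch);  // prints 6
        }
    }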

    EDIT2:

    Narrowing of int to char is covered in the Java Language Specification; the link was kindly provided by Jon Skeet.
