Java: Subtract '0' from char to get an int… why does this work?

伪装坚强ぢ 2020-11-27 05:04

This works fine:

int foo = bar.charAt(1) - '0';

Yet this doesn't, because bar.charAt(x) returns a char:

int foo = bar.charAt(1);
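
For illustration, a minimal self-contained sketch of the difference (the class name and the string "12345" are made up for this example):

    public class DigitDemo {
        public static void main(String[] args) {
            String bar = "12345";
            char c = bar.charAt(1);    // the char '2', whose code point is 50
            int asCode = c;            // widening only: asCode == 50, not 2
            int asDigit = c - '0';     // 50 - 48 == 2, the digit value
            System.out.println(asCode + " " + asDigit);   // prints "50 2"
        }
    }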


        
9 Answers
  • 2020-11-27 05:22

    Your code may compile without error and run without throwing an exception, but converting between chars and ints like this is bad practice. First, it makes the code confusing, leading to maintenance headaches down the road. Second, clever "tricks" can prevent the compiler from optimizing the bytecode. One of the best ways to get fast code is to write dumb code (i.e., not clever code).
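
    For what it's worth, a sketch of a more explicit alternative using the standard Character.getNumericValue method (class and variable names below are illustrative):

        public class ReadableDigit {
            public static void main(String[] args) {
                char c = '7';
                int viaTrick = c - '0';                         // works, but the intent is implicit
                int viaLibrary = Character.getNumericValue(c);  // same result, intent is explicit
                System.out.println(viaTrick + " " + viaLibrary);  // prints "7 7"
            }
        }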

  • 2020-11-27 05:27

    That's a clever trick. chars are actually the same size as shorts (16 bits). When you have a char that represents an ASCII/Unicode digit (like '1') and you subtract the lowest digit character ('0') from it, you are left with the digit's numeric value (here, 1).

    Because char is the same size as short (though unsigned), it can safely be widened to an int, and that conversion happens automatically whenever arithmetic is involved.
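
    A short sketch of that automatic promotion (names are illustrative):

        public class PromotionDemo {
            public static void main(String[] args) {
                char one = '1';           // code point 49
                char zero = '0';          // code point 48
                int value = one - zero;   // both operands are promoted to int, so 49 - 48 == 1
                System.out.println(value);   // prints 1
            }
        }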

  • 2020-11-27 05:28

    I will echo what @Mark Peters has said above in case people overlook his comment.

    To quote: "Don't make the mistake of thinking that '0' == 0. In reality, '0' == 48."
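
    A quick way to verify that point (illustrative snippet):

        public class ZeroCheck {
            public static void main(String[] args) {
                System.out.println((int) '0');   // prints 48
                System.out.println('0' == 0);    // prints false
                System.out.println('0' == 48);   // prints true
            }
        }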
