Best way to convert an ASCII digit to its numeric value
**Question**

Let's say I have an ASCII character representing a decimal digit:

```c
char digit; // between 0x30 ('0') and 0x39 ('9') inclusive
```

I want to get the numeric value it represents (between 0 and 9). I am aware of two possible methods:

subtraction:

```c
uint8_t value = digit - '0';
```

bitwise AND:

```c
uint8_t value = digit & 0x0f;
```

Which one is the most efficient in terms of compiled code size? Execution time? Energy consumption? As the answer may be platform-specific, I am most interested about the kind of