Put simply, Java is willing to implicitly convert a char to an int. A char is a 16-bit Unicode (UTF-16) code unit, and the widening conversion simply gives you that numeric value as an int. If the input were 'A', you would get 65 as your output.
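For example, here's a quick sketch of what that looks like (class name is just for illustration):

    public class CharWidening {
        public static void main(String[] args) {
            char c = 'A';
            int i = c;              // implicit char-to-int widening, no cast needed
            System.out.println(i);  // prints 65, the UTF-16 code unit for 'A'
        }
    }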
It's arguable (by me!) that characters and integers are sufficiently different that the language shouldn't be so casual about mixing them, since from time to time it leads to surprising behaviour.
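Here's the sort of thing I mean (my own example): arithmetic promotes char operands to int, so the result is a number rather than a character.

    public class CharSurprise {
        public static void main(String[] args) {
            System.out.println('A' + 1);          // prints 66, not "B"
            System.out.println('a' + 'b');        // prints 195, not "ab"
            System.out.println((char) ('A' + 1)); // prints B once you cast back
        }
    }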
If you want chapter and verse, look at Sec 5.1.2 here:
https://docs.oracle.com/javase/specs/jls/se7/html/jls-5.html