Just now I read \"char is the only unsigned integral primitive type in Java.\" Does this mean the char is one of the integral types in Java?
Same as in C, recently I
In memory, essentially everything is integral... but yes. char is an integer type in C, C++ and Java.
Yes, a char type can be categorized as an integer:
int i = 49;
int k = 36;
System.out.print((char)i + (char)k + "$"); // you might expect 1$$ ('1', '$', then "$"), but it prints 85$
Here the + operator promotes both chars back to int, so the sum 49 + 36 = 85 is computed before the string concatenation with "$".
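If character output was the intent, one minimal variation is to start the expression with a String, so every + is string concatenation rather than integer addition:
int i = 49;
int k = 36;
// the leading "" forces string concatenation, so the chars are not promoted to int
System.out.print("" + (char) i + (char) k + "$"); // prints 1$$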
Yes, a char is an integral type in all the popular languages in which it appears. "Integral" means that its spectrum is discrete and the smallest difference between any two distinct values is 1. The required range of supported values is usually quite small compared to that of other integral types. Computer hardware traditionally treats integers as the fundamental data type; by contrast, arithmetic floating-point types are a more recent and more complicated addition.
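A quick Java sketch of that discreteness: chars can be stepped through one value at a time, just like ints:
// chars form a discrete sequence with step 1, so they work as loop counters
for (char c = 'a'; c <= 'e'; c++) {
    System.out.print(c); // prints abcde
}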
Char is an "Integer Data Type" in C and its related progeny. As per the K&R Book, "By definition, chars are just small integers". They are used for storing 'Character' data.
The ASCII Table lists 128 characters, and each text character corresponds to an integer value.
Char Data Type is a 1 Byte (8 Bits). Therefore, they can store upto 2^8 = 256 different integers. (These 256 different integers correspond to different ASCII or UTF characters.)
For example:
As per ASCII standard, the letter "x" is stored as 01111000 (decimal 120).
for e.g., you can Add a value in a char variable, just like any integer!
#include <stdio.h>

int main(void)
{
    char a = 'x';   // 'x' has ASCII value 120
    int b = 3 + a;  // the char is promoted to int in the addition
    printf("value is:\n%d", b);
    return 0;
}
Output: 123 (3 + 120 for the char 'x').
That is, chars are just small numbers: 0 to 255 when unsigned (whether plain char is signed or unsigned is implementation-defined in C).
Similar to @aioobe's answer:
int n = 5;
char ch = (char) ('0' + n); // '5' — the cast must wrap the whole sum, since '0' + n is an int
or the reverse
char ch = '9';
int i = ch - '0'; // 9
or the unusual
char ch = 'a';
ch += 1; // 'b';
or the odd
char ch = '0';
ch *= 1.1; // '4', as (char) (48 * 1.1) == '4'
ch %= 16; // '\u0004'
int i = ch; // 4
(The compound assignments compile because they include an implicit cast back to char: ch *= 1.1 means ch = (char) (ch * 1.1).)
BTW: From String.hashCode() (an older JDK version, where String still had offset and count fields):
int h = 0;
int off = offset;
char val[] = value;
int len = count;
for (int i = 0; i < len; i++) {
    h = 31*h + val[off++]; // each char is used directly as an int
}
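A self-contained sketch of the same computation, using the documented formula s[0]*31^(n-1) + s[1]*31^(n-2) + ... + s[n-1]:
String s = "hi";
int h = 0;
for (int i = 0; i < s.length(); i++) {
    h = 31 * h + s.charAt(i); // the char is promoted to int here
}
System.out.println(h == s.hashCode()); // true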
According to the Java Primitive Data Types tutorial:
char: The char data type is a single 16-bit Unicode character. It has a minimum value of '\u0000' (or 0) and a maximum value of '\uffff' (or 65,535 inclusive).
So yes, it is a 16-bit unsigned integer. Whether you use the type to represent a number or a character is up to you...
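One consequence of being unsigned is that char arithmetic wraps around at the top of the range instead of going negative, for example:
char c = Character.MAX_VALUE; // '\uffff', i.e. 65535
c++;                          // wraps around to '\u0000'
System.out.println((int) c);  // prints 0 — a char value is never negative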
The char type is guaranteed to be 16 bits in Java; the only restriction C imposes is that the type must be at least 8 bits. According to the C spec reference from this answer:
number of bits for smallest object that is not a bit-field (byte)
CHAR_BIT 8
(the listed values are minimums, not maximums). So a char in C does not necessarily represent the same range of integer values as a char in Java.
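On the Java side, the fixed width can be checked directly (Character.BYTES requires Java 8 or later):
System.out.println(Character.SIZE);  // 16 — on every platform
System.out.println(Character.BYTES); // 2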