How do I convert a char to an int in C and C++?
int charToInt(char a)
{
    char buf[2] = { a, '\0' };  /* atoi requires a null-terminated string */
    return atoi(buf);
}
You can use the atoi function to convert a char to an int, but note that atoi expects a null-terminated string, so the character has to be wrapped in a two-element buffer first (passing a pointer to a lone char is undefined behavior). For more information, you can refer to http://www.cplusplus.com/reference/cstdlib/atoi/ and http://www.cplusplus.com/reference/string/stoi/.
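For example, a quick check of the function above (note that atoi returns 0 when it cannot convert, so a failure is indistinguishable from the digit '0'):

#include <stdio.h>

/* Assumes the charToInt definition above is in scope. */
int main(void)
{
    printf("%d\n", charToInt('7'));  /* prints 7 */
    printf("%d\n", charToInt('x'));  /* prints 0: atoi could not convert */
    return 0;
}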
(This answer addresses the C++ side of things, but the sign extension problem exists in C too.)
Handling all three char types (signed, unsigned, and char) is more delicate than it first appears. Values in the range 0 to SCHAR_MAX (which is 127 for an 8-bit char) are easy:
char c = somevalue;
signed char sc = c;
unsigned char uc = c;
int n = c;
But when somevalue is outside of that range, only going through unsigned char gives you consistent results for the "same" char values in all three types:
char c = somevalue;
signed char sc = c;
unsigned char uc = c;
// Might not be true: int(c) == int(sc) and int(c) == int(uc).
int nc = (unsigned char)c;
int nsc = (unsigned char)sc;
int nuc = (unsigned char)uc;
// Always true: nc == nsc and nc == nuc.
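To see the divergence concretely, here is a minimal sketch assuming an implementation where plain char is signed and 8 bits wide (common, but guaranteed by neither C nor C++):

#include <stdio.h>

int main(void)
{
    /* 0xE9 does not fit in a signed 8-bit char, so the stored value is
       implementation-defined; -23 is typical on two's-complement machines. */
    char c = '\xE9';
    int direct = c;                 /* sign-extends: -23 on such an implementation */
    int via_uc = (unsigned char)c;  /* always the byte value: 233 */
    printf("%d %d\n", direct, via_uc);
    return 0;
}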
This is important when using functions from ctype.h, such as isupper or toupper, because of sign extension:
char c = negative_char; // Assuming CHAR_MIN < 0.
int n = c;
bool b = isupper(n); // Undefined behavior.
Note the conversion through int is implicit; this has the same UB:
char c = negative_char;
bool b = isupper(c);
To fix this, go through unsigned char, which is easily done by wrapping ctype.h functions through safe_ctype:
template<int (&F)(int)>
int safe_ctype(unsigned char c) { return F(c); }
//...
char c = CHAR_MIN;
bool b = safe_ctype<isupper>(c); // No UB.
std::string s = "value that may contain negative chars; e.g. user input";
std::transform(s.begin(), s.end(), s.begin(), &safe_ctype<toupper>);
// Must wrap toupper to eliminate UB in this case; you can't cast
// to unsigned char because the function is called inside transform.
This works because any function taking any of the three char types can also take the other two char types. It leads to two functions which can handle any of the types:
int ord(char c) { return (unsigned char)c; }

char chr(int n)
{
    assert(0 <= n);  // Or other error-/sanity-checking.
    assert(n <= UCHAR_MAX);
    return (unsigned char)n;
}
// Ord and chr are named to match similar functions in other languages
// and libraries.
ord(c) always gives you a non-negative value, even when passed a negative char or negative signed char, and chr takes any value ord produces and gives back the exact same char.
In practice, I would probably just cast through unsigned char instead of using these, but they do succinctly wrap the cast, provide a convenient place to add error checking for int-to-char, and would be shorter and clearer when you need to use them several times in close proximity.
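For instance, the direct cast mentioned here would look like this (upper_of is a name invented for this sketch; in C++ the same cast works with std::toupper from <cctype>):

#include <ctype.h>

char upper_of(char c)
{
    /* Cast to unsigned char first to avoid sign extension,
       then narrow toupper's int result back down to char. */
    return (char)toupper((unsigned char)c);
}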
Well, in the ASCII table, the digit characters start at 48. All you need to do is:
int x = (int)character - 48;
Or, since the character '0' has the ASCII code of 48, you can just write:
int x = character - '0'; // The (int) cast is not necessary.
I have absolutely null skills in C, but for simple parsing:
const char* something = "123456";
int number = parseInt(something);
...this worked for me:
#include <string.h>

/* Multiplies x by 10, y times (i.e., x * 10^y); the name is kept from
   the original, but note it is not a general power function. */
int powInt(int x, int y)
{
    for (int i = 0; i < y; i++)
    {
        x *= 10;
    }
    return x;
}

/* Parses an unsigned decimal string. No sign, whitespace, non-digit,
   or overflow handling. */
int parseInt(const char* chars)
{
    int sum = 0;
    int len = (int)strlen(chars);
    for (int x = 0; x < len; x++)
    {
        /* Take digits from the least significant end and scale
           each one by its place value. */
        int n = chars[len - (x + 1)] - '0';
        sum = sum + powInt(n, x);
    }
    return sum;
}
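For reference, the standard library already provides this parsing: strtol from stdlib.h also handles a leading sign and whitespace, and reports out-of-range values via errno. A minimal sketch:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *something = "123456";
    long number = strtol(something, NULL, 10);  /* base 10 */
    printf("%ld\n", number);                    /* prints 123456 */
    return 0;
}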
Depends on what you want to do:
to read the value as an ASCII code, you can write
char a = 'a';
int ia = (int)a;
/* note that the int cast is not necessary -- int ia = a would suffice */
to convert the character '0' -> 0, '1' -> 1, etc., you can write
char a = '4';
int ia = a - '0';
/* check here if ia is bounded by 0 and 9 */
Explanation: a - '0' is equivalent to ((int)a) - ((int)'0'), which means the ASCII values of the characters are subtracted from each other. Since '0' comes directly before '1' in the ASCII table (and so on up to '9'), the difference between the two gives the number that the character a represents.
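The bounds check suggested in the comment above can be written with isdigit from ctype.h; a minimal sketch (digit_value is just an illustrative name, and the unsigned char cast matters for the sign-extension reasons covered in an earlier answer):

#include <ctype.h>

/* Returns the value of a decimal digit character, or -1 otherwise. */
int digit_value(char c)
{
    if (isdigit((unsigned char)c))
        return c - '0';
    return -1;
}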
I was having problems converting a char array like "7c7c7d7d7d7d7c7c7c7d7d7d7d7c7c7c7c7c7c7d7d7c7c7c7c7d7c7d7d7d7c7c2e2e2e" into the actual integer values it encodes, where each pair of characters such as "7c" represents one hexadecimal byte value (0x7C). So, after cruising for help, I created this and thought it would be cool to share. It separates the char string into the integers it represents, and may be helpful to more people than just me ;)
#include <stdlib.h>

/* Converts a string of hex digit pairs ("7c7c...") into an array of
   len/2 unsigned ints, one per pair. len is assumed to be even, and
   the caller must free() the result. */
unsigned int* char2int(const char *a, int len)
{
    int i, u;
    unsigned int *val = malloc((len / 2) * sizeof(unsigned int));
    for (i = 0, u = 0; i < len; i++) {
        int digit;
        if (a[i] <= '9')
            digit = a[i] - '0';       /* '0'..'9' */
        else if (a[i] <= 'F')
            digit = a[i] - 'A' + 10;  /* 'A'..'F' */
        else
            digit = a[i] - 'a' + 10;  /* 'a'..'f' */
        if (i % 2 == 0)
            val[u] = digit << 4;      /* high nibble of the pair */
        else
            val[u++] += digit;        /* low nibble; move to the next pair */
    }
    return val;
}
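A quick usage sketch (the hex string here is just an illustration, and the returned buffer must be freed by the caller):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Assumes the char2int definition above is in scope. */
int main(void)
{
    const char *hex = "7c7c7d2e";
    int len = (int)strlen(hex);
    unsigned int *vals = char2int(hex, len);
    for (int i = 0; i < len / 2; i++)
        printf("%02X ", vals[i]);  /* prints 7C 7C 7D 2E */
    printf("\n");
    free(vals);
    return 0;
}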
Hope it helps!