Here's an interesting one. I'm writing an AES encryption algorithm, and have managed to get it producing accurate encryptions. The trouble comes when I attempt to write the r
std::string is nothing more or less than a specialization of the std::basic_string<> template, so you can simply do a

typedef std::basic_string<unsigned char> ustring;

to get what you want.
Note that the C/C++ standards do not define whether char is the signed or the unsigned variety, so any program that casts a char directly to a larger type invokes implementation-defined behaviour.
std::string is really just a typedef, something like:

namespace std {
    typedef basic_string<char> string;
}

It's fairly easy to create a variant for unsigned char:

typedef basic_string<unsigned char> ustring;

You will, however, have to change your code to use a ustring (or whatever name you prefer) instead of std::string.
Depending on how you've written your code, that may not require editing all of it though. In particular, if you have something like:

namespace crypto {
    using std::string;

    class AES {
        string data;
        // ...
    };
}
You can change the string type by changing only the using declaration:

namespace unsigned_types {
    typedef std::basic_string<unsigned char> string;
}

// ...

namespace crypto {
    using unsigned_types::string;

    class AES {
        string data;
    };
}
Also note that different instantiations of a template are entirely separate types, even when the types over which they're instantiated are related, so the fact that you can convert implicitly between char and unsigned char doesn't mean you'll get a matching implicit conversion between basic_string<char> and basic_string<unsigned char>.
Cast your value to unsigned char first:

char input = 250; // just an example
unsigned int n = static_cast<unsigned char>(input); // NOT: "unsigned int n = input;"
//               ^^^^^^^^^^^^^^^^^^^^^^^^^^
The problem is that your char happens to be signed, and so its value is not the "byte value" that you want; you have to convert to unsigned char to get that.