In C++, sizeof('a') == sizeof(char) == 1. This makes intuitive sense, since 'a' is a character literal and sizeof(char) == 1 as defined by the standard.
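A minimal sketch of that C++ behaviour, assuming a C++11 (or later) compiler so that static_assert is available:

    // Compiled as C++, both assertions hold: 'a' has type char,
    // and sizeof(char) is 1 by definition.
    static_assert(sizeof('a') == sizeof(char), "'a' has type char in C++");
    static_assert(sizeof(char) == 1, "sizeof(char) is 1 by definition");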
I haven't seen a rationale for it (C character literals having type int), but here's something Stroustrup had to say about it (from Design and Evolution 11.2.1 - Fine-Grain Resolution):
In C, the type of a character literal such as 'a' is int. Surprisingly, giving 'a' type char in C++ doesn't cause any compatibility problems. Except for the pathological example sizeof('a'), every construct that can be expressed in both C and C++ gives the same result.
So for the most part, it should cause no problems.
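To see the one pathological case Stroustrup mentions, here is a sketch of a single source file that compiles as both C and C++; the exact value printed under C is implementation-defined (commonly 4, since it is sizeof(int)):

    #include <stdio.h>

    int main(void) {
        /* Compiled as C++: prints 1, because 'a' has type char.
           Compiled as C:   prints sizeof(int), commonly 4, because 'a' has type int. */
        printf("sizeof('a') = %zu\n", sizeof('a'));
        return 0;
    }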