I am working with Apple's ScriptingBridge framework, and have generated a header file for iTunes that contains several enums like this:
As already stated, those are integers declared using character constants.
When an integer is declared using a character constant of more than one character, its value is sensitive to the byte order of the machine for which the constant was written. Since all the original Mac APIs were developed on PPC or earlier machines, those codes are byte-reversed with respect to little-endian Intel machines.
If you are only building for Intel you can just reverse the order by hand.
If you are building a Universal binary you need to use a flipping function such as CFSwapInt32BigToHost.
Failure to correct those codes will leave you with code that could only work on PowerPC machines, regardless of the lack of compiler errors.
The single quotes indicate character constants, rather than strings, in C. So each of the enums will have a 32-bit value consisting of the character codes of the four characters. The actual value will depend on the character encoding, but I am assuming 8-bit characters. Note that there is no appended '\0'.
You can use the enums for normal comparison/assignment purposes. As with any enum the underlying type is integer.
I've used this technique in embedded systems many times to create 4 character 'names' that were human readable in hex dump/debugger contexts.
That is an Apple extension to C; it basically translates those enums to:
typedef enum {
    iTunesESrcLibrary = 'k' << 24 | 'L' << 16 | 'i' << 8 | 'b',
    ...
}
EDIT: Sorry, apparently it's valid C. I've only seen them in Mac code, so I wrongly assumed it was Apple-specific.
C99, TC3 reads:
6.4.4.4 §2:
An integer character constant is a sequence of one or more multibyte characters enclosed in single-quotes, as in 'x'. [...]
6.4.4.4 §10:
An integer character constant has type int. The value of an integer character constant containing a single character that maps to a single-byte execution character is the numerical value of the representation of the mapped character interpreted as an integer. The value of an integer character constant containing more than one character (e.g., 'ab'), or containing a character or escape sequence that does not map to a single-byte execution character, is implementation-defined. If an integer character constant contains a single character or escape sequence, its value is the one that results when an object with type char whose value is that of the single character or escape sequence is converted to type int.
In most implementations, it's safe to use integer character constants of up to four one-byte characters. The actual value might differ between systems (endianness?), though.
This is actually already defined in the ANSI-C89 standard, section 3.1.3.4:
An integer character constant is a sequence of one or more multibyte characters enclosed in single-quotes, as in 'x' or 'ab'. [...]
An integer character constant has type int. The value of an integer character constant containing a single character that maps into a member of the basic execution character set is the numerical value of the representation of the mapped character interpreted as an integer. The value of an integer character constant containing more than one character, or containing a character or escape sequence not represented in the basic execution character set, is implementation-defined. In particular, in an implementation in which type char has the same range of values as signed char, the high-order bit position of a single-character integer character constant is treated as a sign bit.