Apple's String Format Specifiers document claims:

> The format specifiers supported by the NSString formatting methods and CFString formatting functions …
On Mac OS X, `<machine/_types.h>` defines `wchar_t` as `int`, so it's four bytes (32 bits) on all currently supported architectures.
As you note, the `printf(3)` manpage defines `%S` as equivalent to `%ls`, which takes a pointer to some `wchar_t` characters (`wchar_t *`).
The Cocoa documentation you linked to (and its CF equivalent), however, does define `%S` separately:

> `%S`: Null-terminated array of **16-bit** Unicode characters

Emphasis added. Also, the same goes for `%C`.
So, this is not a bug. CF and Cocoa interpret `%S` and `%C` differently from how `printf` and its cousins interpret them: CF and Cocoa treat the character(s) as UTF-16, whereas `printf` (presumably) treats them as UTF-32.
The CF/Cocoa interpretation is more useful when working with Core Services, as some APIs (such as the File Manager) will hand you text as an array of `UniChar`s, not a CFString; as long as you null-terminate that array, you can use it with `%S` to print the string.