Question
I am porting an application from CentOS 6 to Mac OS X. It depends on iconv and works normally on CentOS; however, on Mac OS X it doesn't. I see the following behavior:
#include <iconv.h>

const char *codePages[] = { "MAC", "LATIN1", "ISO_8859-1", "WINDOWS-1252", "ASCII" };
int codePagesCount = 5;

iconv_t converter1 = iconv_open("UTF-32", codePages[0]); // Works
if (converter1 != (iconv_t)-1)
    iconv_close(converter1);

iconv_t converter2 = iconv_open("UTF−32", "MAC"); // Fails, returns -1
if (converter2 != (iconv_t)-1)
    iconv_close(converter2);
This piece of code looks trivial: the first iconv_open creates a converter and takes the code page name from the codePages array; its zeroth element is "MAC", so it seems logical to me that Mac OS X must support conversion from its own code page to Unicode. And indeed the first call to iconv_open works. The second call to iconv_open appears to do the same thing: it also creates a converter from the Mac encoding to Unicode. Yet for some reason it fails and returns -1. What could be the reason that a call to the same function with the same arguments (one taken from a hardcoded array, the other a hardcoded string) works the first time and fails the second time?
Answer 1:
The second "UTF-32" is not the same as the first: the first one uses a plain minus sign whereas the second one uses an endash I guess.
Source: https://stackoverflow.com/questions/12726924/iconv-library-on-mac-os-x-strange-behavior