I'm running the following code in the viewDidLoad function of a vanilla iPad single view app:
/*
 * Print the string. A lot.
 */
for (int i = 0; i < 300; i++) {
    NSLog(@"⊢ ⊣ ⊥ ⊻ ⊼ ⊂ ⊃ ⊑ ⊒ \n");
}
Every now and then, some of the logged characters show up in the Xcode console as escaped octal bytes instead of the expected symbols.
Short version:
I think this happens when a multi-byte UTF-8 sequence in the NSLog() output happens to fall on a buffer boundary of the pseudo-terminal that Xcode uses for the standard error of the debugged process.
If my assumption is correct, this is only a problem of the Xcode debugger output and does not indicate any Unicode problem in the application itself.
Long version:
If you run your app in the simulator, lsof -p shows that standard error (file descriptor 2) is redirected to a pseudo-terminal:
# lsof -p 3251
...
testplay 3251 martin 2w CHR 16,2 0t131 905 /dev/ttys002
...
And lsof -p shows that Xcode has the same pseudo-terminal open:
# lsof -p 3202
...
Xcode 3202 martin 51u CHR 16,2 0t0 905 /dev/ttys002
...
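As a side note, one can also confirm from inside the running app that its standard error is connected to a terminal device. A minimal sketch, using the standard isatty() and ttyname() functions (this was not part of the original lsof session):

#include <unistd.h>   // isatty(), ttyname()

// Somewhere early in the app, e.g. in viewDidLoad:
if (isatty(STDERR_FILENO)) {
    // Under the Xcode debugger this typically reports the same
    // pseudo-terminal that lsof shows, e.g. /dev/ttys002.
    NSLog(@"stderr is a terminal: %s", ttyname(STDERR_FILENO));
} else {
    NSLog(@"stderr is not a terminal");
}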
NSLog() writes to standard error. With the system call tracer "dtruss" one can see that Xcode reads the log message from the pseudo-terminal. For a single log message
NSLog(@"⊢ ⊣ ⊥ ⊻ ⊼ ⊂ ⊃ ⊑ ⊒ \n");
it looks like this:
# dtruss -n Xcode -t read_nocancel
3202/0xe101: read_nocancel(0x31, "2013-02-05 08:57:44.744 testplay[3251:11303] \342\212\242 \342\212\243 ... \342\212\222 \n\0", 0x8000) = 82 0
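The octal escapes are simply the UTF-8 bytes of the logged characters; each of the math symbols is a three-byte sequence. A small throwaway snippet (not part of the original trace) that dumps the string the same way dtruss prints it:

NSString *s = @"⊢ ⊣ ⊥ ⊻ ⊼ ⊂ ⊃ ⊑ ⊒ \n";
const unsigned char *bytes = (const unsigned char *)[s UTF8String];
NSMutableString *dump = [NSMutableString string];
for (NSUInteger i = 0; bytes[i] != '\0'; i++) {
    // Print each byte as a three-digit octal escape, like dtruss does.
    [dump appendFormat:@"\\%03o", (unsigned int)bytes[i]];
}
// "⊢" (U+22A2) comes out as \342\212\242, matching the read_nocancel output above.
NSLog(@"%@", dump);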
But when many NSLog() statements follow each other rapidly, sometimes the following happens:
# dtruss -n Xcode -t read_nocancel
...
3202/0xd828: read_nocancel(0x33, "2013-02-05 08:39:51.156 ...", 0x8000) = 1024 0
3202/0xd87b: read_nocancel(0x33, "\212\273 \342\212\274 ...", 0x8000) = 24 0
As you can see, Xcode has read 1024 bytes from the pseudo-terminal, and the next read starts with an incomplete UTF-8 sequence. In this case, Xcode "does not see" that the last byte of the first read and the first two bytes of the second read are parts of the same UTF-8 sequence. I assume that Xcode treats all 3 bytes as invalid UTF-8 sequences and prints them as octal numbers.
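That behaviour is easy to reproduce in isolation. Here is a minimal sketch, under the assumption that Xcode decodes each read() buffer on its own (which is what the trace above suggests): split a UTF-8 byte sequence in the middle of a character and try to decode the two halves separately:

// "⊻" (U+22BB) encodes as the three bytes 0xE2 0x8A 0xBB, i.e. \342 \212 \273.
NSData *bytes = [@"a⊻" dataUsingEncoding:NSUTF8StringEncoding];
const char *p = bytes.bytes;

// Pretend the buffer ended after the second byte, so the character
// is split across two reads – as in the dtruss output above.
NSString *first  = [[NSString alloc] initWithBytes:p
                                            length:2                  // 'a' + 0xE2
                                          encoding:NSUTF8StringEncoding];
NSString *second = [[NSString alloc] initWithBytes:p + 2
                                            length:bytes.length - 2   // 0x8A 0xBB
                                          encoding:NSUTF8StringEncoding];

// Both results are nil: each half is invalid UTF-8 on its own, which is
// presumably why Xcode falls back to printing the raw bytes as octal escapes.
NSLog(@"first = %@, second = %@", first, second);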