I need a file I/O library that can give my program a UTF-16 (little-endian) interface, but can handle files in other encodings, mainly ASCII (input only), UTF-8, UTF-16, UTF-32/UC
The problem you see comes from the linefeed conversion. Sadly, it is done at the byte level (after the code conversion) and is not aware of the encoding. In other words, you have to disable the automatic conversion (by opening the file in binary mode, with the "b" flag) and, if you want 0A 00 to be expanded to 0D 00 0A 00, you'll have to do it yourself.
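Doing that expansion yourself is simplest at the UTF-16 code-unit level, before the text is serialized to bytes. A minimal sketch (the helper name is my own, and it naively assumes the input uses bare LF line endings, so it would double an existing CR):

```cpp
#include <string>

// Expand bare LF (0A 00) to CR LF (0D 00 0A 00) at the UTF-16
// code-unit level, before the text is written out in binary mode.
std::u16string expand_newlines(const std::u16string& in) {
    std::u16string out;
    out.reserve(in.size());
    for (char16_t c : in) {
        if (c == u'\n') out += u'\r';  // insert CR before every LF
        out += c;
    }
    return out;
}
```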
You mention that you'd prefer a C++ wide-stream interface, so I'll outline what I did to achieve that in our software.
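One standard-library way to get a wide-string-to-UTF-16LE conversion is the `std::codecvt_utf16` facet (deprecated since C++17 but still shipped by mainstream standard libraries). A sketch, assuming you want the bytes without a BOM; the same facet can also be imbued into a binary-mode `std::wofstream`:

```cpp
#include <codecvt>  // deprecated in C++17, but still widely available
#include <locale>
#include <string>

// Convert a wide string to a UTF-16LE byte string: 2 bytes per BMP
// code point, least significant byte first, no BOM.
std::string to_utf16le(const std::wstring& ws) {
    std::wstring_convert<
        std::codecvt_utf16<wchar_t, 0x10ffff, std::little_endian>> conv;
    return conv.to_bytes(ws);
}
```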
You can try the iconv (libiconv) library.
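With iconv, the conversion goes through `iconv_open`/`iconv`/`iconv_close`. A sketch of converting UTF-8 bytes to UTF-16LE (the function name is mine, and the error handling is minimal; a real wrapper would loop on E2BIG and friends):

```cpp
#include <iconv.h>
#include <stdexcept>
#include <string>

// Convert a UTF-8 byte string to UTF-16LE bytes using iconv.
std::string utf8_to_utf16le(const std::string& in) {
    iconv_t cd = iconv_open("UTF-16LE", "UTF-8");
    if (cd == (iconv_t)-1) throw std::runtime_error("iconv_open failed");
    std::string out(in.size() * 4 + 4, '\0');   // generous output buffer
    char* inp = const_cast<char*>(in.data());   // glibc iconv takes char**
    size_t inleft = in.size();
    char* outp = &out[0];
    size_t outleft = out.size();
    size_t rc = iconv(cd, &inp, &inleft, &outp, &outleft);
    iconv_close(cd);
    if (rc == (size_t)-1) throw std::runtime_error("conversion failed");
    out.resize(out.size() - outleft);           // trim unused buffer space
    return out;
}
```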
I think the problems come from the 0D 0A 00 linebreaks. You could try whether other linebreaks, like \r\n or LF or CR alone, do work (best bet would be using \r, I suppose).
EDIT: It seems 0D 00 0A 00 is what you want, so you can try

std::wstring str = L"Hello World in UTF-16!\15\12Another line.\15\12"; // \15 and \12 are octal escapes for CR (0x0D) and LF (0x0A)
I successfully worked with the EZUTF library posted on CodeProject: High Performance Unicode Text File I/O Routines for C++
UTF8-CPP gives you conversion between UTF-8, 16 and 32. Very nice and light library.
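To give a feel for what a UTF8-CPP call like `utf8::utf8to16(s.begin(), s.end(), std::back_inserter(u16))` produces, here is a hand-rolled illustration of the UTF-8 to UTF-16 conversion. This loop is my own sketch, not the library's code, and it skips the input validation the library performs:

```cpp
#include <cstdint>
#include <string>

// Hand-rolled UTF-8 -> UTF-16 conversion (no validation of malformed
// input; UTF8-CPP does validate).
std::u16string utf8_to_utf16(const std::string& s) {
    std::u16string out;
    for (size_t i = 0; i < s.size();) {
        unsigned char b = s[i];
        uint32_t cp;
        size_t len;
        if      (b < 0x80)        { cp = b;        len = 1; }  // ASCII
        else if ((b >> 5) == 0x6) { cp = b & 0x1F; len = 2; }  // 110xxxxx
        else if ((b >> 4) == 0xE) { cp = b & 0x0F; len = 3; }  // 1110xxxx
        else                      { cp = b & 0x07; len = 4; }  // 11110xxx
        for (size_t j = 1; j < len; ++j)        // fold in continuation bytes
            cp = (cp << 6) | (s[i + j] & 0x3F);
        if (cp >= 0x10000) {                    // outside the BMP:
            cp -= 0x10000;                      // encode as a surrogate pair
            out += char16_t(0xD800 | (cp >> 10));
            out += char16_t(0xDC00 | (cp & 0x3FF));
        } else {
            out += char16_t(cp);
        }
        i += len;
    }
    return out;
}
```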
About ICU, some comments by the UTF8-CPP creator:
ICU Library. It is very powerful, complete, feature-rich, mature, and widely used. Also big, intrusive, non-generic, and doesn't play well with the Standard Library. I definitely recommend looking at ICU even if you don't plan to use it.
:)