This code compiled OK under VS2013:
std::string Unicode::utf16_to_utf8(std::u16string utf16_string)
{
    std::wstring_convert<std::codecvt_utf8_utf16<char16_t>, char16_t> convert;
    return convert.to_bytes(utf16_string);
}
Now with VS2015 I get:
1>unicode.obj : error LNK2001: unresolved external symbol "__declspec(dllimport) public: static class std::locale::id std::codecvt<char16_t,char,struct _Mbstatet>::id" (__imp_?id@?$codecvt@_SDU_Mbstatet@@@std@@2V0locale@2@A)
Old question, but for future reference: this is a known bug in Visual Studio 2015, as explained in the latest post (January 7th, 2016) in this MSDN Social thread.
The workaround for your example looks like this (I implemented your method as a free function for simplicity):
#include <codecvt>
#include <cstdint>  // for int16_t
#include <iostream>
#include <locale>
#include <string>

#if _MSC_VER >= 1900
// Workaround: convert via int16_t, which matches char16_t's size and
// representation on Windows, so the unresolved codecvt<char16_t, ...>::id
// symbol is never referenced.
std::string utf16_to_utf8(std::u16string utf16_string)
{
    std::wstring_convert<std::codecvt_utf8_utf16<int16_t>, int16_t> convert;
    auto p = reinterpret_cast<const int16_t *>(utf16_string.data());
    return convert.to_bytes(p, p + utf16_string.size());
}
#else
// Conforming implementations: the straightforward version.
std::string utf16_to_utf8(std::u16string utf16_string)
{
    std::wstring_convert<std::codecvt_utf8_utf16<char16_t>, char16_t> convert;
    return convert.to_bytes(utf16_string);
}
#endif

int main()
{
    std::cout << utf16_to_utf8(u"Élémentaire, mon cher Watson!") << std::endl;
    return 0;
}
Hopefully, the problem will be fixed in future releases; otherwise, the #if condition will need refining.
UPDATE: nope, not fixed in VS 2017. I've therefore updated the preprocessor conditional to _MSC_VER >= 1900 (it was initially == 1900).
Define the missing symbol in a .cpp file:
// Apparently Microsoft forgot to define a symbol for codecvt.
// Works with /MT only: _DLL is defined for /MD builds, hence the !_DLL guard.
#include <locale>

#if (!_DLL) && (_MSC_VER >= 1900 /* VS 2015 */) && (_MSC_VER <= 1911 /* VS 2017 */)
std::locale::id std::codecvt<char16_t, char, _Mbstatet>::id;
#endif
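The question also covers char32_t; if the analogous codecvt<char32_t, ...>::id symbol comes up unresolved, the same pattern presumably applies (my untested assumption, same /MT-only caveat):

#include <locale>

#if (!_DLL) && (_MSC_VER >= 1900 /* VS 2015 */) && (_MSC_VER <= 1911 /* VS 2017 */)
// Hypothetical companion definition for the char32_t facet.
std::locale::id std::codecvt<char32_t, char, _Mbstatet>::id;
#endif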
This worked for me in VS2017:
std::wstring utf8_to_utf16(std::string utf8_string)
{
    return std::wstring_convert<std::codecvt_utf8_utf16<wchar_t>, wchar_t>{}.from_bytes(utf8_string);
}

std::string utf16_to_utf8(std::wstring utf16_string)
{
    return std::wstring_convert<std::codecvt_utf8_utf16<wchar_t>, wchar_t>{}.to_bytes(utf16_string);
}
Another possible workaround is to use the default second template parameter (wchar_t) for wstring_convert. This works with MS Visual Studio 2015 Update 3. Note that it is not a platform-independent solution: it relies on wchar_t being 16 bits wide, so it is Windows-only.
std::string utf16_to_utf8(std::u16string u16_string)
{
    // Widen each char16_t code unit into a wchar_t (both are 16 bits on Windows),
    // then convert through the wchar_t facet, which links correctly.
    std::wstring wide_string(u16_string.begin(), u16_string.end());
    std::wstring_convert<std::codecvt_utf8_utf16<wchar_t>> convert;
    return convert.to_bytes(wide_string);
}
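Called like the earlier example (main is my addition):

#include <iostream>

int main()
{
    std::cout << utf16_to_utf8(u"Élémentaire, mon cher Watson!") << std::endl;
    return 0;
}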
Source: https://stackoverflow.com/questions/32055357/visual-studio-c-2015-stdcodecvt-with-char16-t-or-char32-t