.NET System::String to UTF8-bytes stored in char*

Submitted by 天涯浪子 on 2020-01-20 06:23:03

Question


I am wrapping some unmanaged C++ code inside a .NET project. For this I need to convert System::String to UTF8-bytes stored in char*.

I am unsure whether this is the best, or even a correct, way to do it, and I'd appreciate it if someone could take a look and provide feedback.

Thanks,

/David

// Copy into a blank Visual Studio C++/CLI (CLR console) project.
#include "stdafx.h"
#include <stdio.h>

using namespace System;
using namespace System::Text;
using namespace System::Runtime::InteropServices;

// Test for calling with char* argument.
void MyTest(const char* buffer)
{
    printf_s("%s\n", buffer);
    return;
}

int main()
{

   // Create a UTF-8 encoding.
   UTF8Encoding^ utf8 = gcnew UTF8Encoding;

   // A Unicode string with two characters outside an 8-bit code range.
   String^ unicodeString = L"This unicode string contains two characters with codes outside an 8-bit code range, Pi (\u03a0) and Sigma (\u03a3).";
   Console::WriteLine(unicodeString);

   // Encode the string.
   array<Byte>^ encodedBytes = utf8->GetBytes(unicodeString);

   // Get pointer to unmanaged char array
   int size = Marshal::SizeOf(encodedBytes[0]) * encodedBytes->Length;
   IntPtr pnt = Marshal::AllocHGlobal(size);
   Marshal::Copy(encodedBytes, 0, pnt, encodedBytes->Length);

   // Ugly, but necessary?
   char* charPnt = (char*)pnt.ToPointer();
   MyTest(charPnt);
   Marshal::FreeHGlobal(pnt);

}

Answer 1:


  1. You don't need to create an encoding instance; you can use the static instances such as Encoding::UTF8.

  2. If the called function doesn't expect a pointer to the HGlobal heap, you can just use plain C/C++ memory allocation (new or malloc) for the buffer; a sketch of this is shown at the end of this answer.

  3. In your example the function doesn't take ownership of the data, so you don't need a copy at all; just pin the buffer.

Something like:

// Encode the text as UTF8
array<Byte>^ encodedBytes = Encoding::UTF8->GetBytes(unicodeString);

// Prevent the GC from moving the bytes around while this variable is on the stack
pin_ptr<Byte> pinnedBytes = &encodedBytes[0];

// Call the function; a cast from Byte* to char* is required.
// (The length is passed explicitly because the bytes are not zero-terminated;
// this assumes a MyTest variant that also takes a length.)
MyTest(reinterpret_cast<char*>(pinnedBytes), encodedBytes->Length);

Or, if you need the string zero-terminated, as most C functions expect (including the example in the OP), then you should probably add a zero byte.

// Encode the text as UTF8, making sure the array is zero terminated
array<Byte>^ encodedBytes = Encoding::UTF8->GetBytes(unicodeString + "\0");

// Prevent the GC from moving the bytes around while this variable is on the stack
pin_ptr<Byte> pinnedBytes = &encodedBytes[0];

// Call the function; the cast from Byte* to char* is still required
MyTest(reinterpret_cast<char*>(pinnedBytes));
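
For the heap-allocation route from point 2, a minimal sketch (not from the original answer) could look like the following. It assumes the single-argument MyTest(const char*) from the question, copies the encoded bytes into a plain new[] buffer, and zero-terminates it:

// Sketch of point 2: plain C++ allocation instead of HGlobal
// (assumes the MyTest(const char*) function from the question).
array<Byte>^ encodedBytes = Encoding::UTF8->GetBytes(unicodeString);

// Allocate one extra byte for the terminating zero.
char* buffer = new char[encodedBytes->Length + 1];

// Copy the managed bytes into the unmanaged buffer and zero-terminate it.
Marshal::Copy(encodedBytes, 0, IntPtr(buffer), encodedBytes->Length);
buffer[encodedBytes->Length] = '\0';

MyTest(buffer);
delete[] buffer;

Because the bytes are copied rather than pinned, the buffer stays valid until delete[] runs, which also makes this variant suitable when the callee keeps the pointer around after the call.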


Source: https://stackoverflow.com/questions/6596242/net-systemstring-to-utf8-bytes-stored-in-char
