How to convert a jstring (JNI) to std::string (C++) with UTF-8 characters?
This is my code. It worked with non-UTF-8 characters, but it failed with UTF-8 characters.
After a lot of time searching for a solution, I found a way:
In Java, strings are encoded in UTF-16: each char is a 16-bit code unit (characters outside the Basic Multilingual Plane take two code units, a surrogate pair), so a jstring holds UTF-16 data. std::string in C++ is essentially a string of bytes, not characters, so if we want to pass a jstring from JNI to C++, we have to convert the UTF-16 data to bytes.
In the JNI documentation there are two functions for getting the contents of a jstring:
// Returns a pointer to the array of Unicode characters of the string.
// This pointer is valid until ReleaseStringChars() is called.
const jchar * GetStringChars(JNIEnv *env, jstring string, jboolean *isCopy);
// Returns a pointer to an array of bytes representing the string
// in modified UTF-8 encoding. This array is valid until it is released
// by ReleaseStringUTFChars().
const char * GetStringUTFChars(JNIEnv *env, jstring string, jboolean *isCopy);
GetStringUTFChars returns the string in modified UTF-8, which differs from standard UTF-8 in how it encodes the NUL character and characters outside the BMP.
GetStringChars returns a const jchar * (UTF-16 code units), so we would have to read the code units ourselves and convert them to UTF-8 bytes in C++.
Neither is quite what we want, so instead I call String.getBytes("UTF-8") through JNI, which produces standard UTF-8 bytes. This is my solution (it works well with both ASCII and UTF-8 characters):
std::string jstring2string(JNIEnv *env, jstring jStr) {
    if (!jStr)
        return "";

    // Call String.getBytes("UTF-8") to get a standard UTF-8 byte array.
    const jclass stringClass = env->GetObjectClass(jStr);
    const jmethodID getBytes = env->GetMethodID(stringClass, "getBytes", "(Ljava/lang/String;)[B");
    const jstring charsetName = env->NewStringUTF("UTF-8");
    const jbyteArray stringJbytes = (jbyteArray) env->CallObjectMethod(jStr, getBytes, charsetName);
    env->DeleteLocalRef(charsetName);  // don't leak the charset-name local ref

    // Copy the raw bytes into a std::string.
    size_t length = (size_t) env->GetArrayLength(stringJbytes);
    jbyte *pBytes = env->GetByteArrayElements(stringJbytes, NULL);
    std::string ret = std::string((char *) pBytes, length);

    // JNI_ABORT: free the buffer without copying back (we never modified it).
    env->ReleaseByteArrayElements(stringJbytes, pBytes, JNI_ABORT);
    env->DeleteLocalRef(stringJbytes);
    env->DeleteLocalRef(stringClass);
    return ret;
}