In JavaScript, '\uXXXX'
returns a Unicode character. But how can I get a Unicode character when the XXXX
part is a variable?
JavaScript uses UCS-2 internally.
Thus, String.fromCharCode(codePoint)
won't work for supplementary Unicode characters. If codePoint
is 119558
(0x1D306
, the '𝌆' character), String.fromCharCode truncates the value to 16 bits and returns the wrong character.
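To illustrate, here is a sketch of the surrogate-pair workaround for code points above 0xFFFF (the helper name fromAstralCodePoint is made up for this example; String.fromCodePoint does the same thing natively):

```javascript
// Split an astral code point into its UTF-16 surrogate pair,
// then feed both halves to String.fromCharCode.
function fromAstralCodePoint(codePoint) {
  var offset = codePoint - 0x10000;
  var high = 0xD800 + (offset >> 10);   // high surrogate
  var low = 0xDC00 + (offset & 0x3FF);  // low surrogate
  return String.fromCharCode(high, low);
}

console.log(fromAstralCodePoint(0x1D306)); // '𝌆' (\uD834\uDF06)
```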
Use String.fromCharCode(), like this: String.fromCharCode(parseInt(input, 16))
. When you put a Unicode value in a string using \u
, it is interpreted as a hexadecimal value, so you need to specify the base (16) when using parseInt
.
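Put together, a minimal sketch of this approach (using '2122', the hex digits of '™', as sample input):

```javascript
// Convert a string of hex digits (as written after \u) to its character.
var input = '2122';
var char = String.fromCharCode(parseInt(input, 16));
console.log(char); // '™'
```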
var hex = '2122';
var char = unescape('%u' + hex);
console.log(char);
will return "™". (Note that unescape() is deprecated and should be avoided in new code.)
Use String.fromCharCode("0x" + input)
or
String.fromCharCode(parseInt(input, 16))
, as UTF-16 code units are 16-bit numbers.
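Both forms end up parsing the hex string as base 16; the first relies on implicit number coercion of the "0x…" string, the second is explicit. A quick sketch:

```javascript
var input = '2122';
var a = String.fromCharCode('0x' + input);        // implicit Number('0x2122')
var b = String.fromCharCode(parseInt(input, 16)); // explicit base-16 parse
console.log(a === b, a); // true '™'
```

The explicit parseInt form is generally preferred, since it fails more predictably on malformed input.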
Since ES2015 (ES6) you can use
String.fromCodePoint(number)
to get Unicode values bigger than 0xFFFF.
So, in every modern browser, you can write it this way if you have a decimal code point:
var input = 8482; // decimal value of 0x2122, '™'
console.log(String.fromCodePoint(input));
or if it is a hex number:
var input = '2122';
console.log(String.fromCodePoint(parseInt(input, 16)));
More info:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/fromCodePoint
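As a round-trip check, String.prototype.codePointAt (also ES2015) recovers the full code point, even when the character is stored as two UTF-16 code units:

```javascript
// An astral character occupies two code units but is one code point.
var s = String.fromCodePoint(0x1D306);
console.log(s.length);                      // 2
console.log(s.codePointAt(0).toString(16)); // '1d306'
```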