Question
What is the difference between String.prototype.codePointAt() and String.prototype.charCodeAt() in JavaScript?
'A'.codePointAt(); // 65
'A'.charCodeAt(); // 65
Answer 1:
From Mozilla:
The charCodeAt() method returns an integer between 0 and 65535 representing the UTF-16 code unit at the given index (the UTF-16 code unit matches the Unicode code point for code points representable in a single UTF-16 code unit, but might also be the first code unit of a surrogate pair for code points not representable in a single UTF-16 code unit, e.g. Unicode code points > 0x10000). If you want the entire code point value, use codePointAt().
charCodeAt() is UTF-16, codePointAt() is Unicode.
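A quick illustration of that distinction, using '😀' (U+1F600), a character outside the single-code-unit range mentioned above:
'😀'.length;         // 2, the character takes two UTF-16 code units
'😀'.charCodeAt(0);  // 55357 (0xD83D), only the high surrogate
'😀'.codePointAt(0); // 128512 (0x1F600), the full code point
[...'😀'].length;    // 1, string iteration walks code points, not code units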
Answer 2:
To add to ToxicTeacakes's answer, here is another example to help you see the difference:
"𠮷".charCodeAt(0).toString(16);//d842
"𠮷".charCodeAt(1).toString(16);//dfb7
"𠮷".codePointAt(0);//20bb7
"𠮷".codePointAt(1);//dfb7
console.log("\ud842\udfb7");//𠮷, an example of hexadecimal digits
console.log("\u20bb7\udfb7");//₻7�
console.log("\u{20bb7}");//𠮷 an unicode code point escapes the "\ud842\udfb7"
The following is some background on JavaScript string escape sequences:
"\uXXXX"
The Unicode character specified by the four hexadecimal digits XXXX. For example, \u00A9 is the Unicode sequence for the copyright symbol."\u{XXXXX}"
Unicode code point
escapes. For example, \u{2F804} is the same as the simple Unicode escapes \uD87E\uDC04.
See also MSDN.
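These two escape forms can be checked against each other directly in a console:
'\u00A9';                       // "©", a four-hex-digit escape
'\u{2F804}' === '\uD87E\uDC04'; // true, code point escape vs. its surrogate pair
'\u{20bb7}' === '\ud842\udfb7'; // true, the same equivalence for 𠮷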
Answer 3:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/codePointAt
This page explains the differences: the two methods are almost the same, but they differ in what they return for characters outside the Basic Multilingual Plane and in how they handle an out-of-range index.
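For example, the two differences look like this in practice:
'A'.charCodeAt(2);   // NaN, charCodeAt() returns NaN for an out-of-range index
'A'.codePointAt(2);  // undefined, codePointAt() returns undefined instead
'𠮷'.charCodeAt(0);  // 55362 (0xD842), only the first UTF-16 code unit
'𠮷'.codePointAt(0); // 134071 (0x20BB7), the whole code point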
Source: https://stackoverflow.com/questions/36527642/difference-between-codepointat-and-charcodeat