I just want to get the ASCII value of a single char string in Swift. This is how I'm currently doing it:
var singleChar = "a"
println(singleChar.unicodeScalars)
import Foundation

var singleChar = "a" as NSString
print(singleChar.character(at: 0)) // prints 97
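Note that character(at:) returns a UTF-16 code unit (a unichar), so it only coincides with the ASCII value for ASCII characters. A quick check, using a non-ASCII character as an illustrative example:
import Foundation

let mixed = "aé" as NSString
print(mixed.character(at: 0)) // 97  ('a' is ASCII)
print(mixed.character(at: 1)) // 233 (UTF-16 code unit for 'é', not an ASCII value)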
Swift 3.1
UnicodeScalar("1")!.value // returns 49
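The same initializer works for any single-scalar string; for example (the force-unwrap assumes the literal really is one valid scalar):
UnicodeScalar("a")!.value // returns 97
UnicodeScalar("0")!.value // returns 48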
Now in Xcode 7.1 and Swift 2.1
var singleChar = "a"
singleChar.unicodeScalars.first?.value
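The same unicodeScalars view also makes it easy to convert a whole string; for example:
let values = "abc".unicodeScalars.map { $0.value }
print(values) // [97, 98, 99]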
Here's my implementation; it returns an array of the ASCII values.
extension String {
    func asciiValueOfString() -> [UInt32] {
        var retVal = [UInt32]()
        // Collect the value of every ASCII scalar, skipping non-ASCII ones.
        for val in self.unicodeScalars where val.isASCII() {
            retVal.append(UInt32(val))
        }
        return retVal
    }
}
Note: Yes, it's Swift 2 compatible (in Swift 3 and later, isASCII is a property rather than a method).
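For example, running it on a short string in a Playground should print:
let values = "Hello!".asciiValueOfString()
print(values) // [72, 101, 108, 108, 111, 33]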
With Swift 5, you can pick one of the following approaches in order to get the ASCII numeric representation of a character.
Character's asciiValue property
Character has a property called asciiValue, which has the following declaration:
var asciiValue: UInt8? { get }
The ASCII encoding value of this character, if it is an ASCII character.
The following Playground sample code shows how to use asciiValue in order to get the ASCII encoding value of a character:
let character: Character = "a"
print(character.asciiValue) //prints: Optional(97)
let string = "a"
print(string.first?.asciiValue) //prints: Optional(97)
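Another Swift 5 approach (a sketch based on the unicodeScalars view used in earlier answers) is to read the value property of the first Unicode.Scalar, guarded by isASCII:
let string = "a"
if let scalar = string.unicodeScalars.first, scalar.isASCII {
    print(scalar.value) // prints: 97
}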
A slightly shorter way of doing this could be:
singleChar.unicodeScalars.first!.value
As with the subscript version, this will crash if your string is actually empty, so if you’re not 100% sure, use the optional:
if let ascii = singleChar.unicodeScalars.first?.value {
    // ascii is a UInt32 here
}
Or, if you want to be extra paranoid and check that the scalar really is ASCII:
if let char = singleChar.unicodeScalars.first, char.isASCII {
    let ascii = char.value
}