Tested this from the reference: https://developer.apple.com/documentation/swift
var string = String(count: 5, repeatedValue: "a")
// string is "aaaaa"
It seems you have to explicitly pass in a Character for it to work. This works for me:
let char = Character("a")
let string = String(count: 5, repeatedValue: char)
Although, there may be a bug mixed in with all this as well. I believe the way you were doing it should have worked on its own, and I can't seem to get code completion on this initializer at all.
Edit: I'm going with bug. The following compiles just fine.
let array = Array(count: 5, repeatedValue: "a")
For the benefit of future searchers: as of Swift 3, use init(repeating:count:).
let sososo = String(repeating: "so", count: 3)
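For completeness, a small illustrative sketch of how that looks on a current toolchain (in Swift 4 and later a Character-taking overload is also available via RangeReplaceableCollection):
let fromString = String(repeating: "so", count: 3)              // "sososo"
let fromCharacter = String(repeating: Character("a"), count: 5) // "aaaaa" (Swift 4+)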
I know this is an old question and already has an answer. However, I think I know why String(count: 5, repeatedValue: "a") does not work.
The thing is, String has two similar-looking initialisers:
init(count: Int, repeatedValue: Character)
init(count: Int, repeatedValue: UnicodeScalar)
So in this case the compiler can't tell whether the literal is a Character or a UnicodeScalar, hence the compile-time error if you don't pass an explicit Character. To confirm that "a" can be interpreted as a UnicodeScalar, you can check that this line compiles:
let a: UnicodeScalar = "a"
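To illustrate the ambiguity, here is a minimal sketch (assuming a Swift 2-era compiler, where both of the initialisers above exist) showing that giving the literal an explicit type picks one overload:
// let bad = String(count: 5, repeatedValue: "a")       // error: ambiguous between the two overloads

let c: Character = "a"
let fromCharacter = String(count: 5, repeatedValue: c)  // "aaaaa"

let u: UnicodeScalar = "a"
let fromScalar = String(count: 5, repeatedValue: u)     // "aaaaa"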
For anyone on Swift 3.x, it's now something like this; it will work like a charm.
var string = String(repeating: "a", count: 5)
This works just fine:
var str9 = String(count: 5, repeatedValue: Character("c"))
Swift 3:
var array = Array(repeating: 0, count: 5)
Output: [0, 0, 0, 0, 0]
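As a side note (purely illustrative), an array of repeated strings can also be collapsed back into a single string with joined():
let repeated = Array(repeating: "a", count: 5).joined()  // "aaaaa"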