I am just learning to code, so I decided to start with Swift. I am following the tour that Apple has for it here, and I am at the section where it calculates a sum of numbers.
I’m guessing you’ve called your function with no arguments, that is:
averageOf()
This is allowed with variadic arguments, and numbers will be an empty array. You then end up dividing an unchanged sum by an unchanged total (because you go round the loop zero times for the zero elements in numbers), so you are dividing 0 by 0, and you’re getting a divide-by-zero error.
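For illustration, here is a rough guess at what your function might look like, modelled on the tour’s sumOf example (your actual code may well differ):
func averageOf(_ numbers: Int...) -> Int {
    var sum = 0
    var total = 0
    for number in numbers {
        sum += number
        total += 1
    }
    return sum / total   // with no arguments this is 0 / 0, which traps at runtime
}

averageOf(1, 2, 3)   // 2
// averageOf()       // runtime error: division by zero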
To prevent this from being a possibility, you could require the user to supply at least one number:
func averageOf(first: Int, rest: Int...) -> Double {
    var sum = first       // the required first argument is already counted
    var total = 1.0
    for number in rest {
        sum += number
        total += 1        // (the ++ operator no longer exists in current Swift)
    }
    return Double(sum) / total
}
This way, if you try to call it with no arguments, you’ll get a compiler error. BTW, I altered your version to return a Double rather than an Int; you might want to experiment with the two versions to see why.
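For example, with the version above (note that current Swift also requires the argument labels at the call site):
averageOf(first: 1, rest: 2)           // 1.5
averageOf(first: 2, rest: 4, 6, 8)     // 5.0
// averageOf()                         // compile error: missing argument for parameter 'first'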
(This technique is similar to how the standard library max function is declared, which requires at least two arguments:
func max<T : Comparable>(x: T, y: T) -> T
but has an overloaded version for three or more:
func max<T : Comparable>(x: T, y: T, z: T, rest: T...) -> T
The reason for keeping the two-argument version, instead of cutting straight to a variadic version that takes at least two, is that you can then pass it into things like reduce to find the max of a collection, e.g. reduce(a, 0, max).)
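In current Swift, reduce is a method on collections rather than a free function, but the same idea still works; a quick sketch of the modern spelling:
let a = [3, 1, 4, 1, 5]
let biggest = a.reduce(Int.min, Swift.max)   // 5 (Int.min as the seed so it also works for negative values)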
For me, this error happened because an implicitly unwrapped optional property was not set. Setting it before use fixed the issue.
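A minimal sketch of that situation, with made-up names (not the original code):
class AverageCalculator {
    var numbers: [Int]!   // implicitly unwrapped optional: using it while still nil traps at runtime

    func average() -> Double {
        // if numbers was never assigned, the next line crashes
        // ("Unexpectedly found nil while implicitly unwrapping an Optional value")
        return Double(numbers.reduce(0, +)) / Double(numbers.count)
    }
}

let calc = AverageCalculator()
// calc.average()             // crashes: numbers was never set
calc.numbers = [1, 2, 3]
let avg = calc.average()      // 2.0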