Extension of constructed generic type in Swift

闹比i 2020-12-02 13:56

Is it possible to extend a generic class for a specialised/constructed generic type? I would like to extend Int arrays with a method that calculates the sum of their elements.
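
For example, I would like to be able to write something like this (the call site is just illustrative):

let total = [1, 2, 3].sum()  // 6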

7 Answers
  • 2020-12-02 14:37

    You can also do it this way, by checking the element type at runtime:

    extension Array {
        func sum() -> Int? {
            // An empty array, or one whose elements are not Ints, sums to nil.
            guard !isEmpty else {
                return nil
            }
            var s = 0
            for element in self {
                // Bail out with nil as soon as a non-Int element is found.
                guard let value = element as? Int else {
                    return nil
                }
                s += value
            }
            return s
        }
    }
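
    A quick usage check (the array names here are just for illustration):

    let ints = [1, 2, 3]
    let words = ["a", "b", "c"]

    ints.sum()   // Optional(6)
    words.sum()  // nil, because the elements are not Ints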
    
  • 2020-12-02 14:38

    Managed to get something working in an extensible, generic fashion without abusing the type system too badly; however, it has some limitations.

    protocol Addable {
        func +(lhs: Self, rhs: Self) -> Self
        class var identity: Self { get }
    }
    
    extension Int : Addable {
        static var identity: Int { get { return 0 } }
    }
    
    extension String : Addable {
        static var identity: String { get { return "" } }
    }
    
    extension Array {
        func sum<U : Addable>() -> U? {
            let s: U? = U.identity
            return self.sum(s)
        }
    
        func sum<U : Addable>(start: U?) -> U? {
            return reduce(start) { lhs, rhs in
                switch (lhs, rhs) {
                case (.Some(let left), let right as U):
                    return left + right
                default:
                    return nil
                }
            }
        }
    }
    

    Specifically: with this solution, type inference won't work on the no-parameter sum() method, so you have to either annotate the expected return type or give it a starting value (from which it can infer the type).

    Note also that this returns a value of Optional type: if for any reason a sum of the expected type cannot be computed from the array, it returns nil.

    To illustrate:

    let int_array = Array(1...10)
    
    let x: Int? = int_array.sum() // result: {Some 55}
    let x2 = int_array.sum(0) // result: {Some 55}
    let x3 = int_array.sum() // Compiler error because it can't infer type
    
    
    let string_array = ["a", "b", "c"]
    
    let y: String? = string_array.sum() // result: {Some "abc"}
    let y2 = string_array.sum("") // result: {Some "abc"}
    
    let y3: Int? = string_array.sum() // result: nil  (can't cast String to Int)
    let y4 = string_array.sum(0) // result: nil  (can't cast String to Int)
    
    
    let double_array = [1.3, 4.2, 2.1]
    
    let z = double_array.sum(0.0) // Compiler error because we haven't extended Double to be Addable
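
    The same idea written against current Swift syntax would look roughly like this (a sketch only; operator and type-property requirements are now spelled static, and the optional pattern is lowercase .some):

    protocol Addable {
        static func + (lhs: Self, rhs: Self) -> Self
        static var identity: Self { get }
    }

    extension Int: Addable {
        static var identity: Int { return 0 }
    }

    extension String: Addable {
        static var identity: String { return "" }
    }

    extension Array {
        // Still returns nil when the elements cannot be summed as U.
        func sum<U: Addable>() -> U? {
            return sum(U.identity)
        }

        func sum<U: Addable>(_ start: U?) -> U? {
            return reduce(start) { lhs, rhs in
                switch (lhs, rhs) {
                case (.some(let left), let right as U):
                    return left + right
                default:
                    return nil
                }
            }
        }
    }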
    
  • 2020-12-02 14:40

    Swift 5.x:

    extension Array where Element == Int {
    
        var sum: Int {
            reduce(0, +)
        }
    }
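
    If you want this for any numeric element type rather than just Int, the same extension can be constrained to AdditiveArithmetic instead (a sketch):

    extension Array where Element: AdditiveArithmetic {

        // Works for Int, Double, Float, and so on.
        var sum: Element {
            reduce(.zero, +)
        }
    }

    [1, 2, 3].sum    // 6
    [1.5, 2.5].sum   // 4.0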
    
  • 2020-12-02 14:42

    Looks like you can't. The closest we can get is a free function:

    func sum(a: Array<Int>) -> Int {
        return a.reduce(0) { $0 + $1 }
    }
    

    Swift will allow you to add an extension to the Array class, but not specifically to a specialized version of the class.

    error: <REPL>:108:1: error: non-nominal type 'Array<Int>' cannot be extended

    You can extend the Array class.

    extension Array {
    
        func sum() -> Int {
            return reduce(0) { $0 + $1 }
        }
    }
    

    The problem now is with the + operator:

    error: <REPL>:102:16: error: could not find an overload for '+' that accepts the supplied arguments
            return reduce(0) { $0 + $1 }
    

    This is somewhat expected, since we cannot be sure that the + operator will be overloaded for all the possible types that could be used in an array.

    So we could try to constrain the operation to only certain classes. Something like:

    class Dummy {
    }
    
    extension Array {
        func someFunc<T:Dummy>() -> Int {
           return 0
        }
    }
    
    var l = [Dummy()]
    var r = l.someFunc() // Expect 0
    

    Conceptually this should work (currently it seems that there is a bug: Xcode crashes when evaluating a playground using this code). Even in the event that it works, we cannot use this trick here, since the type Int is not a class.

    extension Array {
        func sum<T:Int>() -> T {
            return reduce(0) { $0 + $1 }
        }
    }
    
    error: <REPL>:101:14: error: inheritance from non-protocol, non-class type 'Int'
        func sum<T:Int>() -> T {
    

    I also looked at extending the Array class with a protocol, but again, Int not being a class makes it impossible. If the numeric types were classes, it would be nice to have a protocol declaring that a type can be added, just like Comparable or Equatable, but my understanding is that a protocol cannot define the generic function that would be needed to create an Addable protocol.

    Edit:

    As stated in other answers, you can make it work for Int by explicitly checking and casting to Int in the closure. I guess I missed it while investigating. But it would still be nice if we could have a generic way of working with numeric types.
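
    In later Swift versions, the standard Numeric protocol provides exactly this kind of generic constraint, so a version along these lines works (a sketch):

    extension Array where Element: Numeric {

        // Numeric gives us both the literal 0 and the + operator.
        func sum() -> Element {
            return reduce(0) { $0 + $1 }
        }
    }

    [1, 2, 3].sum()        // 6
    [1.5, 2.5, 3.0].sum()  // 7.0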

  • 2020-12-02 14:47

    This can be achieved using protocol extensions (See The Swift Programming Language: Protocols for more information). In Swift 3:

    To sum just Ints you could do:

    extension Sequence where Iterator.Element == Int {
        var sum: Int {
            return reduce(0, +)
        }
    }
    

    Usage:

    let nums = [1, 2, 3, 4]
    print(nums.sum) // Prints: "10"
    

    Or, for something more generic, you could do what @Wes Campaigne suggested and create an Addable protocol:

    protocol Addable {
        init()
        static func + (lhs: Self, rhs: Self) -> Self
    }
    
    extension Int   : Addable {}
    extension Double: Addable {}
    extension String: Addable {}
    ...
    

    Next, extend Sequence to add sequences of Addable elements:

    extension Sequence where Iterator.Element: Addable {
        var sum: Iterator.Element {
            return reduce(Iterator.Element(), +)
        }
    }
    

    Usage:

    let doubles = [1.0, 2.0, 3.0, 4.0]
    print(doubles.sum) // Prints: "10.0"
    
    let strings = ["a", "b", "c"]
    print(strings.sum) // Prints: "abc"
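
    In Swift 4 and later, Sequence exposes its element type directly as Element, so the same extension can be written slightly more tersely (a sketch):

    extension Sequence where Element: Addable {
        var sum: Element {
            return reduce(Element(), +)
        }
    }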
    
  • 2020-12-02 14:54

    Alexander,

    Here's how you can do it:

    extension Array {
        func sum() -> Int {
            // Force-casts each element to Int, so this traps at runtime for non-Int arrays.
            return reduce(0) { $0 + ($1 as! Int) }
        }
    }
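
    For example:

    [1, 2, 3].sum()      // 6
    // ["a", "b"].sum()  // compiles, but traps at runtime on the force cast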
    

    Works like a charm, tested in the playground. However, the force cast means it will trap at runtime if you call this function on arrays whose elements are not Ints.
