Why does count return different types for Collection vs. Array?

Asked by 遥遥无期 on 2021-02-19 09:34

When I'm extending Collection, the type of count is IndexDistance.

When I'm extending Array, the type of count is Int. Why the difference?

2 Answers
  • 2021-02-19 09:47

    From Associated Types in the Swift Programming Language (emphasis added):

    When defining a protocol, it’s sometimes useful to declare one or more associated types as part of the protocol’s definition. An associated type gives a placeholder name to a type that is used as part of the protocol. The actual type to use for that associated type isn’t specified until the protocol is adopted. Associated types are specified with the associatedtype keyword.

    In Swift 3/4.0, the Collection protocol defines five associated types (from What’s in a Collection?):

    protocol Collection: Indexable, Sequence {
        associatedtype Iterator: IteratorProtocol = IndexingIterator<Self>
        associatedtype SubSequence: IndexableBase, Sequence = Slice<Self>
        associatedtype Index: Comparable // declared in IndexableBase
        associatedtype IndexDistance: SignedInteger = Int
        associatedtype Indices: IndexableBase, Sequence = DefaultIndices<Self>
        ...
    }
    

    Here

        associatedtype IndexDistance: SignedInteger = Int
    

    is an associated type declaration with a type constraint (: SignedInteger) and a default value (= Int).

    If a type T adopts the protocol and does not define T.IndexDistance otherwise then T.IndexDistance becomes a type alias for Int. This is the case for many of the standard collection types (such as Array or String), but not for all. For example

    public struct AnyCollection<Element> : Collection
    

    from the Swift standard library defines

        public typealias IndexDistance = IntMax
    

    which you can verify with

    let ac = AnyCollection([1, 2, 3])
    let cnt = ac.count
    print(type(of: cnt)) // Int64
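
    For comparison, Array keeps the default and uses Int, which you can check the same way (a quick sanity check; the variable names are mine):

    let array = [1, 2, 3]
    let arrayCount = array.count
    print(type(of: arrayCount)) // Int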
    

    You can also define your own collection type with a non-Int index distance if you like:

    struct MyCollection : Collection {
    
        typealias IndexDistance = Int16
        var startIndex: Int { return 0 }
        var endIndex: Int { return 3 }
    
        subscript(position: Int) -> String {
            return "\(position)"
        }
    
        func index(after i: Int) -> Int {
            return i + 1
        }
    }
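
    With that definition, compiled as Swift 3/4.0, count comes back as Int16. A minimal check (variable names chosen for illustration):

    let mc = MyCollection()
    let cnt = mc.count
    print(type(of: cnt)) // Int16 in Swift 3/4.0 (Int as of Swift 4.1, see below)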
    

    Therefore, if you extend the concrete type Array then count is an Int:

    extension Array {
        func whatever() {
            let cnt = count // type is `Int`
        }
    }
    

    But in a protocol extension method

    extension Collection {
        func whatever() {
            let cnt = count // some `SignedInteger`
        }
    }
    

    all you know is that the type of cnt is some type adopting the SignedInteger protocol, but that need not be Int. One can still work with the count, of course. Actually, the compiler error in

        for index in 0...count { //  binary operator '...' cannot be applied to operands of type 'Int' and 'Self.IndexDistance'
    

    is misleading. The integer literal 0 could be inferred as a Collection.IndexDistance from the context (because SignedInteger conforms to ExpressibleByIntegerLiteral). But a range of SignedInteger is not a Sequence, and that's why it fails to compile.

    So this would work, for example:

    extension Collection {
        func whatever() {
            for i in stride(from: 0, to: count, by: 1) {
                // ...
            }
        }
    }
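
    Alternatively, the generic count can be converted to a concrete Int up front via numericCast, which converts between the standard library's integer types (a sketch, not from the original answer; whateverElse is just an illustrative name):

    extension Collection {
        func whateverElse() {
            // numericCast converts Self.IndexDistance to Int
            let n: Int = numericCast(count)
            for i in 0..<n {
                // ...
            }
        }
    }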
    

    As of Swift 4.1, IndexDistance is no longer used, and the distance between collection indices is now always expressed as an Int, see

    • SE-0191 Eliminate IndexDistance from Collection

    In particular, the return type of count is Int. There is a type alias

    typealias IndexDistance = Int
    

    to make older code compile, but it is marked as deprecated and will be removed in a future version of Swift.
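
    With a Swift 4.1 or later compiler, the range-based loop from above therefore compiles directly. A minimal sketch (the method name is illustrative):

    extension Collection {
        func iterateOffsets() {
            for i in 0..<count { // count is a plain Int as of Swift 4.1
                // ...
            }
        }
    }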

  • 2021-02-19 10:03

    Not exactly an answer, but as the OP I think these were all vital prerequisites to my understanding. I didn't know that:

    • You can constrain the associatedtype of a protocol
    • You can give the associatedtype a default type
    • Conformance to a protocol's associatedtype can be done by using a typealias.
    • Conformance to a protocol's associatedtype can be done in other ways as well, e.g. through defaulting.
    • By design, the default type of the associatedtype isn't applied 'at the protocol level', i.e. within the protocol it is only bound by its constraint. Only once a class/struct adopts the protocol does the default type kick in. For more, refer to Martin's answer above and the Apple docs on associatedtype.
    • There is a third way of conforming to a protocol's associatedtype: defining the associatedtype implicitly (a sketch follows the examples below; see also the link provided at the end).
    • Perhaps the most common way to conform to a protocol with an associated type is through a generic parameter; see SomeClass9 below.

    Three different Protocols

    // associatedtype isn't constrained
    protocol NotConstrained{
        associatedtype IndexDistance
    }
    
    // associatedtype is constrained
    protocol Constrained{
        associatedtype IndexDistance: SignedInteger
    }
    
    // associatedtype is constrained and defaulted
    protocol ConstrainedAndDefaulted{
        associatedtype IndexDistance: SignedInteger = Int
    }
    

    Conformance to the protocols

    // All Good
    class SomeClass1: NotConstrained{
        typealias IndexDistance = Int
    }
    
    // All Good
    class SomeClass2: NotConstrained{
        typealias IndexDistance = String // It works with String as well, since it wasn't constrained
    }
    
    // Not Good
    class SomeClass3: NotConstrained{
        // error: type 'SomeClass3' does not conform to protocol 'NotConstrained'
        // doesn't work because we MUST have a typealias
    }
    
    // All Good
    class SomeClass4: Constrained{
        typealias IndexDistance = Int16
    }
    
    // Not Good
    class SomeClass5: Constrained{
        typealias IndexDistance = String
        // error: type 'SomeClass5' does not conform to protocol 'Constrained'
        // Obviously! Because String doesn't conform to 'SignedInteger'
    }
    
    // Not Good
    class SomeClass6: Constrained{
        // error: type 'SomeClass6' does not conform to protocol 'Constrained'        
    }
    
    // All Good
    class SomeClass7: ConstrainedAndDefaulted{
        // NO ERROR, because the associatedtype has already defaulted
    }
    
    // All Good
    class SomeClass8: ConstrainedAndDefaulted{
        typealias IndexDistance = Int64 // We changed the default from 'Int' to 'Int64'
        // Which is OK because 'Int64' conforms to 'SignedInteger'
    }
    
    class SomeClass9<T> : NotConstrained {
        typealias IndexDistance = T
    }
    

    If you can understand why class SomeClass8 works without errors then you've got your answer!
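
    As mentioned in the list above, an associatedtype can also be satisfied implicitly, i.e. inferred by the compiler from a method signature, without writing any typealias. A minimal sketch (the protocol and type names here are made up for illustration):

    protocol Storing {
        associatedtype Item
        func store(_ item: Item)
    }

    // No typealias needed: the compiler infers Item == Int
    // from the signature of store(_:).
    struct IntStore: Storing {
        func store(_ item: Int) {
            print("storing \(item)")
        }
    }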


    A very simple read can be found here. I really like how the post explains the difference between implicit and explicit conformance to a protocol's associatedtypes.

    EDIT:

    The Understanding protocol associated types and their constraints tutorial is a fantastic read.

    I will have to get back here and update my answer using the above tutorial. But until then refer to the link. It's really helpful.
