Why does count return different types for Collection vs. Array?

遥遥无期 2021-02-19 09:34

When I'm extending Collection, the type of count is IndexDistance.

When I'm extending Array, the type of count is Int. Why the difference?
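
A minimal sketch of the observation, assuming a Swift 3/4 toolchain where Collection still declares count as IndexDistance (in Swift 4.1 and later, IndexDistance is just a deprecated alias for Int); halfCount and halfCountOfArray are purely illustrative names:

extension Collection {
    var halfCount: IndexDistance {  // IndexDistance is Collection's associated type
        return count / 2            // here count is typed as IndexDistance
    }
}

extension Array {
    var halfCountOfArray: Int {
        return count / 2            // here count is plain Int
    }
}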

2 Answers
  •  既然无缘
    2021-02-19 10:03

    Not exactly an answer, but as the OP I think these points were all vital prerequisites to my understanding. I didn't know that:

    • You can constrain the associatedtype of a protocol.
    • You can give the associatedtype a default type.
    • Conformance to a protocol's associatedtype can be provided explicitly with a typealias.
    • Conformance to a protocol's associatedtype can happen in other ways as well, i.e. through the default type.
    • By design, the default type of the associatedtype isn't applied 'at the protocol level'; inside the protocol the type is only constrained. Only once a class/struct adopts the protocol is the default type used. For more, refer to Martin's answer above and the Apple docs on associatedtype.
    • There is a third way of conforming to a protocol's associatedtype: you can satisfy it implicitly, letting the compiler infer it from a member's signature (see the sketch after this list). Please also see the link provided at the end.
    • Perhaps the most common way is to conform to a protocol with an associated type through a generic constraint. See SomeClass9.
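
    A minimal sketch of that implicit (inferred) conformance, using hypothetical names (Container, IntBox) that are not part of the question:

    protocol Container {
        associatedtype Item
        func store(_ item: Item)
    }

    // No typealias anywhere: the compiler infers Item == Int
    // from the parameter type of store(_:).
    class IntBox: Container {
        func store(_ item: Int) { print("stored \(item)") }
    }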

    Three different Protocols

    // associatedtype isn't constrained
    protocol NotConstrained{
        associatedtype IndexDistance
    }
    
    // associatedtype is constrained
    protocol Constrained{
        associatedtype IndexDistance: SignedInteger
    }
    
    // associatedtype is constrained and defaulted
    protocol ConstrainedAndDefaulted{
        associatedtype IndexDistance: SignedInteger = Int
    }
    

    Conformance to the protocols

    // All Good
    class SomeClass1: NotConstrained{
        typealias IndexDistance = Int
    }
    
    // All Good
    class SomeClass2: NotConstrained{
        typealias IndexDistance = String // It works with String as well, since it wasn't constrained
    }
    
    // Not Good
    class SomeClass3: NotConstrained{
        // error: type 'SomeClass3' does not conform to protocol 'NotConstrained'
        // doesn't work because we MUST have a typealias
    }
    
    // All Good
    class SomeClass4: Constrained{
        typealias IndexDistance = Int16
    }
    
    // Not Good
    class SomeClass5: Constrained{
        typealias IndexDistance = String
        // error: type 'SomeClass5' does not conform to protocol 'Constrained'
        // Obviously! Because String doesn't conform to 'SignedInteger'
    }
    
    // Not Good
    class SomeClass6: Constrained{
        // error: type 'SomeClass6' does not conform to protocol 'Constrained'        
    }
    
    // All Good
    class SomeClass7: ConstrainedAndDefaulted{
        // NO ERROR, because the associatedtype has already defaulted
    }
    
    // All Good
    class SomeClass8: ConstrainedAndDefaulted{
        typealias IndexDistance = Int64 // We changed the default from 'Int' to 'Int64'
        // Which is ok because 'Int64' is of type 'SignedInteger'
    }
    
    // All Good
    class SomeClass9<T>: NotConstrained {
        typealias IndexDistance = T // conformance through a generic parameter: the client supplies the type
    }
    

    If you can understand why class SomeClass8 works without errors then you've got your answer!
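
    To tie this back to the original question: before Swift 4.1, the standard library declared Collection's distance type roughly as associatedtype IndexDistance: SignedInteger = Int, and Array resolves it to plain Int. The sketch below mirrors that situation with hypothetical names (MiniCollection, MiniArray); it is an illustration under that assumption, not the actual standard-library source:

    protocol MiniCollection {
        associatedtype IndexDistance: SignedInteger = Int
        var count: IndexDistance { get }
    }

    struct MiniArray: MiniCollection {
        // No typealias needed: IndexDistance falls back to its default, Int,
        // so count here is plain Int, just like Array's count.
        var count: Int { return 0 }
    }

    extension MiniCollection {
        func describeCount() {
            print(type(of: count)) // prints the concrete IndexDistance, e.g. Int for MiniArray
        }
    }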


    A very simple read can be found here. I really like how the post explains the difference between implicit and explicit conformance to a protocol's associated types.

    EDIT:

    The Understanding protocol associated types and their constraints tutorial is a fantastic read.

    I will have to come back and update my answer using that tutorial, but until then refer to the link; it's really helpful.
