Weird behavior of the & function in Set

北海茫月 2020-12-18 00:32

Set is defined as Set[A]; its type parameter is invariant, so & expects an argument with exactly the same element type. As expected, the version below fails to compile, since we are passing a covariant argument (a Set[String] where a Set[Object] is required):

    val a = Set(new Object)
    val b = Set("hi")
    a & b     // error: type mismatch

But writing the same thing inline compiles and evaluates to the empty set, which seems weird:

    Set(new Object) & Set("hi")

Why does the inline expression type-check when the version using vals does not?

2 Answers
  • 2020-12-18 00:52

    Not sure, but I think what you're looking for is described in the language spec under "Local Type Inference" (at this time of writing, section 6.26.4 on page 100).

    Local type inference infers type arguments to be passed to expressions of polymorphic type. Say e is of type [ a1 >: L1 <: U1, ..., an >: Ln <: Un ] T and no explicit type parameters are given.

    Local type inference converts this expression to a type application e [ T1, ..., Tn ]. The choice of the type arguments T1, ..., Tn depends on the context in which the expression appears and on the expected type pt. There are three cases.

    [ ... ]

    If the expression e appears as a value without being applied to value arguments, the type arguments are inferred by solving a constraint system which relates the expression's type T with the expected type pt. Without loss of generality we can assume that T is a value type; if it is a method type we apply eta-expansion to convert it to a function type. Solving means finding a substitution σ of types Ti for the type parameters ai such that

    • None of the inferred types Ti is a singleton type

    • All type parameter bounds are respected, i.e. σ Li <: σ ai and σ ai <: σ Ui for i = 1, ..., n.

    • The expression's type conforms to the expected type, i.e. σ T <: σ pt.

    It is a compile-time error if no such substitution exists. If several substitutions exist, local type inference will choose for each type variable ai a minimal or maximal type Ti of the solution space. A maximal type Ti will be chosen if the type parameter ai appears contravariantly in the type T of the expression. A minimal type Ti will be chosen in all other situations, i.e. if the variable appears covariantly, nonvariantly or not at all in the type T. We call such a substitution an optimal solution of the given constraint system for the type T.

    In short: Scalac has to choose values for the generic types that you omitted, and it picks the most specific choices possible, under the constraint that the result compiles.
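
    As a quick illustration (my own sketch, not part of the original answer; Scala 2.13 assumed), the type argument inferred for Set.apply depends on the expected type at the call site: the minimal solution is chosen when nothing constrains it, and a wider one when the context demands it:

        // Sketch of the inference behaviour described above.
        val s1 = Set("hi")                  // no expected type: the minimal solution
                                            // String is chosen, giving Set[String]
        val s2: Set[Object] = Set("hi")     // expected type Set[Object]: the type
                                            // argument is inferred as Object instead
        // s1 & Set(new Object)             // would not compile: & on Set[String]
        //                                  // expects a Set[String] argument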

  • 2020-12-18 01:03

    The expression Set("hi") can be either a scala.collection.immutable.Set[String] or a scala.collection.immutable.Set[Object], depending on what the context requires. (A String is a valid Object, of course.) When you write this:

    Set(new Object) & Set("hi")
    

    the context requires Set[Object], so that's the type that's inferred; but when you write this:

    val b = Set("hi")
    

    the context doesn't specify, so the more specific type Set[String] is chosen, which (as you expected) then makes a & b ill-typed.
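
    A small self-contained sketch of both cases (my own, not part of the original answer; the object name SetIntersectDemo is made up, Scala 2.13 assumed). Giving the val an explicit Set[Object] type restores the inference that the inline form gets for free:

        // Own sketch contrasting the inline and val-based forms.
        object SetIntersectDemo extends App {
          // Inline: the right operand has expected type Set[Object],
          // so Set("hi") is inferred as a Set[Object].
          println(Set(new Object) & Set("hi"))   // prints Set()

          val a = Set(new Object)                // Set[Object]
          val b = Set("hi")                      // Set[String] (most specific)
          // a & b                               // type mismatch: found Set[String],
          //                                     // required Set[Object]

          // Annotating the val supplies an expected type, so the element
          // type is inferred as Object and the intersection compiles.
          val b2: Set[Object] = Set("hi")
          println(a & b2)                        // prints Set()
        }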
