Question
I have a group of types that each have their own type member:
sealed trait FieldType {
  type Data
  def parse(in: String): Option[Data]
}

object Name extends FieldType {
  type Data = String
  def parse(in: String) = Some(in)
}

object Age extends FieldType {
  type Data = Int
  // Catch only the parse failure, not all Throwables.
  def parse(in: String) = try { Some(in.toInt) } catch { case _: NumberFormatException => None }
}
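Each object parses a raw string into its own Data type; for example:

Name.parse("Alice")  // Some("Alice"): Option[Name.Data], i.e. Option[String]
Age.parse("42")      // Some(42): Option[Age.Data], i.e. Option[Int]
Age.parse("old")     // None, since the Int parse fails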
And I have a group of types that operate on sets of these FieldTypes (using boilerplate rather than abstracting over arity):
sealed trait Schema {
  type Schema <: Product
  type Data <: Product
  val schema: Schema
  def read(in: Seq[String]): Option[Data]
}

trait Schema1 extends Schema {
  type D1
  type FT1 <: FieldType { type Data = D1 }
  type Schema = Tuple1[FT1]
  type Data = Tuple1[D1]
  def read(in: Seq[String]) = schema._1.parse(in(0)).map(Tuple1.apply)
}

trait Schema2 extends Schema {
  type D1
  type D2
  type FT1 <: FieldType { type Data = D1 }
  type FT2 <: FieldType { type Data = D2 }
  type Schema = (FT1, FT2)
  type Data = (D1, D2)
  def read(in: Seq[String]) = {
    for {
      f <- schema._1.parse(in(0))
      s <- schema._2.parse(in(1))
    } yield (f, s)
  }
}
I thought I could use this system to elegantly define meaningful sets of fields, because Scala would be able to infer the type members:
class Person extends Schema2 {
  val schema = (Name, Age)
}
However, this doesn't compile! I have to include definitions for all the type members:
class Person extends Schema2 {
  type D1 = String; type D2 = Int
  type FT1 = Name.type; type FT2 = Age.type
  val schema = (Name, Age)
}
How come Scala can't infer D1, ... and FT1, ...? How can I refactor this so I don't have to specify the type members in Person?
Note: Once I have a better understanding of macros, I plan to use them for the Schema types. Also, I'd rather not use shapeless. It's a great library, but I don't want to pull it in just to solve this one problem.
Answer 1:
By declaring this:

val schema: Schema

you specify that schema must be of type Schema or any of its subtypes. Hence, knowing the type of schema, you cannot infer Schema, because Schema could be any supertype of schema.type.
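Here is a stripped-down sketch of the same limitation (Holder and IntHolder are made-up names, purely for illustration):

trait Holder {
  type A
  val value: A
}

// Does not compile: the compiler checks `value` against the abstract
// member A; it never works backwards from Int to conclude `type A = Int`.
// class IntHolder extends Holder { val value = 42 }

// Compiles, because the member is fixed explicitly, just like in Person:
class IntHolder extends Holder {
  type A = Int
  val value = 42
}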
You can solve your problem by completely reversing the approach: define the type aliases in terms of schema.type:
trait Schema2 extends Schema {
  type Schema = (FieldType, FieldType)
  type FT1 = schema._1.type
  type FT2 = schema._2.type
  type D1 = FT1#Data
  type D2 = FT2#Data
  type Data = (D1, D2)
  def read(in: Seq[String]) = {
    for {
      f <- schema._1.parse(in(0))
      s <- schema._2.parse(in(1))
    } yield (f, s)
  }
}
(Not sure it will actually work, but in theory this should typecheck.)
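If it does, the subclass should shrink to just the val, with every type member derived from it:

class Person extends Schema2 {
  // FT1 = schema._1.type, D1 = FT1#Data, etc. would all follow from this val
  val schema = (Name, Age)
}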
Source: https://stackoverflow.com/questions/24983882/why-doesnt-scala-infer-the-type-members-of-an-inherited-trait