Question
I am writing a method that takes an RDD and saves it as an Avro file. The problem is that if I use a specific type then I can call .toDF(), but I cannot call .toDF() on a generic RDD. Here is an example:
case class Person(name: String)
def f(x: RDD[Person]) = x.toDF()
def g[T](x: RDD[T]) = x.toDF()
f(p) //works
g(p) //fails!!
Does anyone know why I can't call .toDF() on a generic RDD, and whether there is any way around it?
Answer 1:
If you are using Spark 2, the following will work:
import org.apache.spark.sql.Encoder
def g[T: Encoder](x: RDD[T]) = x.toDF()
toDF is added by the implicit conversion
implicit def rddToDatasetHolder[T : Encoder](rdd: RDD[T]): DatasetHolder[T] = {
  DatasetHolder(_sqlContext.createDataset(rdd))
}
defined in org.apache.spark.sql.SQLImplicits. For that conversion to apply, your generic method's signature needs the same Encoder context bound.
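For completeness, here is a minimal, self-contained sketch of the Spark 2 version (assumes a local SparkSession; the object name and sample data are just for illustration):

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Encoder, SparkSession}

case class Person(name: String)

object GenericToDfExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.master("local[*]").appName("generic-toDF").getOrCreate()
    import spark.implicits._  // brings rddToDatasetHolder and the product encoders into scope

    // The Encoder context bound lets the implicit conversion resolve for any T
    def g[T: Encoder](x: RDD[T]) = x.toDF()

    val p: RDD[Person] = spark.sparkContext.parallelize(Seq(Person("Alice"), Person("Bob")))
    g(p).show()  // now compiles and prints the DataFrame

    spark.stop()
  }
}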
Answer 2:
import org.apache.spark.sql.Encoder
def g[T: Encoder](x: RDD[T]) = x.toDF()
is right, and you should call the method like this:
somefunc { rdd =>
  val spark = SparkSession.builder.config(rdd.sparkContext.getConf).getOrCreate()
  import spark.implicits._
  g(rdd)
}
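Tying this back to the original goal of saving a generic RDD as an Avro file, a hedged sketch (assumes Spark 2.4+ with the external spark-avro module on the classpath; saveAsAvro and the output path are hypothetical names):

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Encoder, SparkSession}

object AvroWriter {
  // Convert any RDD[T] that has an Encoder to a DataFrame and write it out as Avro
  def saveAsAvro[T: Encoder](rdd: RDD[T], path: String): Unit = {
    val spark = SparkSession.builder.config(rdd.sparkContext.getConf).getOrCreate()
    import spark.implicits._
    rdd.toDF().write.format("avro").save(path)  // "avro" format requires the spark-avro package (Spark 2.4+)
  }
}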
Source: https://stackoverflow.com/questions/45517200/how-to-convert-generic-rdd-to-dataframe