Why do I get the error “Unable to find encoder for type stored in a Dataset” when encoding JSON using case classes?

借酒劲吻你 2020-12-09 04:04

I've written a Spark job:

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local") // string cut off in the original; "local" assumed
    // ... (rest of the snippet is cut off)
  }
}
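Since the snippet breaks off, here is a minimal sketch of the kind of job the question describes, reading JSON into a Dataset of a case class. The path /tmp/persons.json and the Person schema are taken from the answer below; everything else is an assumption:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Defined at the top level so Spark can derive an Encoder for it.
case class Person(age: Long, city: String)

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local")
    val spark = SparkSession.builder().config(conf).getOrCreate()

    // Exactly one implicits import supplies the Encoder derivation.
    import spark.implicits._

    val persons = spark.read.json("/tmp/persons.json").as[Person]
    persons.show()

    spark.stop()
  }
}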


        
3 Answers
  •  醉梦人生
    2020-12-09 04:42

    You get the same error if you import both sqlContext.implicits._ and spark.implicits._ in SimpleApp (the order does not matter).
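
    For illustration, this is the failing shape (a sketch: both objects expose the same encoder implicits, so the two wildcard imports make those names ambiguous, and the ambiguity surfaces as the "Unable to find encoder" error):

    val spark = SparkSession
      .builder()
      .getOrCreate()
    val sqlContext = spark.sqlContext

    import sqlContext.implicits._ // both wildcard imports in scope:
    import spark.implicits._      // the Encoder implicits become ambiguous

    case class Person(age: Long, city: String)
    val persons = spark.read.json("/tmp/persons.json").as[Person] // does not compile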

    Removing one or the other fixes it:

    val spark = SparkSession
      .builder()
      .getOrCreate()

    val sqlContext = spark.sqlContext
    import sqlContext.implicits._ // sqlContext OR spark implicits
    //import spark.implicits._   // sqlContext OR spark implicits

    case class Person(age: Long, city: String)
    val persons = spark.read.json("/tmp/persons.json").as[Person]
    

    Tested with Spark 2.1.0

    Curiously, if you import the implicits of the same object twice, there is no problem.
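
    Continuing the snippet above, a quick sketch of that last point: a duplicate wildcard import of the same object resolves to the same symbols, so nothing becomes ambiguous:

    import spark.implicits._
    import spark.implicits._ // duplicate import of the SAME object: no ambiguity

    val persons = spark.read.json("/tmp/persons.json").as[Person] // compiles fine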
