Spark 2.0 missing spark implicits

夕颜 2020-12-25 12:27

Using Spark 2.0, I'm seeing that it is possible to turn a DataFrame of Rows into a Dataset of case classes. When I try to do so, I'm greeted with a message stating to import spark.implicits._.

2 Answers
  •  隐瞒了意图╮
    2020-12-25 13:10

    There is no package called spark.implicits.

    The spark here refers to a SparkSession instance. If you are inside the REPL, the session is already defined as spark, so you can just type:

    import spark.implicits._
    

    If you have defined your own SparkSession somewhere in your code, then adjust it accordingly:

    val mySpark = SparkSession
      .builder()
      .appName("Spark SQL basic example")
      .config("spark.some.config.option", "some-value")
      .getOrCreate()
    
    // For implicit conversions like converting RDDs to DataFrames
    import mySpark.implicits._
    

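Putting the pieces together, here is a minimal sketch of what the asker is trying to do: build a session, import its implicits, and convert a DataFrame into a Dataset of a case class. The Person case class and the sample data are hypothetical, and local[*] assumes you are running Spark locally.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical case class for illustration; define it at the top level
// (outside the method) so Spark can derive an Encoder for it.
case class Person(name: String, age: Long)

object ImplicitsExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("implicits example")
      .master("local[*]") // assumes a local run; adjust for a cluster
      .getOrCreate()

    // Without this import, .toDF and .as[Person] will not compile.
    import spark.implicits._

    val df = Seq(("Alice", 29L), ("Bob", 31L)).toDF("name", "age")
    val people = df.as[Person] // Dataset[Person]
    people.show()

    spark.stop()
  }
}
```

Note that the import is on the session value (spark or mySpark), not on a package, which is why import spark.implicits._ only compiles after the session variable is in scope.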