Spark 2.0 missing spark implicits

夕颜 2020-12-25 12:27

Using Spark 2.0, I'm seeing that it is possible to turn a DataFrame of Rows into a Dataset of case classes. When I try to do so, I'm greeted with a message stating to import spark.implicits._.

2 Answers
  • 2020-12-25 13:01

    Spark uses the identifier spark for the SparkSession instance; this is what causes the confusion. If you created the session yourself with something like:

    val ss = SparkSession
      .builder()
      .appName("test")
      .master("local[2]")
      .getOrCreate()
    

    then the correct way to import the implicits would be:

    import ss.implicits._
    

    Let me know if this helps. Cheers.
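    One subtlety worth adding here: in Scala, implicits can only be imported from a stable identifier, so the session must be bound to a val (importing from a var will not compile). A minimal sketch, assuming Spark 2.x is on the classpath:

    ```scala
    import org.apache.spark.sql.SparkSession

    val ss = SparkSession
      .builder()
      .appName("test")
      .master("local[2]")
      .getOrCreate()

    // Compiles: ss is a val, i.e. a stable identifier
    import ss.implicits._

    // var ss2 = ss
    // import ss2.implicits._   // would NOT compile: ss2 is not a stable identifier
    ```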

  • 2020-12-25 13:10

    There is no package called spark.implicits.

    The spark here refers to a SparkSession instance. If you are inside the REPL, the session is already defined as spark, so you can just type:

    import spark.implicits._
    

    If you have defined your own SparkSession somewhere in your code, then adjust it accordingly:

    val mySpark = SparkSession
      .builder()
      .appName("Spark SQL basic example")
      .config("spark.some.config.option", "some-value")
      .getOrCreate()
    
    // For implicit conversions like converting RDDs to DataFrames
    import mySpark.implicits._
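    
    To connect this back to the original question, here is a hedged sketch (the Person case class and the sample values are illustrative, assuming Spark 2.x is on the classpath): once the implicits are in scope, a DataFrame of Rows can be converted to a Dataset of a case class with .as[T]:

    ```scala
    import org.apache.spark.sql.SparkSession

    // Illustrative case class; define it at top level, outside any method,
    // so Spark can derive an Encoder for it
    case class Person(name: String, age: Long)

    val mySpark = SparkSession
      .builder()
      .appName("Spark SQL basic example")
      .master("local[2]")
      .getOrCreate()

    import mySpark.implicits._

    // Build a DataFrame of Rows, then convert it to a Dataset[Person]
    val df = Seq(("Ann", 30L), ("Bob", 25L)).toDF("name", "age")
    val people = df.as[Person]
    people.show()

    mySpark.stop()
    ```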
    