SPARK/SQL: Spark can't resolve symbol toDF

旧时难觅i 2021-02-19 09:56

In my project, my external library is spark-assembly-1.3.1-hadoop2.6.0. When I press '.', the IDE suggests toDF(), but it then reports that it can't resolve the symbol toDF.

2 Answers
  • 2021-02-19 10:10

    To use toDF you first have to import sqlContext.implicits._:

    // Create an SQLContext and bring its implicit conversions into scope;
    // these provide the toDF method on RDDs of case classes / products
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._
    
    // The case class gives the DataFrame its column names and types
    case class Foobar(foo: String, bar: Integer)
    
    val foobarRdd = sc.parallelize(("foo", 1) :: ("bar", 2) :: ("baz", -1) :: Nil)
        .map { case (foo, bar) => Foobar(foo, bar) }
    
    val foobarDf = foobarRdd.toDF
    foobarDf.limit(1).show
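    For readers on a newer release: from Spark 2.0 onward, SparkSession replaces SQLContext as the entry point, and the implicits live on the session instance instead. A minimal sketch of the equivalent pattern, assuming a spark-shell session where the `spark` SparkSession is predefined (the `Foobar` case class is just the illustrative type from the answer above):

    ```scala
    // Spark 2.x+: import the implicits from the SparkSession instance
    // (in spark-shell, `spark` is created for you)
    import spark.implicits._

    case class Foobar(foo: String, bar: Integer)

    // toDF also works directly on local Seqs of case classes
    val foobarDf = Seq(Foobar("foo", 1), Foobar("bar", 2)).toDF
    foobarDf.show
    ```

    Note that `import spark.implicits._` imports from a *value* (the session), not a package, so the session must exist before the import.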
    
  • 2021-02-19 10:16

    This is a very late response to the question, but for the sake of people who are still looking for an answer:

    Try the same command on Spark 1.6; it will work.

    I was facing the same issue, searched Google without finding a solution, and then upgraded Spark from 1.5 to 1.6, which fixed it.

    If you don't know your Spark version:

    spark-submit --version    (from the command prompt)
    sc.version                (from the Scala shell)
    