In my project, my external library is spark-assembly-1.3.1-hadoop2.6.0. If I press '.', the IDE suggests toDF(), but it then reports that toDF() cannot be resolved.
To be able to use toDF, you have to import sqlContext.implicits._ first:
// Create an SQLContext and bring the implicit conversions (including toDF) into scope
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

case class Foobar(foo: String, bar: Integer)

// Build an RDD of case-class instances, then convert it to a DataFrame
val foobarRdd = sc.parallelize(("foo", 1) :: ("bar", 2) :: ("baz", -1) :: Nil).
  map { case (foo, bar) => Foobar(foo, bar) }
val foobarDf = foobarRdd.toDF
foobarDf.limit(1).show
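For readers on a newer Spark: from Spark 2.x onward, SQLContext is superseded by SparkSession, and the implicits live on the session instance instead. A minimal sketch of the same example, assuming a local run (the appName and master settings are illustrative):

```scala
import org.apache.spark.sql.SparkSession

// In Spark 2.x+, build a SparkSession instead of an SQLContext
val spark = SparkSession.builder()
  .appName("toDF-example")  // assumption: name chosen for illustration
  .master("local[*]")       // assumption: local run for illustration
  .getOrCreate()

// The implicits (including toDF) are imported from the session instance
import spark.implicits._

case class Foobar(foo: String, bar: Integer)

val foobarDf = Seq(Foobar("foo", 1), Foobar("bar", 2)).toDF
foobarDf.show()
```

The import is on the `spark` value, not the class, so it must come after the session is created.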
This is a very late response to the question, but for the sake of people who are still looking for the answer: try the same command on Spark 1.6 and it will work.
I was facing the same issue, searched on Google without finding a solution, then upgraded Spark from 1.5 to 1.6 and it worked.
If you don't know your Spark version:
spark-submit --version (from the command prompt)
sc.version (from the Scala shell)