Using Spark 2.0, I'm seeing that it is possible to turn a DataFrame of Rows into a Dataset of case classes. When I try to do so, I'm greeted with a message stating to import spark.implicits._. Attempting that import fails with:

There is no package called spark.implicits.
Here spark refers to the SparkSession. If you are inside the REPL, the session is already defined as spark, so you can just type:
import spark.implicits._
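With that import in scope, the implicit conversions become available at the prompt. As a quick illustration (the values here are made up), a local Seq can then be turned directly into a Dataset:

// toDS() is one of the conversions provided by spark.implicits._
val ds = Seq(1, 2, 3).toDS()
ds.show()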
If you have defined your own SparkSession somewhere in your code, then adjust it accordingly:
import org.apache.spark.sql.SparkSession

val mySpark = SparkSession
.builder()
.appName("Spark SQL basic example")
.config("spark.some.config.option", "some-value")
.getOrCreate()
// For implicit conversions like converting RDDs to DataFrames
import mySpark.implicits._
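With the implicits imported from your own session, the DataFrame-to-case-class conversion from the question is done with .as[...]. Below is a minimal, self-contained sketch; the Person case class, the column names, and the sample rows are assumptions made up for illustration, not something from the original question:

import org.apache.spark.sql.SparkSession

// Hypothetical case class; its field names and types must line up with the DataFrame's columns
case class Person(name: String, age: Long)

object ImplicitsExample {
  def main(args: Array[String]): Unit = {
    val mySpark = SparkSession
      .builder()
      .appName("Spark SQL basic example")
      .master("local[*]") // local master only so the sketch runs standalone
      .getOrCreate()

    // For implicit conversions like converting local Seqs and RDDs to DataFrames/Datasets
    import mySpark.implicits._

    // toDF on a local Seq of tuples is one of the implicit conversions
    val df = Seq(("Alice", 30L), ("Bob", 25L)).toDF("name", "age")

    // .as[Person] turns the DataFrame of Rows into a Dataset[Person]
    val people = df.as[Person]
    people.show()

    mySpark.stop()
  }
}

If the DataFrame's schema does not match the case class (a missing column or an incompatible type), the .as[Person] call fails at analysis time, so df.printSchema() is a quick sanity check before converting.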