I've written a Spark job:
object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local[*]") // "local[*]" assumed; the value is truncated in the original
    // ...
  }
}
You get the same error if you import both sqlContext.implicits._ and spark.implicits._ in SimpleApp (the order doesn't matter). Removing one or the other is the solution:
import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .getOrCreate()
val sqlContext = spark.sqlContext

import sqlContext.implicits._ // sqlContext OR spark implicits: import one...
//import spark.implicits._    // ...or the other, but not both

case class Person(age: Long, city: String)
val persons = spark.read.json("/tmp/persons.json").as[Person]
Tested with Spark 2.1.0
The funny thing is that if you import the same implicits object twice, you will not have any problems.
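For instance, this duplicate import compiles fine (a minimal sketch, reusing the sqlContext value from above):

import sqlContext.implicits._
import sqlContext.implicits._ // same object imported twice: identical implicits, so no ambiguity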
The error message says that the Encoder is not able to handle the Person case class:
Error:(15, 67) Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing sqlContext.implicits._ Support for serializing other types will be added in future releases.
Move the declaration of the case class outside the scope of SimpleApp.
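A minimal sketch of that fix, keeping the names from the question:

case class Person(age: Long, city: String) // top level: Spark can now derive an Encoder for it

object SimpleApp {
  def main(args: Array[String]) {
    // ... same body as before ...
  }
}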
@Milad Khajavi: Define the Person case class outside object SimpleApp. Also, add import sqlContext.implicits._ inside the main() function.
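Putting both suggestions together, a minimal sketch of a working job (the local[*] master and the show() call are illustrative additions):

import org.apache.spark.sql.SparkSession

// Declared outside SimpleApp so Spark can derive an Encoder for it
case class Person(age: Long, city: String)

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Simple Application")
      .master("local[*]") // assumption: local mode for this sketch
      .getOrCreate()

    import spark.implicits._ // inside main(), after the session exists

    val persons = spark.read.json("/tmp/persons.json").as[Person]
    persons.show()

    spark.stop()
  }
}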