Why does importing SparkSession in spark-shell fail with “object SparkSession is not a member of package org.apache.spark.sql”?
Question: I am using Spark 1.6.0 on a Cloudera VM. I am trying to insert some data into a Hive table from the Spark shell, and to do that I want to use SparkSession. But the import below fails:

```scala
scala> import org.apache.spark.sql.SparkSession
<console>:33: error: object SparkSession is not a member of package org.apache.spark.sql
       import org.apache.spark.sql.SparkSession
```

Without it, I cannot execute this statement:

```scala
val spark = SparkSession.builder.master("local[2]").enableHiveSupport().getOrCreate()
```
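A note on the likely cause, given the version stated above: `SparkSession` was only introduced in Spark 2.0, so it does not exist in the Spark 1.6.0 that ships with that Cloudera VM, and the import fails regardless of configuration. On Spark 1.6 the entry point for Hive access is `HiveContext`. A minimal spark-shell sketch under that assumption (the table name `my_table` is hypothetical, and `sc` is the SparkContext that spark-shell provides):

```scala
// Spark 1.6.x: HiveContext plays the role that SparkSession.enableHiveSupport()
// plays in Spark 2.x
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)  // sc is predefined in spark-shell

// query an existing Hive table (my_table is a hypothetical name)
val df = hiveContext.sql("SELECT * FROM my_table")

// append the rows back into a Hive table
df.write.mode("append").saveAsTable("my_table")
```

Upgrading to Spark 2.x is the other route: there `SparkSession.builder...enableHiveSupport().getOrCreate()` works as written in the question.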