Why does importing SparkSession in spark-shell fail with “object SparkSession is not a member of package org.apache.spark.sql”?


Question


I'm using Spark 1.6.0 on a Cloudera VM.

I'm trying to insert some data into a Hive table from the Spark shell. To do that, I am trying to use SparkSession, but the import below does not work:

scala> import org.apache.spark.sql.SparkSession
<console>:33: error: object SparkSession is not a member of package org.apache.spark.sql
         import org.apache.spark.sql.SparkSession

And without that, I cannot execute this statement:

val spark = SparkSession.builder.master("local[2]").enableHiveSupport().config("hive.exec.dynamic.partition","true").config("hive.exec.dynamic.partition.mode", "nonstrict").config("spark.sql.warehouse.dir", warehouseLocation).config("hive.metastore.warehouse.dir","/user/hive/warehouse").getOrCreate()
<console>:33: error: not found: value SparkSession
         val spark = SparkSession.builder.master("local[2]").enableHiveSupport().config("hive.exec.dynamic.partition","true").config("hive.exec.dynamic.partition.mode", "nonstrict").config("spark.sql.warehouse.dir", warehouseLocation).config("hive.metastore.warehouse.dir","/user/hive/warehouse").getOrCreate()

Can anyone tell me what mistake I am making here?


Answer 1:


SparkSession is only available as of Spark 2.0, so on Spark 1.6 you should use SQLContext instead (or upgrade your Spark to the latest and greatest 2.1.1).
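
You can check which version the shell is actually running before choosing an API; the pre-created SparkContext exposes it (on a Cloudera build the string may carry a distribution suffix such as -cdh5.x):

scala> sc.version
res0: String = 1.6.0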

Quoting the Spark 1.6.0 documentation, "Starting Point: SQLContext":

The entry point into all functionality in Spark SQL is the SQLContext class, or one of its descendants.

In addition to the basic SQLContext, you can also create a HiveContext, which provides a superset of the functionality provided by the basic SQLContext.
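
In Spark 1.6 terms, the builder chain from the question maps onto a HiveContext. In spark-shell on a Hive-enabled build (such as Cloudera's), the pre-created sqlContext is already a HiveContext, so only the configuration calls are needed. Below is a minimal sketch for a standalone program, assuming an illustrative app name; note that spark.sql.warehouse.dir only exists in Spark 2.x, so it is dropped here:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// In spark-shell, skip these two lines: sc already exists.
val conf = new SparkConf().setMaster("local[2]").setAppName("hive-demo")
val sc = new SparkContext(conf)

// HiveContext is the Spark 1.6 entry point with Hive support,
// the rough equivalent of SparkSession.builder.enableHiveSupport().
val sqlContext = new HiveContext(sc)

// The same settings the question passed via .config(...):
sqlContext.setConf("hive.exec.dynamic.partition", "true")
sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
sqlContext.setConf("hive.metastore.warehouse.dir", "/user/hive/warehouse")

With that in place, sqlContext.sql("INSERT INTO ... PARTITION (...) SELECT ...") can write into the Hive table much as the SparkSession-based code intended.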



Source: https://stackoverflow.com/questions/44772397/why-does-importing-sparksession-in-spark-shell-fail-with-object-sparksession-is
