How to enable or disable Hive support in spark-shell through Spark property (Spark 1.6)?


Question


Is there any configuration property we can set to explicitly disable/enable Hive support in spark-shell in Spark 1.6? I tried to list all the sqlContext configuration properties with:

sqlContext.getAllConfs.foreach(println)

But I am not sure which property is actually required to disable/enable Hive support. Or is there any other way to do this?


Answer 1:


Spark >= 2.0

Enabling and disabling Hive support is possible with the configuration property spark.sql.catalogImplementation.

Possible values for spark.sql.catalogImplementation are in-memory and hive.

See SPARK-16013: Add option to disable HiveContext in spark-shell/pyspark.
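
For example, Hive support can be switched off at launch by passing the property to spark-shell (a minimal sketch; spark is the session the shell creates for you):

spark-shell --conf spark.sql.catalogImplementation=in-memory

// Inside the shell, confirm which catalog implementation is active:
spark.conf.get("spark.sql.catalogImplementation")  // "in-memory"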


Spark < 2.0

Such a Spark property is not available in Spark 1.6.

One way to work around this is to remove the Hive-related jars, which in turn disables Hive support in Spark (Spark enables Hive support when the required Hive classes are available on the classpath).
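
To check which context the shell actually built in Spark 1.6, you can inspect its class (a minimal sketch):

// In spark-shell (Spark 1.6), check which context was created:
sqlContext.getClass.getName
// "org.apache.spark.sql.hive.HiveContext" -> Hive support is active
// "org.apache.spark.sql.SQLContext"       -> Hive support is disabled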




Answer 2:


You can enable Hive support just by creating a SparkSession, but only in Spark >= 2.0:

import org.apache.spark.sql.SparkSession

// Location for Spark's managed (warehouse) tables
val warehouseLocation = "spark-warehouse"

val spark = SparkSession
  .builder()
  .appName("Spark Hive Example")
  .config("spark.sql.warehouse.dir", warehouseLocation)
  .enableHiveSupport()  // sets spark.sql.catalogImplementation to "hive"
  .getOrCreate()
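
Once created this way, the session talks to the Hive metastore, so Hive tables become queryable (a sketch; the table name src is hypothetical):

// List databases registered in the Hive metastore
spark.sql("SHOW DATABASES").show()

// Query an existing Hive table
spark.sql("SELECT COUNT(*) FROM src").show()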

Here you can read how to configure Hive on Spark by changing Hive and Spark properties in hive-site.xml and spark-defaults.conf: https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started. It should work with Spark 1.6.1.



Source: https://stackoverflow.com/questions/45209771/how-to-enable-or-disable-hive-support-in-spark-shell-through-spark-property-spa
