Question
Is there any configuration property we can set to explicitly disable/enable Hive support through spark-shell in Spark 1.6? I tried to list all the SQLContext configuration properties with,
sqlContext.getAllConfs.foreach(println)
But I am not sure which property is actually required to disable/enable Hive support. Or is there any other way to do this?
Answer 1:
Spark >= 2.0
Enabling and disabling the Hive context is possible with the config spark.sql.catalogImplementation.
Possible values for spark.sql.catalogImplementation are in-memory and hive.
See SPARK-16013 Add option to disable HiveContext in spark-shell/pyspark.
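For example (a minimal sketch, assuming a Spark 2.x spark-shell; the two-argument spark.conf.get is only used to supply a fallback when the key is unset), Hive support can be switched off at launch and checked from inside the shell:
spark-shell --conf spark.sql.catalogImplementation=in-memory
// inside the shell, check which catalog implementation is active
spark.conf.get("spark.sql.catalogImplementation", "<unset>")  // "in-memory" or "hive"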
Spark < 2.0
Such a Spark property is not available in Spark 1.6.
One way to work around it is to remove the Hive-related jars, which in turn disables Hive support in Spark (Spark enables Hive support when the required Hive classes are available).
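If the goal is only to avoid Hive in your own Spark 1.6 application code (rather than in spark-shell itself), a plain SQLContext can be constructed instead of a HiveContext; a minimal sketch, assuming the application creates its own contexts:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf().setAppName("PlainSQLContextExample")
val sc = new SparkContext(conf)

// SQLContext does not talk to the Hive metastore;
// org.apache.spark.sql.hive.HiveContext is the Hive-backed variant
val sqlContext = new SQLContext(sc)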
Answer 2:
You can enable Hive support just by creating a Spark session, but only in Spark >= 2.0:
import org.apache.spark.sql.SparkSession

// warehouseLocation is assumed to be a path string defined elsewhere,
// i.e. the directory to use for spark.sql.warehouse.dir
val spark = SparkSession
  .builder()
  .appName("Spark Hive Example")
  .config("spark.sql.warehouse.dir", warehouseLocation)
  .enableHiveSupport()
  .getOrCreate()
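Once the session is created with Hive support, tables in the configured metastore can be queried through it, for example:
// list the tables known to the catalog
spark.sql("SHOW TABLES").show()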
And here you can read how to configure Hive on Spark by changing Hive and Spark properties in hive-site.xml and spark-defaults.conf: https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started - it should work with Spark 1.6.1 as well.
Source: https://stackoverflow.com/questions/45209771/how-to-enable-or-disable-hive-support-in-spark-shell-through-spark-property-spa