Apache Spark 2.3.1 with Hive metastore 3.1.0

Submitted by 房东的猫 on 2019-12-05 14:41:15

This looks like a Spark feature that has not been implemented yet. The only way I have found to use Spark with Hive 3.0+ is the HiveWarehouseConnector from Hortonworks. Documentation here, and a good guide from the Hortonworks Community here. I will leave the question unanswered until the Spark developers have prepared their own solution.
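As a rough sketch only: the assembly jar path, version, and both URLs below are placeholders you would substitute with your cluster's values, not something taken from this answer. Launching the spark-shell with the HiveWarehouseConnector typically looks something like this:

```shell
# All paths, hostnames, and ports are illustrative -- replace with your cluster's values.
spark-shell \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://your-hs2-host:10000/" \
  --conf spark.datasource.hive.warehouse.metastore.uri="thrift://your-metastore-host:9083"
```

Inside the shell you would then build a session via the connector's API (e.g. `com.hortonworks.hwc.HiveWarehouseSession.session(spark).build()`) rather than using `spark.sql` directly; check the Hortonworks documentation linked above for the exact property names on your HDP version, as they have changed between releases.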

I've got a bit of a throwback trick for this one, although, disclaimer: it bypasses Ranger permissions (don't blame me if you incur the wrath of an admin).

To use with the spark-shell:

export HIVE_CONF_DIR=/usr/hdp/current/hive-client/conf
spark-shell --conf "spark.driver.extraClassPath=/usr/hdp/current/hive-client/conf"
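As a quick sanity check that the shell is actually talking to the Hive metastore (this is my own addition, assuming the same HDP paths as above; spark-shell will execute statements piped to its stdin):

```shell
# Point Spark at the Hive client configuration, then run a catalog query
# non-interactively by piping it into the Scala REPL.
export HIVE_CONF_DIR=/usr/hdp/current/hive-client/conf
echo 'spark.sql("show databases").show()' | \
  spark-shell --conf "spark.driver.extraClassPath=/usr/hdp/current/hive-client/conf"
```

If the configuration is picked up, the listed databases should match what you see in beeline rather than an empty local catalog.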

To use with sparklyr:

library(sparklyr)  # assumes sparklyr is installed

Sys.setenv(HIVE_CONF_DIR = "/usr/hdp/current/hive-client/conf")
conf = spark_config()
conf$'sparklyr.shell.driver-class-path' = '/usr/hdp/current/hive-client/conf'
sc = spark_connect(master = "yarn", config = conf)  # adjust master for your cluster

It should work for the Thrift server too, but I have not tested it.
