Question
From Spark, using:
DataFrame.write().mode(SaveMode.Ignore).format("orc").saveAsTable("myTableName")
the table is getting saved. I can see it with the command hadoop fs -ls /apps/hive/warehouse/test.db, where test is my database name:
drwxr-xr-x - psudhir hdfs 0 2016-01-04 05:02 /apps/hive/warehouse/test.db/myTableName
But when I try to check the tables in Hive, I cannot see them, either with the SHOW TABLES command or from hiveContext.
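For reference, a minimal, self-contained sketch of the write path described in the question (the example DataFrame, the app name, and the USE test statement are illustrative assumptions, not part of the original post):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SaveMode
import org.apache.spark.sql.hive.HiveContext

object SaveToHiveTable {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SaveToHiveTable"))
    val hiveContext = new HiveContext(sc)

    // Placeholder DataFrame; in the question it comes from elsewhere.
    val df = hiveContext.createDataFrame(Seq((1, "a"), (2, "b"))).toDF("id", "value")

    // Write into the "test" database as an ORC-backed table.
    // SaveMode.Ignore is a no-op if the table already exists.
    hiveContext.sql("USE test")
    df.write.mode(SaveMode.Ignore).format("orc").saveAsTable("myTableName")
  }
}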
Answer 1:
sudo cp /etc/hive/conf.dist/hive-site.xml /etc/spark/conf/
This worked for me on the Cloudera QuickStart VM in VirtualBox.
Answer 2:
You have to copy the hive-site.xml file (mine is located at /etc/hive/conf.dist/hive-site.xml) to the Spark conf folder (mine is at /etc/spark/conf/):
sudo cp /etc/hive/conf.dist/hive-site.xml /etc/spark/conf/
Restart Spark and it should work.
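Once the file is copied and Spark is restarted, a quick sanity check can confirm that the HiveContext now talks to the shared Hive metastore instead of a local Derby one (the database name test is taken from the question; this is only a sketch, not part of the original answer):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object VerifyMetastore {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("VerifyMetastore"))
    val hiveContext = new HiveContext(sc)

    // Tables created with saveAsTable should now show up here
    // as well as in SHOW TABLES from the Hive CLI.
    hiveContext.sql("SHOW TABLES IN test").show()
  }
}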
Answer 3:
I think you need to run INVALIDATE METADATA; in the Hive console to refresh the databases and see your new table.
Source: https://stackoverflow.com/questions/34591704/hive-tables-are-created-from-spark-but-are-not-visible-in-hive