Question
Hi, I'm saving my DataFrame as a Hive table using Spark SQL:
mydf.write().format("orc").saveAsTable("myTableName")
I can see that the table is getting created using:
hadoop fs -ls /apps/hive/warehouse/dbname.db
I can also see the data from spark-shell:
spark.sql("use dbname")
spark.sql("show tables").show(false)
But I'm not able to see the same tables from the Hive shell. I have placed my hive-site.xml file using:
sudo cp /etc/hive/conf.dist/hive-site.xml /etc/spark/conf/
but I still can't see them. Can someone please guide me on what else I need to do? I'm using HDP.
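One thing worth verifying (an assumption on my part, not confirmed in this thread): the hive-site.xml you copied into /etc/spark/conf/ must point Spark at the same metastore that the Hive shell uses, via the hive.metastore.uris property. If Spark falls back to a local metastore, tables it creates will never appear in Hive. A fragment to check for (host name is hypothetical):

```xml
<!-- /etc/spark/conf/hive-site.xml: this value must match the one in Hive's own config -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://metastore-host.example.com:9083</value>
</property>
```

If the property is missing or differs between the two configs, Spark and Hive are talking to different catalogs.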
Answer 1:
Your write statement is wrong; in Scala it should be:
mydf.write.format("orc").saveAsTable("myTableName")
Otherwise the compiler complains: org.apache.spark.sql.DataFrameWriter[org.apache.spark.sql.Row] does not take parameters (write is a method without parentheses in the Scala API).
Also, when writing, use "DBName.TableName" instead of just "TableName"; qualifying the table with its database is good practice and ensures it lands in the database you expect rather than the default one.
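Putting both suggestions together, here is a minimal sketch for spark-shell (the database name mydb and the sample DataFrame are hypothetical; spark-shell on HDP already provides a Hive-enabled SparkSession, but a standalone app would enable it explicitly):

```scala
import org.apache.spark.sql.SparkSession

// In a standalone app, build the session with Hive support so tables are
// registered in the Hive metastore rather than a local catalog.
val spark = SparkSession.builder()
  .appName("SaveAsHiveTable")
  .enableHiveSupport()
  .getOrCreate()

// Placeholder DataFrame standing in for the asker's mydf
val mydf = spark.range(10).toDF("id")

// `write` takes no parentheses in Scala; qualify the table with its database
mydf.write.format("orc").saveAsTable("mydb.myTableName")
```

This sketch assumes the metastore configuration is already correct; it only demonstrates the API fix and the database-qualified table name.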
See if that helps.
Source: https://stackoverflow.com/questions/61006627/hive-table-getting-created-but-not-able-to-see-using-hive-shell