hive.HiveImport: FAILED: SemanticException [Error 10072]: Database does not exist:

Posted by 佐手、 on 2019-12-09 04:10:30

Finally, I found the answer myself while reading the forum discussion about the same issue here.

The issue was with the Hive Metastore configuration. There are three types of Hive Metastore deployments:

  1. Embedded Metastore (the default deployment mode).
  2. Local Metastore.
  3. Remote Metastore.

My Hive Metastore configuration was the default one. As described in the Cloudera documentation on Configuring the Hive Metastore, I changed the metastore configuration from Embedded (the default) to Remote Metastore, and it started working for me.
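
For reference, what distinguishes a remote metastore is the hive.metastore.uris property in hive-site.xml, which points clients at a standalone metastore service over Thrift. A minimal sketch of how to check it, assuming the standard config path and the default Thrift port 9083 (the hostname below is a placeholder, not from my setup):

# The remote-metastore property in hive-site.xml looks like this:
#   <property>
#     <name>hive.metastore.uris</name>
#     <value>thrift://metastore-host.example.com:9083</value>
#   </property>
# Quick check of what your installation currently uses:
grep -A1 hive.metastore.uris /etc/hive/conf/hive-site.xml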

For more information on metastore configuration, see the following Cloudera documentation:

Configuring the Hive Metastore

Sqoop Command

sqoop-import-all-tables --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --hive-database edgeowt --hive-overwrite -m 4

sqoop-import-all-tables --verbose --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --warehouse-dir /user/hive/warehouse --hive-database edgeowt.db --hive-overwrite

Change --hive-database edgeowt.db to --hive-table edgeowt.db, where edgeowt is your Hive database name and db is your Hive table name.

sqoop-import-all-tables --verbose --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --warehouse-dir /user/hive/warehouse --hive-table edgeowt.db  --hive-overwrite

NOTE: There is no --hive-database option in Sqoop 1.4.5. Please refer to Table 8 (Hive arguments) in Section 7.2.11 of http://sqoop.apache.org/docs/1.4.5/SqoopUserGuide.html
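
If in doubt, you can confirm which Hive arguments your installed Sqoop version actually supports from its built-in help; a quick check (not part of the original answer):

sqoop help import 2>&1 | grep -i -- '--hive'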

--ALTERNATE METHOD--

If sqoop import-all-tables fails, try the following steps:

1. Create a folder named hivetables in /usr/local (on the local filesystem) and change its permissions: sudo chmod -R 777 /usr/local/hivetables.

2. Create a shell script named sqoop-hive.sh in /usr/local/hivetables and change its permissions: sudo chmod 777 /usr/local/hivetables/sqoop-hive.sh.

3. Paste the following into sqoop-hive.sh:

#!/bin/sh
# Usage: sqoop-hive.sh MYSQLUSERNAME MYSQLPASSWORD MYSQLDATABASE HIVEDATABASE
# Dump the table names of the given MySQL database ($3) into a text file.
mysql -u "$1" -p"$2" -N information_schema -e "select table_name from tables where table_schema = '$3'" > /usr/local/hivetables/tables.txt
TABLEFILE="/usr/local/hivetables/tables.txt"
# Run a separate Sqoop import for each table in the list.
for table in $(cat "$TABLEFILE")
do
   "$SQOOP_HOME"/bin/sqoop import --connect "jdbc:mysql://localhost:3306/$3" --table "$table" --username "$1" --password "$2" --hive-import --hive-table "$4.$table" --warehouse-dir /user/hive/warehouse
done

4. Execute the shell script: sh /usr/local/hivetables/sqoop-hive.sh MYSQLUSERNAME MYSQLPASSWORD MYSQLDATABASE HIVEDATABASE

NOTE: The mysql command exports the table names of the MySQL database into a text file; the for loop then reads that file and runs a separate Sqoop import for each table name.
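
Once the script finishes, a quick way to confirm the tables landed in Hive is to list them from the Hive CLI (HIVEDATABASE stands for the same argument as in step 4):

hive -e "USE HIVEDATABASE; SHOW TABLES;"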

Did you export HIVE_CONF_DIR in hive-env.sh?

export HIVE_CONF_DIR="/etc/hive/conf"   # your Hive configuration path
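
If you are unsure whether the variable actually takes effect, a simple check is to look for the line and source the file (the /etc/hive/conf path is an assumption; adjust it to your installation):

grep HIVE_CONF_DIR /etc/hive/conf/hive-env.sh
. /etc/hive/conf/hive-env.sh && echo "$HIVE_CONF_DIR"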

I had the exact same error on the Cloudera QuickStart VM, using parcel mode.

I copied hive-site.xml to the Sqoop conf directory:

sudo cp /etc/hive/conf/hive-site.xml /etc/sqoop/conf/

That trick solved the problem.
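
To verify that Sqoop now reads the same metastore settings as Hive, you can diff the two copies; a simple sanity check, not from the original answer:

diff /etc/hive/conf/hive-site.xml /etc/sqoop/conf/hive-site.xml && echo "Sqoop and Hive configs match"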
