hive.HiveImport: FAILED: SemanticException [Error 10072]: Database does not exist:

Posted by China☆狼群 on 2019-12-08 08:15:56

Question


I am trying to import a MySQL database into Hive in order to analyze large amounts of MySQL data. According to a blog post, there are a couple of ways to do this:

  1. Non realtime: Sqoop
  2. Realtime: Hadoop Applier for MySQL

so I decided to go with the 'Non realtime' approach. I have set up a Hadoop cluster with 4 nodes, plus Sqoop and Hive, all working fine with the following versions:

Name           Version
Apache Hadoop  2.6.0
Apache Hive    hive-0.14.0
Apache Sqoop   sqoop-1.4.5.bin__hadoop-2.0.4-alpha

Now, when I try to import data using the following command

Import Command

sqoop-import-all-tables --verbose --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --warehouse-dir /user/hive/warehouse --hive-database edgeowt.db  --hive-overwrite

I get the following error

Error

INFO hive.HiveImport: FAILED: SemanticException [Error 10072]: Database does not exist: edgeowt.db
15/04/16 13:32:09 ERROR tool.ImportAllTablesTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 88

I logged in to HiveServer2 via Beeline and checked the databases, and I can see the given database:

$HIVE_HOME/bin>beeline
beeline> !connect jdbc:hive2://localhost:10000 scott tiger org.apache.hive.jdbc.HiveDriver
0: jdbc:hive2://localhost:10000> show databases;
+----------------+--+
| database_name  |
+----------------+--+
| default        |
| edgeowt        |
+----------------+--+

After looking into the HDFS file system web interface, I realized that the owner of the DB is different:

Permission  Owner   Group       Size  Replication  Block Size  Name
drwxr-xr-x  hduser  supergroup  0 B   0            0 B         candidate
drwxr-xr-x  scott   supergroup  0 B   0            0 B         edgeowt.db
drwxr-xr-x  scott   supergroup  0 B   0            0 B         hd_temperature
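(For reference, the same ownership listing can be obtained from the command line; this assumes the default warehouse location /user/hive/warehouse:)

# List the warehouse directories together with their owner and group
hdfs dfs -ls /user/hive/warehouse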

since I am trying to import the data as hduser while the database was created by the scott user. I tried to grant ALL privileges to hduser on edgeowt using the following command

0: jdbc:hive2://localhost:10000> GRANT ALL ON DATABASE edgeowt TO USER hduser;

and checked with

0: jdbc:hive2://localhost:10000> SHOW GRANT ON DATABASE edgeowt;
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+
| database  | table  | partition  | column  | principal_name  | principal_type  | privilege  | grant_option  |   grant_time   | grantor  |
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+
| edgeowt   |        |            |         | admin           | ROLE            | ALL        | false         | 1429170366000  | scott    |
| edgeowt   |        |            |         | hduser          | USER            | ALL        | false         | 1429170906000  | scott    |
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+

but I am still unable to resolve the error. So how can I solve this problem? Any pointers would be helpful.

~/.bashrc

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
# Set Hadoop-related environment variables
export HADOOP_INSTALL=/opt/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_HOME=$HADOOP_INSTALL

# Set hive home 
export HIVE_HOME=/opt/hive
export PATH=$PATH:$HIVE_HOME/bin

# Set HCatlog home 
export HCAT_HOME=$HIVE_HOME/hcatalog
export PATH=$PATH:$HCAT_HOME/bin

# Set Sqoop home
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
export SQOOP_HOME=/opt/sqoop
export SQOOP_CONF_DIR="$SQOOP_HOME/conf"
export SQOOP_CLASSPATH="$SQOOP_CONF_DIR"
export PATH=$PATH:$SQOOP_HOME/bin

EDIT

I tried the following command, but I still get the same error:

sqoop-import-all-tables --verbose --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --warehouse-dir /user/hive/warehouse --hive-database edgeowt --hive-overwrite

Answer 1:


I finally found the answer myself while reading a forum discussion about the same issue here.

The issue was with the Hive metastore configuration. There are three types of Hive metastore deployment:

  1. Embedded metastore (the default deployment mode).
  2. Local metastore.
  3. Remote metastore.

My Hive metastore configuration was the default (embedded) one, in which each Hive client keeps its own metastore, so a database created through HiveServer2 is not visible to the Hive session that Sqoop launches. As mentioned in the Cloudera documentation on Configuring the Hive Metastore, I changed the metastore configuration from embedded (the default) to a remote metastore, and it started working for me.
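For illustration, here is a minimal sketch of switching to a remote metastore; the host name metastore-host is a placeholder, and 9083 is the usual default Thrift port:

# Start a standalone Hive metastore service (listens on port 9083 by default)
$HIVE_HOME/bin/hive --service metastore &

# Clients (Hive CLI, HiveServer2, Sqoop's Hive import) then locate it through
# the hive.metastore.uris property in hive-site.xml, for example:
#   <property>
#     <name>hive.metastore.uris</name>
#     <value>thrift://metastore-host:9083</value>
#   </property>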

For more information on metastore configuration, see the following Cloudera documentation:

Configuring the Hive Metastore

Sqoop Command

sqoop-import-all-tables --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --hive-database edgeowt --hive-overwrite -m 4



Answer 2:


sqoop-import-all-tables --verbose --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --warehouse-dir /user/hive/warehouse --hive-database edgeowt.db --hive-overwrite

Change --hive-database edgeowt.db to --hive-table edgeowt.db, where edgeowt is your Hive database name and db is your Hive table name.

sqoop-import-all-tables --verbose --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --warehouse-dir /user/hive/warehouse --hive-table edgeowt.db  --hive-overwrite

NOTE: There is no --hive-database option in Sqoop 1.4.5. Please refer to Table 8 (Hive arguments) in Section 7.2.11 of http://sqoop.apache.org/docs/1.4.5/SqoopUserGuide.html

--ALTERNATE METHOD--

If sqoop import-all-tables fails, try the following steps:

1. Create a folder named hivetables in /usr/local (on the local filesystem) and change its permissions: sudo chmod -R 777 /usr/local/hivetables.

2. Create a shell script named sqoop-hive.sh in /usr/local/hivetables and change its permissions: sudo chmod -R 777 /usr/local/hivetables/sqoop-hive.sh.

3. Paste this into the sqoop-hive.sh file:

#!/bin/sh
# Usage: sqoop-hive.sh MYSQLUSERNAME MYSQLPASSWORD MYSQLDATABASE HIVEDATABASE
# Export the list of table names in the given MySQL database to a text file.
mysql -u "$1" -p"$2" -N information_schema -e "select table_name from tables where table_schema = '$3'" > /usr/local/hivetables/tables.txt
HOSTFILE="/usr/local/hivetables/tables.txt"
# Run a Sqoop import for each table name read from the file.
for host in $(cat "$HOSTFILE")
do
   "$SQOOP_HOME/bin/sqoop" import --connect "jdbc:mysql://localhost:3306/$3" --table "$host" --username "$1" --password "$2" --hive-import --hive-table "$4.$host" --warehouse-dir /user/hive/warehouse
done

4. Execute the shell script: sh /usr/local/hivetables/sqoop-hive.sh MYSQLUSERNAME MYSQLPASSWORD MYSQLDATABASE HIVEDATABASE

NOTE: The mysql command exports the table names in the MySQL database into a text file; the for loop then reads that file and executes a Sqoop import command for each table name.




Answer 3:


Did you export HIVE_CONF_DIR in hive-env.sh?

export HIVE_CONF_DIR="/etc/hive/conf"   # your Hive configuration path




Answer 4:


I had the exact same error on the Cloudera QuickStart VM, using parcel mode.

I copied hive-site.xml to the Sqoop conf directory:

sudo cp /etc/hive/conf/hive-site.xml /etc/sqoop/conf/

That trick solved the problem, presumably because the Hive session that Sqoop launches otherwise starts without the metastore settings and so cannot see the edgeowt database.



Source: https://stackoverflow.com/questions/29670830/hive-hiveimport-failed-semanticexception-error-10072-database-does-not-exis
