Question
I need to import an external MySQL database into Hive using Sqoop. My requirement is to import the complete database, with all of its tables, into a specified Hive database with a single sqoop import. For example, I want to import the MySQL database 'hadoop_practice', along with all its tables, into a Hive database named 'hadoop_practice'. However, when I run the following command
$ sqoop import-all-tables --connect jdbc:mysql://localhost/hadoop_practice --username root -P --hive-import
the tables are imported into Hive's default database. I have tried logging into the Hive shell and switching databases, but the import still lands in the default database. I know Sqoop can copy an individual external table into a specific Hive table, but that would mean issuing a separate command for each of 1000 tables. I still couldn't find an option that imports all the tables in one go. I have thought about writing a script, but is there any way to do this without one?
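For reference, the per-table workaround the question is trying to avoid would look roughly like the sketch below (hypothetical and untested; table names come from sqoop list-tables, and --password replaces the interactive -P prompt so the loop can run unattended):

#!/bin/sh
# Sketch only: import each MySQL table into the hadoop_practice Hive database, one at a time.
# Replace <mysql_password> with the real password.
for t in $(sqoop list-tables --connect jdbc:mysql://localhost/hadoop_practice \
    --username root --password '<mysql_password>'); do
  sqoop import --connect jdbc:mysql://localhost/hadoop_practice \
    --username root --password '<mysql_password>' \
    --table "$t" --hive-import --hive-table "hadoop_practice.$t"
done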
Answer 1:
You can specify the Hive database using the --hive-table option, so the command would be:
$ sqoop import-all-tables --connect jdbc:mysql://localhost/hadoop_practice --username root -P --hive-import --hive-table hadoop_practice.db
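Note that --hive-table takes a Hive table name (optionally database-qualified) and is normally used with a single-table import rather than import-all-tables. A hedged single-table example, where the table name 'employees' is only a placeholder:
$ sqoop import --connect jdbc:mysql://localhost/hadoop_practice --username root -P --table employees --hive-import --hive-table hadoop_practice.employees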
Answer 2:
You can use the --hive-database option.
It was made available in Sqoop version 1.4.4: https://issues.apache.org/jira/browse/SQOOP-912
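Assuming the same connection details as in the question, a complete invocation would look roughly like this. The target Hive database generally needs to exist already, since Sqoop does not create it, hence the CREATE DATABASE step:
$ hive -e "CREATE DATABASE IF NOT EXISTS hadoop_practice;"
$ sqoop import-all-tables --connect jdbc:mysql://localhost/hadoop_practice --username root -P --hive-import --hive-database hadoop_practice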
Answer 3:
$ sqoop import-all-tables --connect jdbc:mysql://localhost/hadoop_practice --username root -P --hive-import --hive-database database_name --create-hive-table
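As a quick check that the tables landed in the intended database, listing them from the target database should confirm the import (a small verification step, not part of the original answer):
$ hive -e "USE hadoop_practice; SHOW TABLES;"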
Source: https://stackoverflow.com/questions/27955290/how-do-i-import-rdbms-data-to-a-specific-hive-database-using-sqoop-import