Question
Stack: HDP-2.3.2.0-2950 installed using Ambari 2.1. The steps I am following:
- Load SQL Server tables onto HDFS using Sqoop
- Create EXTERNAL tables in Hive
I didn't use any charset/Unicode/UTF-8 options while executing the Sqoop import commands, and the import was successful.
While creating the Hive external tables, I was wondering which data type to choose for the columns that are nvarchar in the original SQL Server table, and now I am worried that this also needs to be addressed on the Sqoop side at import time.
- I couldn't find any relevant charset/nvarchar options for the Sqoop import.
- In Hive, can varchar/string blindly be used in place of nvarchar? (An illustrative DDL sketch follows below.)
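
For illustration only, the kind of Hive DDL in question would look roughly like this; the table name, column names, delimiter, and HDFS path are placeholders, not the actual schema:

CREATE EXTERNAL TABLE customers (   -- hypothetical table name
  id   INT,
  name STRING,                      -- was nvarchar(100) in SQL Server
  city STRING                       -- was nvarchar(50) in SQL Server
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/user/hdfs/customers';    -- placeholder HDFS directory

Hive has no separate national-character type; STRING (or VARCHAR(n)) holds Unicode text, so the open question is really whether anything is lost on the Sqoop side during import.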
Answer 1:
The nvarchar type is not understood by Sqoop, so cast it to varchar in the import query. Give the cast an explicit length: in SQL Server, CAST AS varchar without a length defaults to 30 characters and silently truncates longer values. For example:
select
    CAST(col1 AS varchar(4000)) AS col1,
    col2,
    col3,
    col4
from table_name
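
For context, here is a sketch of how such a query could be plugged into the Sqoop import; the JDBC URL, credentials, split column, and target directory are assumptions for illustration, not values from the original post:

sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=mydb" \
  --username myuser \
  --password-file /user/hdfs/.db_password \
  --query "SELECT CAST(col1 AS varchar(4000)) AS col1, col2, col3, col4 FROM table_name WHERE \$CONDITIONS" \
  --split-by col2 \
  --target-dir /user/hdfs/table_name \
  -m 4

With a free-form --query, Sqoop requires the $CONDITIONS token in the WHERE clause and an explicit --target-dir, and --split-by is needed whenever more than one mapper is used.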
Source: https://stackoverflow.com/questions/37033391/how-to-load-and-store-nvarchar