Sqoop function '--map-column-hive' being ignored


Question


I am trying to import a file into Hive as Parquet, and the --map-column-hive column_name=timestamp option is being ignored. The column 'column_name' is originally of type datetime in SQL Server, and Sqoop converts it into bigint in Parquet. I want to convert it to timestamp format through Sqoop, but it is not working.

sqoop import \
--table table_name \
--driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
--connect jdbc:sqlserver://servername \
--username user --password pw \
--map-column-hive column_name=timestamp \
--as-parquetfile \
--hive-import \
--hive-table table_name -m 1

When I view the table in Hive, it still shows the column with its original data type.

I tried column_name=string and that did not work either.

I think this may be an issue with converting files to Parquet, but I am not sure. Does anyone have a solution to fix this?

I get no errors when running the command; it just completes the import as if the option did not exist.
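
As an aside, a related option sometimes tried for this problem is --map-column-java, which overrides the SQL-to-Java type mapping during import (as opposed to --map-column-hive, which only affects the generated Hive DDL). A minimal sketch, reusing the question's placeholder names (servername, table_name, column_name); whether it helps depends on the Sqoop version in use:

# Sketch only, not from the original post: force the Java-side type to
# String so the datetime lands in Parquet as readable text rather than
# as a bigint. All names below are the question's placeholders.
sqoop import \
--table table_name \
--driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
--connect jdbc:sqlserver://servername \
--username user --password pw \
--map-column-java column_name=String \
--as-parquetfile \
--hive-import \
--hive-table table_name -m 1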


Answer 1:


Before Hive 1.2, timestamp support in ParquetSerDe is not available; in 1.1.0 only the binary data type is supported.

Please upgrade your Hive version to 1.2 or later and it should work.

Please check the issue log and release notes linked below.

https://issues.apache.org/jira/browse/HIVE-6384

https://issues.apache.org/jira/secure/ReleaseNote.jspa?version=12329345&styleName=Text&projectId=12310843
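
If upgrading Hive is not immediately possible, a common workaround is to leave the column as bigint and convert it at query time. A minimal sketch, assuming Sqoop wrote the datetime as epoch milliseconds (its usual encoding for Parquet imports) and reusing the question's placeholder names:

# Sketch only: from_unixtime() expects seconds, so divide the
# epoch-millisecond bigint by 1000; the result is a
# 'yyyy-MM-dd HH:mm:ss' string.
hive -e "SELECT from_unixtime(column_name DIV 1000) AS column_name_ts FROM table_name LIMIT 10;"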


Source: https://stackoverflow.com/questions/39798900/sqoop-function-map-column-hive-being-ignored
