Hadoop: Reading ORC files and putting them into an RDBMS?

Submitted by 戏子无情 on 2019-12-21 20:05:10

Question


I have a Hive table stored in the ORC file format. I want to export the data to a Teradata database. I researched Sqoop but could not find a way to export ORC files. Is there a way to make Sqoop work for ORC, or is there another tool I could use to export the data?

Thanks.


Answer 1:


You can use HCatalog:

sqoop export \
  --connect "jdbc:sqlserver://xxxx:1433;databaseName=xxx" \
  --username xxx --password xxx \
  --table rdmsTableName \
  --hcatalog-database hiveDB \
  --hcatalog-table hiveTableName
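Since the question targets Teradata rather than SQL Server, here is a minimal sketch of the same HCatalog-based export pointed at a Teradata database instead. It assumes the Teradata JDBC driver jars are on Sqoop's classpath; the host, database, credentials, and table names below are placeholders, not values from the question:

sqoop export \
  --connect "jdbc:teradata://td-host/DATABASE=target_db" \
  --driver com.teradata.jdbc.TeraDriver \
  --username td_user --password td_password \
  --table TARGET_TABLE \
  --hcatalog-database hiveDB \
  --hcatalog-table hiveTableName

Because the --hcatalog-database/--hcatalog-table options read the Hive table through HCatalog, Sqoop picks up the storage format (including ORC) from the Hive metastore, so no ORC-specific flags are needed. Alternatively, the dedicated Hortonworks/Cloudera Connector for Teradata can be used in place of the generic JDBC driver, which is generally preferable for bulk loads.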



Source: https://stackoverflow.com/questions/36475364/hadoop-reading-orc-files-and-putting-into-rdbms
