Copy from Hadoop to local machine


Question


I can ssh to our box and run hadoop fs -ls /theFolder to browse the files, etc., but that's about all I know. :) My goal is to copy one of those files (they are Avro) into my home folder on my local machine.

How can I do this? I also found a get command, but I'm not sure how to use that either.


Answer 1:


First, use hadoop fs -get /theFolder to copy theFolder out of HDFS into the current directory you are ssh'ed into on your box.
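
As a minimal sketch (the HDFS path /theFolder comes from the question; the trailing dot is the destination, i.e. the current directory on the box):

# copy /theFolder from HDFS into the current directory on the box
hadoop fs -get /theFolder .

The equivalent hadoop fs -copyToLocal /theFolder . does the same thing.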

Then you can use either scp or (my preference) rsync to copy the files between your box and your local system. Here's how I'd use rsync after the -get, still in the same directory:

rsync -av ./theFolder username@yourlocalmachine:/home/username

This will copy theFolder from the local filesystem on your box into your home folder on your machine's filesystem. Be sure to replace username with your actual username in both places, and yourlocalmachine with your machine's hostname or IP address.
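
Note that pushing like this only works if your local machine accepts ssh connections from the box. If it doesn't (no sshd running, or the machine sits behind NAT), you can pull instead by running rsync on your local machine; a sketch, assuming the box's hostname is yourbox (a placeholder) and the files landed in your home directory there:

# run on your LOCAL machine: pull theFolder from the box over ssh
rsync -av username@yourbox:theFolder /home/username/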




Answer 2:


Using hadoop fs -get you can copy files from HDFS to your box's file system. Read more about get in the Hadoop FileSystem shell documentation.
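
The command takes an HDFS source and a local destination; a minimal sketch (both paths are placeholders, not from the original answer):

# hadoop fs -get <hdfs-source> <local-destination>
hadoop fs -get /theFolder /home/username/theFolder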

Then, using scp (which runs over ssh) you can copy those files to your local system. See the scp man page for details.
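
A hedged sketch, run from your local machine (yourbox is a placeholder hostname, and the remote path assumes the -get put theFolder in your home directory on the box):

# run on your LOCAL machine: recursively copy theFolder down from the box
scp -r username@yourbox:theFolder /home/username/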



Source: https://stackoverflow.com/questions/15746458/copy-from-hadoop-to-local-machine
