Question
I can ssh to our box, run hadoop fs -ls /theFolder, and browse the files there, but that's about all I know :) My goal is to copy one of those files - they are Avro - to my local home folder.

How can I do this? I also found a get command, but I'm not sure how to use that either.
Answer 1:
First, use hadoop fs -get /theFolder to copy the folder out of HDFS into the current directory you are ssh'ed into on your box.

Then you can use either scp or (my preference) rsync to copy the files from your box to your local system. Here's how I'd use rsync after the -get, still in the same directory:
rsync -av ./theFolder username@yourlocalmachine:/home/username
This will copy theFolder from the local file system on your box into your home folder on your machine's file system. Be sure to replace username with your actual username (in both places) and yourlocalmachine with your machine's hostname or IP address.
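If you'd rather use scp than rsync, a minimal equivalent run from the box might look like this (same username and yourlocalmachine placeholders as above, and it assumes your local machine accepts ssh connections from the box):

# Alternative to rsync: copy the folder recursively with scp (run on the box)
scp -r ./theFolder username@yourlocalmachine:/home/username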
Answer 2:
Using hadoop's get command you can copy the files from HDFS to your box's file system. Read more about using get here.

Then, using scp (which works over ssh), you can copy those files to your local system. Read more about using scp here.
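Putting both steps together for a single Avro file, a rough sketch might look like the following (part-00000.avro is just a hypothetical file name; swap in whichever file hadoop fs -ls showed you, and reuse the username/yourlocalmachine placeholders from the first answer):

# Step 1 (on the box): pull one Avro file out of HDFS into the box's current directory
hadoop fs -get /theFolder/part-00000.avro .

# Step 2 (still on the box): push that file to your own machine over scp
scp part-00000.avro username@yourlocalmachine:/home/username/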
Source: https://stackoverflow.com/questions/15746458/copy-from-hadoop-to-local-machine