Question
I have set up a single-node Hadoop environment on CentOS using the Cloudera CDH repository. When I wanted to copy a local file to HDFS, I used the command:
sudo -u hdfs hadoop fs -put /root/MyHadoop/file1.txt /
But the result disappointed me:
put: '/root/MyHadoop/file1.txt': No such file or directory
I'm sure this file does exist.
Please help me. Thanks!
Answer 1:
As the hdfs user, do you have access rights to /root/ (on your local disk)? Usually you don't. You must copy file1.txt to a place where the hdfs user has read rights.
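A quick way to confirm this is to try reading the file as the hdfs user (a minimal check, assuming the same paths as in the question):

# Run as root; this typically fails with "Permission denied" because /root is mode 0700
sudo -u hdfs ls -l /root/MyHadoop/file1.txt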
Try:
cp /root/MyHadoop/file1.txt /tmp
chown hdfs:hdfs /tmp/file1.txt
sudo -u hdfs hadoop fs -put /tmp/file1.txt /
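If the put succeeds, you can verify that the file now sits in the HDFS root (a quick check, assuming the default configuration):

sudo -u hdfs hadoop fs -ls /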
--- edit:
Take a look at roman-nikitchenko's cleaner answer below.
Answer 2:
I had the same situation and here is my solution:
HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /
Advantages:
- You don't need sudo.
- You don't actually need an appropriate local user 'hdfs' at all.
- You don't need to copy anything or change permissions, because of the previous points.
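The same HADOOP_USER_NAME trick works with any HDFS command, for example to verify the upload afterwards (a minimal sketch, assuming Kerberos security is not enabled, since the variable is only honored on insecure clusters):

# List the HDFS root as the hdfs superuser, without sudo or a local hdfs account
HADOOP_USER_NAME=hdfs hdfs dfs -ls /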
Answer 3:
Try creating a directory in HDFS first:
$ hadoop fs -mkdir your_dir
and then put the file into it:
$ hadoop fs -put /root/MyHadoop/file1.txt your_dir
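Note that this still assumes the user running the command can read /root/MyHadoop/file1.txt locally and has write access to the target directory in HDFS; otherwise you hit the same error as in the question. A sketch combining this with the HADOOP_USER_NAME approach from Answer 2 (your_dir is a placeholder name):

HADOOP_USER_NAME=hdfs hdfs dfs -mkdir /your_dir
HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /your_dir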
Source: https://stackoverflow.com/questions/18484939/hadoop-fs-put-command