hadoop fs -put command

Submitted by 让人想犯罪 on 2019-12-20 11:14:52

Question


I have set up a single-node Hadoop environment on CentOS using the Cloudera CDH repository. To copy a local file to HDFS, I used the command:

sudo -u hdfs hadoop fs -put /root/MyHadoop/file1.txt /

But the result disappointed me:

put: '/root/MyHadoop/file1.txt': No such file or directory

I'm sure this file does exist.

Please help me, thanks!


Answer 1:


As the user hdfs, do you have read access to /root/ on your local disk? Usually you don't. You must copy file1.txt to a place where the hdfs user can read it.
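You can confirm this is the problem with a quick check (assuming the default CentOS permissions on /root, which shut out other users):

sudo -u hdfs ls /root/MyHadoop/file1.txt
# typically fails with something like:
# ls: cannot access /root/MyHadoop/file1.txt: Permission denied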

Try:

cp /root/MyHadoop/file1.txt /tmp        # stage the file somewhere the hdfs user can read
chown hdfs:hdfs /tmp/file1.txt          # hand ownership to the hdfs user
sudo -u hdfs hadoop fs -put /tmp/file1.txt /
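If the put succeeds, the file should now appear at the HDFS root; a quick way to verify:

sudo -u hdfs hadoop fs -ls /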

--- edit:

Take a look at roman-nikitchenko's cleaner answer below.




Answer 2:


I had the same situation and here is my solution:

 HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /

Advantages:

  1. You don't need sudo.
  2. You don't actually need a local user 'hdfs' at all.
  3. You don't need to copy anything or change permissions, because of the previous points.
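A minimal end-to-end sketch of this approach (paths taken from the question; note that the HADOOP_USER_NAME environment variable is only honored on clusters without Kerberos security enabled):

# run as any local user that can read the source file; no sudo needed
HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /
# verify the upload, again acting as the hdfs superuser
HADOOP_USER_NAME=hdfs hdfs dfs -ls /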



Answer 3:


Try creating a directory in HDFS first:

$ hadoop fs -mkdir your_dir

and then put the file into it:

$ hadoop fs -put /root/MyHadoop/file1.txt your_dir



Source: https://stackoverflow.com/questions/18484939/hadoop-fs-put-command
