Errors while running Hadoop

甜味超标 2021-02-09 06:59
haduser@user-laptop:/usr/local/hadoop$ bin/hadoop dfs -copyFromLocal /tmp/input /user/haduser/input

11/12/14 14:21:00 INFO ipc.Client: Retrying connect to server: loca


        
4 Answers
  • 2021-02-09 07:38

Try to ssh to your local system using its IP, in this case:

    $ ssh 127.0.0.1

    Once you can ssh in successfully, run the command below to list the open ports:

    ~$ lsof -i

    Look for a listening entry of the form localhost:<PORT> (LISTEN).

    Copy this <PORT> and use it to replace the existing port number in the value of the fs.default.name property in core-site.xml in the Hadoop conf folder; a minimal sketch is shown below.

    Save core-site.xml; this should resolve the issue.
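    For reference, the relevant core-site.xml property looks something like this (the port 54310 is only illustrative; substitute the port you found with lsof):

    <configuration>
      <property>
        <name>fs.default.name</name>
        <!-- replace 54310 with the port reported by lsof -->
        <value>hdfs://localhost:54310</value>
      </property>
    </configuration>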

  • 2021-02-09 07:42

NameNode (NN) maintains the namespace for HDFS, and it must be running for filesystem operations on HDFS to work. Check the logs to see why the NN hasn't started; a quick way to do that is sketched below. The TaskTracker is not required for operations on HDFS; the NN and DN alone are sufficient. See the http://goo.gl/8ogSk and http://goo.gl/NIWoK tutorials on how to set up Hadoop on a single node and on multiple nodes.
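    To see which daemons are actually up and why the NN may have failed, something like the following should work (the log file name follows Hadoop's hadoop-<user>-<daemon>-<host>.log convention; the exact path depends on your install):

    # list the running Hadoop daemons (NameNode, DataNode, ...)
    ~$ jps

    # inspect the latest NameNode log entries for startup errors
    ~$ tail -n 50 $HADOOP_HOME/logs/hadoop-haduser-namenode-user-laptop.log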

  • 2021-02-09 07:48

All the files in bin are executables. Just copy the command and paste it into the terminal. Make sure the HDFS path is right, i.e. the user in it must be replaced with your actual username. That should do the trick.
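    For example, for the haduser account from the question, the command would be (illustrative; substitute your own username in the HDFS path):

    bin/hadoop dfs -copyFromLocal /tmp/input /user/haduser/input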

  • 2021-02-09 07:57

I had similar issues: Hadoop was binding to IPv6, even though I had disabled IPv6 on my system. Adding the following line to $HADOOP_HOME/conf/hadoop-env.sh fixed it:

    export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true

    Once I added it, Hadoop started working fine.
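    To confirm that the daemons now listen on IPv4, a check along these lines should work on Linux (the grep pattern is illustrative):

    # listening TCP sockets for Java processes; addresses such as 127.0.0.1 or
    # 0.0.0.0 (rather than ::1) indicate IPv4 bindings
    ~$ netstat -tlnp | grep java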

    Hope this helps someone.
