Hadoop - java.net.ConnectException: Connection refused

无人及你 2021-02-15 23:54

I want to connect to HDFS (on localhost) and I get an error:

Call From despubuntu-ThinkPad-E420/127.0.1.1 to localhost:54310 failed on connection exception: java.net.ConnectException: Connection refused

5 Answers
  • 2021-02-16 00:16

    Make sure that DFS, which is set to port 9000 in core-site.xml, is actually started. You can check with the jps command; you can start it with sbin/start-dfs.sh
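    That check can be sketched as a small shell snippet (a hedged sketch: it assumes `jps` and `sbin/start-dfs.sh` are available as on a standard Hadoop install; the `hdfs_running` helper is hypothetical):

    ```shell
    #!/bin/sh
    # Hypothetical helper: read `jps` output on stdin and succeed only
    # when both NameNode and DataNode are listed.
    hdfs_running() {
        out=$(cat)
        echo "$out" | grep -q ' NameNode$' && echo "$out" | grep -q ' DataNode$'
    }

    # On a real node you would run (assumption: Hadoop's sbin on PATH):
    #   jps | hdfs_running || sbin/start-dfs.sh

    # Demonstration with canned jps output:
    printf '1234 NameNode\n5678 DataNode\n9012 Jps\n' | hdfs_running && echo up   # prints "up"
    printf '9012 Jps\n' | hdfs_running || echo down                               # prints "down"
    ```

    The anchored `grep` patterns deliberately do not match SecondaryNameNode, so a secondary alone does not count as a running NameNode.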

  • 2021-02-16 00:18

    First, check whether the Java processes are running by typing the jps command on the command line. The jps output must list the following processes:

    • DataNode
    • Jps
    • NameNode
    • SecondaryNameNode

    If these processes are not running, first start the NameNode using the following command: start-dfs.sh

    This worked for me and removed the error you stated.

  • 2021-02-16 00:20

    I was getting a similar error. Upon checking, I found that my namenode service was stopped.

    Check the status of the namenode: sudo status hadoop-hdfs-namenode

    If it is not in the started/running state,

    start the namenode service: sudo start hadoop-hdfs-namenode

    Do keep in mind that it takes time for the namenode service to become fully functional after a restart: it reads all the HDFS edits into memory. You can check its progress in /var/log/hadoop-hdfs/ using the command tail -f /var/log/hadoop-hdfs/{Latest log file}
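    Picking the newest file out of that log directory can be sketched like this (a sketch; `latest_log` is a hypothetical helper, and the demonstration uses a throwaway directory instead of /var/log/hadoop-hdfs):

    ```shell
    #!/bin/sh
    # Hypothetical helper: print the most recently modified file in a directory.
    latest_log() {
        ls -t "$1" | head -n 1
    }

    # On a real node, following the answer above:
    #   tail -f "/var/log/hadoop-hdfs/$(latest_log /var/log/hadoop-hdfs)"

    # Demonstration with a throwaway directory:
    dir=$(mktemp -d)
    touch "$dir/hadoop-hdfs-namenode.log.1"
    sleep 1
    touch "$dir/hadoop-hdfs-namenode.log"
    latest_log "$dir"    # prints "hadoop-hdfs-namenode.log"
    ```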

  • 2021-02-16 00:26

    I guess that you didn't set up your Hadoop cluster correctly. Please follow these steps:

    Step 1: begin by setting up .bashrc:

    vi $HOME/.bashrc
    

    put the following lines at the end of the file (change the Hadoop home to match yours):

    # Set Hadoop-related environment variables
    export HADOOP_HOME=/usr/local/hadoop
    
    # Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
    export JAVA_HOME=/usr/lib/jvm/java-6-sun
    
    # Some convenient aliases and functions for running Hadoop-related commands
    unalias fs &> /dev/null
    alias fs="hadoop fs"
    unalias hls &> /dev/null
    alias hls="fs -ls"
    
    # If you have LZO compression enabled in your Hadoop cluster and
    # compress job outputs with LZOP (not covered in this tutorial):
    # Conveniently inspect an LZOP compressed file from the command
    # line; run via:
    #
    # $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
    #
    # Requires installed 'lzop' command.
    #
    lzohead () {
        hadoop fs -cat "$1" | lzop -dc | head -1000 | less
    }
    
    # Add Hadoop bin/ directory to PATH
    export PATH=$PATH:$HADOOP_HOME/bin
    

    Step 2: edit hadoop-env.sh as follows:

    # The java implementation to use.  Required.
    export JAVA_HOME=/usr/lib/jvm/java-6-sun
    

    Step 3: now create a directory and set the required ownership and permissions:

    $ sudo mkdir -p /app/hadoop/tmp
    $ sudo chown hduser:hadoop /app/hadoop/tmp
    # ...and if you want to tighten up security, chmod from 755 to 750...
    $ sudo chmod 750 /app/hadoop/tmp
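    The ownership and mode from step 3 can be double-checked with stat (a sketch assuming GNU coreutils; the demonstration uses a temporary directory rather than touching /app/hadoop/tmp):

    ```shell
    #!/bin/sh
    # On the real directory you would check (expect "hduser:hadoop 750"):
    #   stat -c '%U:%G %a' /app/hadoop/tmp

    # Demonstration of the permission check on a temporary directory:
    dir=$(mktemp -d)
    chmod 750 "$dir"
    stat -c '%a' "$dir"    # prints "750"
    ```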
    

    Step 4: edit core-site.xml:

    <property>
      <name>hadoop.tmp.dir</name>
      <value>/app/hadoop/tmp</value>
    </property>
    
    <property>
      <name>fs.default.name</name>
      <value>hdfs://localhost:54310</value>
    </property>
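    "Connection refused" in the original error means nothing is listening on the address that fs.default.name points at. Bash can probe the port directly through its /dev/tcp pseudo-device (a sketch; this is bash-specific, and `port_open` is a hypothetical helper):

    ```shell
    #!/bin/bash
    # Hypothetical helper: succeed if a TCP connection to host:port works.
    port_open() {
        (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
    }

    # Probe the NameNode port from this answer's core-site.xml:
    if port_open localhost 54310; then
        echo "something is listening on 54310"
    else
        echo "connection refused: start HDFS first"
    fi
    ```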
    

    Step 5: edit mapred-site.xml:

    <property>
      <name>mapred.job.tracker</name>
      <value>localhost:54311</value>
    </property>
    

    Step 6: edit hdfs-site.xml:

    <property>
      <name>dfs.replication</name>
      <value>1</value>
    </property>
    

    Finally, format your HDFS (you only need to do this the first time you set up a Hadoop cluster):

     $ /usr/local/hadoop/bin/hadoop namenode -format
    
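    After formatting, a quick smoke test confirms the NameNode actually comes up (a hedged sketch following this answer's /usr/local/hadoop layout; `smoke_test` is hypothetical and skips cleanly when Hadoop is not installed at that path):

    ```shell
    #!/bin/sh
    # Hypothetical smoke test for the setup above (Hadoop 1.x layout).
    smoke_test() {
        HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop}   # assumption: path from this answer
        if [ ! -x "$HADOOP_HOME/bin/hadoop" ]; then
            echo "hadoop not found - skipping"
            return 0
        fi
        "$HADOOP_HOME/bin/start-dfs.sh" &&    # start NameNode/DataNode
        jps &&                                # should now list NameNode and DataNode
        "$HADOOP_HOME/bin/hadoop" fs -ls /    # fails with ConnectException if HDFS is down
    }

    smoke_test
    ```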

    Hope this helps you.

  • 2021-02-16 00:26

    I got the same issue. You can see whether the NameNode, DataNode, ResourceManager, and NodeManager daemons are running when you type jps. So just run start-all.sh; then all the daemons start running and you can access HDFS.
