http://localhost:50070 does not work HADOOP

庸人自扰 2020-12-12 14:54

I already installed Hadoop on my machine "Ubuntu 13.05" and now I get an error when browsing localhost:50070: the browser says that the page does not exist.

12 Answers
  • 2020-12-12 15:51

    Enable the port on your system. For CentOS 7, run the commands below:

    1. firewall-cmd --get-active-zones

    2. firewall-cmd --zone=dmz --add-port=50070/tcp --permanent

    3. firewall-cmd --zone=public --add-port=50070/tcp --permanent

    4. firewall-cmd --zone=dmz --add-port=9000/tcp --permanent

    5. firewall-cmd --zone=public --add-port=9000/tcp --permanent

    6. firewall-cmd --reload
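
    After the reload, you can verify that the ports are actually open; a minimal check, assuming the public zone is the one reported as active in step 1:

    # List the ports firewalld currently allows in the public zone;
    # 50070/tcp and 9000/tcp should appear after the reload
    firewall-cmd --zone=public --list-ports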

  • 2020-12-12 15:53

    First, check which Java processes are running using "jps". If you are in pseudo-distributed mode, you should have the following processes:

    • NameNode
    • JobTracker
    • TaskTracker
    • DataNode
    • SecondaryNameNode

    If you are missing any, use the restart commands:

    $HADOOP_INSTALL/bin/stop-all.sh
    $HADOOP_INSTALL/bin/start-all.sh
    

    It can also be because you haven't opened that port on the machine:

    iptables -A INPUT -p tcp --dport 50070 -j ACCEPT
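
    As a quick sanity check after restarting, you can confirm the daemons came up and that the web UI port is actually listening; the process list above is for Hadoop 1.x, so treat the names as illustrative for other versions:

    # List the running Hadoop JVMs (should match the list above)
    jps

    # Confirm something is bound to the NameNode web UI port
    netstat -tlnp | grep 50070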
    
  • 2020-12-12 15:55

    First of all, you need to start the Hadoop nodes and trackers, simply by typing start-all.sh in your terminal. To check that all the trackers and nodes have started, run the 'jps' command. If everything is fine and working, go to your browser and open the following URL: http://localhost:50070
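
    For reference, that sequence looks roughly like this on a Hadoop 1.x-style install (assuming the Hadoop bin directory is on your PATH):

    # Start all HDFS and MapReduce daemons
    start-all.sh

    # Check which daemons are running
    jps

    # Test the web UI from the shell before trying the browser
    curl -I http://localhost:50070/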

  • 2020-12-12 15:56

    For recent Hadoop versions (I'm using 2.7.1):

    The start/stop scripts are located in the sbin folder. The scripts are:

    • ./sbin/start-dfs.sh
    • ./sbin/stop-dfs.sh
    • ./sbin/start-yarn.sh
    • ./sbin/stop-yarn.sh

    I didn't have to do anything with YARN, though, to get the NameNode running.

    My mistake was that I hadn't formatted the HDFS for the NameNode:

    bin/hdfs namenode -format
    

    I'm not quite sure exactly what that does, but it prepares the storage that the NameNode will use to operate.
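
    For reference, a typical first-run sequence on a 2.x install looks something like this ($HADOOP_HOME is assumed to point at your Hadoop 2.7.1 directory):

    # Format the NameNode metadata directory -- only do this on a fresh
    # install, as it wipes existing HDFS metadata
    $HADOOP_HOME/bin/hdfs namenode -format

    # Start the HDFS daemons (NameNode, DataNode, SecondaryNameNode)
    $HADOOP_HOME/sbin/start-dfs.sh

    # Verify the daemons, then browse to http://localhost:50070
    jps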

  • 2020-12-12 15:57

    Try:

    stop-all.sh
    hadoop namenode -format
    start-all.sh
    jps


    Check that the NameNode and DataNode are running, then browse to:

    localhost:50070
    

    If localhost:50070 is still not working, you may need to allow the port through your firewall. First, check whether anything is listening on it:

    netstat -anp | grep 50070
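
    If nothing shows up on 50070, the NameNode most likely failed to start, and its log usually explains why. The log path below is the default layout and may differ on your install:

    # Show the tail of the NameNode log (default location under $HADOOP_HOME/logs)
    tail -n 50 $HADOOP_HOME/logs/hadoop-*-namenode-*.log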
    
  • 2020-12-12 16:00

    There is a similar question and answer at: Start Hadoop 50075 Port is not resolved

    Take a look at your core-site.xml file to determine which port it is set to. If it is set to 0, a random port will be picked, so be sure to set an explicit one.
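
    Depending on the version, the web UI address may instead come from hdfs-site.xml (dfs.namenode.http-address on Hadoop 2.x, dfs.http.address on 1.x). A quick way to query the effective value without digging through the XML, assuming the hdfs command is on your PATH:

    # Print the host:port the NameNode web UI is configured to use (Hadoop 2.x)
    hdfs getconf -confKey dfs.namenode.http-address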
