Spark web UI unreachable

Submitted by 萝らか妹 on 2019-12-11 11:46:31

Question


I have installed Spark 2.0.0 on 12 nodes (standalone cluster mode). When I launch it, I get this:

./sbin/start-all.sh

starting org.apache.spark.deploy.master.Master, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.master.Master-1-ibnb25.out
localhost192.17.0.17: ssh: Could not resolve hostname localhost192.17.0.17: Name or service not known
192.17.0.20: starting org.apache.spark.deploy.worker.Worker, logging to /home/mbala/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb28.out
192.17.0.21: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb29.out
192.17.0.19: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb27.out
192.17.0.18: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb26.out
192.17.0.24: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb32.out
192.17.0.22: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb30.out
192.17.0.25: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb33.out
192.17.0.28: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb36.out
192.17.0.27: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb35.out
192.17.0.17: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb25.out
192.17.0.26: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb34.out
192.17.0.23: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb31.out

I have already set the master's port to Port=8081 and its IP to IP=192.17.0.17, which corresponds to HOSTNAME=ibnb25. I launched the cluster from this host.

From my local machine I use this command to access the cluster:

 ssh mName@xx.xx.xx.xx 

When I wanted to access the web UI from my local machine, I used the IP address of the master (host ibnb25):

192.17.0.17:8081

but it couldn't be displayed, so I tried the address that I use to access the cluster:

xx.xx.xx.xx:8081

but nothing displays in my browser. What is wrong? Please help me.


Answer 1:


Your /etc/hosts file seems to be incorrectly set up.

You can get the hostname and IP with the following commands:

hostname
hostname -i
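
If /etc/hosts is set up correctly, the two commands should agree with each other. With the values from the question (master host ibnb25 at 192.17.0.17), the expected output would look like this; a sketch, assuming those values are still current:

hostname      # should print: ibnb25
hostname -i   # should print: 192.17.0.17

If hostname -i prints 127.0.0.1 or an error instead, the hosts file is the problem.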

Make sure there is whitespace between the IP address and the hostname.

A sample /etc/hosts file looks like:

192.17.0.17  <hostname>
192.17.0.17  localhost
<Other IP1>  <other hostname1>
.
.
.
<Other IP-n>  <other hostname-n>
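
After editing the file, name resolution can be checked without restarting anything; a minimal sketch using the question's values:

getent hosts ibnb25        # should print: 192.17.0.17  ibnb25
getent hosts 192.17.0.17   # reverse check, resolved through the same hosts file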

Make sure the /etc/hosts file on every node contains an IP/hostname entry for each node in the cluster.
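
One way to propagate the file to the whole cluster; a hedged sketch, assuming passwordless SSH and sudo rights on each worker, with the worker hostnames taken from the log output in the question:

# Copy the master's /etc/hosts to every worker node.
for host in ibnb26 ibnb27 ibnb28 ibnb29 ibnb30 ibnb31 ibnb32 ibnb33 ibnb34 ibnb35 ibnb36; do
    scp /etc/hosts "$host":/tmp/hosts
    ssh "$host" 'sudo mv /tmp/hosts /etc/hosts'
done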

For FQDNs, read this.
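
Separately, note that 192.17.0.17 may only be reachable from inside the cluster's network. Since the question reaches the cluster through ssh mName@xx.xx.xx.xx, one common workaround is an SSH tunnel through that gateway; a sketch, assuming the web UI really listens on port 8081 as configured:

# Forward local port 8081 to the master's web UI, then browse to http://localhost:8081
ssh -L 8081:192.17.0.17:8081 mName@xx.xx.xx.xx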



Source: https://stackoverflow.com/questions/39226544/spark-web-ui-unreachable
