Spark: slave unable to connect to master

Posted by 寵の児 on 2020-02-06 19:33:29

Question


I am trying to set up a standalone Spark cluster on 2 machines within my organization's network. Both are Ubuntu 16.04 machines with the same configuration.

Passwordless SSH is set up from master to slave and from slave to master.
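A quick way to confirm that the passwordless SSH works in both directions (a sketch using the same placeholder IPs as above) is:

    # run on the master; should print the slave's hostname without asking for a password
    ssh 10.x.x.3 hostname

    # run on the slave; should print the master's hostname without asking for a password
    ssh 10.x.x.2 hostname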

The configurations of the master and the slave node are given below.

Master configuration:

Spark-version: 2.4.4
/etc/hosts : 10.x.x.2  master
             10.x.x.3  slave01
             127.0.0.1 dm.abc.net localhost dm
             127.0.1.1
             10.x.x.4  dm.abc.net dm
             10.x.x.5  abc.net

/usr/local/spark/conf/slaves: 10.x.x.3

/usr/local/spark/conf/spark-env.sh: export SPARK_MASTER_HOST='10.x.x.2'
                                    export SPARK_LOCAL_IP='10.x.x.2'
                                    export JAVA_HOME='/usr/lib/jvm/java-8-openjdk-amd64/jre/'
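For context, with SPARK_MASTER_HOST='10.x.x.2' the standalone master should advertise itself as spark://10.x.x.2:7077 (7077 being the default RPC port), and that is the address the worker has to reach. A quick sanity check on the master, assuming the default port, would be:

    # after start-master.sh, confirm the master listens on 10.x.x.2:7077 and not on 127.0.0.1/127.0.1.1
    sudo netstat -tlnp | grep 7077
    # the master web UI (http://10.x.x.2:8080 by default) also shows the exact spark:// URL it advertises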

Slave configuration:

Spark-version: 2.4.4
/etc/hosts : 10.x.x.2  master
             10.x.x.3  slave01
             127.0.0.1 dm1.abc.net localhost dm1
             127.0.1.1
             10.x.x.4  dm1.abc.net dm1
             10.x.x.5  abc.net

/usr/local/spark/conf/slaves: 10.x.x.3

/usr/local/spark/conf/spark-env.sh: export SPARK_MASTER_HOST='10.x.x.2'
                                    export SPARK_LOCAL_IP='10.x.x.3'

I run the following command on the master: /usr/local/spark/sbin/start-all.sh

But I keep getting a "Failed to connect to master" error in the slave's log. Any pointers?
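As far as I understand, start-all.sh starts the master locally and then SSHes to every host listed in conf/slaves to run start-slave.sh against the master URL, so the usual failure points are address resolution and the master's RPC port not being reachable from the slave. A minimal way to isolate those, assuming the default port 7077, would be:

    # from the slave: is the master's RPC port reachable at all?
    nc -zv 10.x.x.2 7077

    # from the slave: start the worker by hand with an explicit master URL and watch its log
    /usr/local/spark/sbin/start-slave.sh spark://10.x.x.2:7077

If the manual start works but start-all.sh does not, the problem is likely in how the master resolves the entries in conf/slaves; if neither works, it points to a firewall or to the master being bound to a loopback address.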

Source: https://stackoverflow.com/questions/58767418/spark-slave-unable-to-connect-to-master
