Spark worker cannot connect to Master

Asked by 一整个雨季 on 2021-01-15 15:10

While starting the worker node, I get the following error:

    Spark Command: /usr/lib/jvm/default-java/bin/java -cp /home/ubuntu/spark-1.5.1-bin-hadoop2.6/sbin/
2 Answers
  • 2021-01-15 15:31

    So, after some tinkering around, I found that the slave was not able to communicate with the Master on the given port. I changed the security access rules and enabled all TCP traffic on all ports. This solved the problem.

    To check if the port is open :

    telnet master.ip master.port

    The default port is 7077.
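    The same reachability check can also be scripted. Below is a minimal sketch in Python (the `is_port_open` helper and the `"master.ip"` placeholder are illustrative, not part of the original answer; it assumes the default master port 7077):

    ```python
    import socket

    def is_port_open(host: str, port: int, timeout: float = 5.0) -> bool:
        """Return True if a TCP connection to host:port succeeds within timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Example: check whether the Spark master's default port is reachable.
    # is_port_open("master.ip", 7077)
    ```

    If this returns False from a worker machine, the problem is network access (firewall or security-group rules), not Spark itself.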

    My spark-env.sh :

    export SPARK_WORKER_INSTANCES=2
    export SPARK_MASTER_IP=<ip address>

  • 2021-01-15 15:34

    I'm afraid your hostname may be invalid to Spark, and you have to change your spark-env.sh.

    You can set the variable SPARK_MASTER_IP to the real IP address of the master, instead of its hostname, e.g.

    export SPARK_MASTER_IP=1.70.44.5
    

    INSTEAD OF

    export SPARK_MASTER_IP=ip-1-70-44-5
    
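    Putting both answers together, a minimal `conf/spark-env.sh` might look like the sketch below (the IP address is the example value from above, and `SPARK_MASTER_PORT` is shown with its default; adjust both to your cluster):

    ```shell
    # conf/spark-env.sh -- sourced by the Spark start scripts on each node

    # Use the master's numeric IP, not a hostname the workers cannot resolve.
    export SPARK_MASTER_IP=1.70.44.5

    # Default master port; it must be reachable from every worker node.
    export SPARK_MASTER_PORT=7077

    # Run two worker instances per node (from the first answer).
    export SPARK_WORKER_INSTANCES=2
    ```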