My Spark Worker cannot connect to the Master. Something wrong with Akka?

萌比男神i 2020-12-15 23:46

I want to install Spark in standalone mode on a cluster of two virtual machines.
With spark-0.9.1-bin-hadoop1, I can run spark-shell successfully on each machine.
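For context, the standalone docs for this Spark generation attach spark-shell to a cluster through the MASTER environment variable. A minimal command sketch (the hostname spark-master and port 7077 are illustrative placeholders, not from the original post):

```shell
# On the master VM: start the Master (its log prints the spark:// URL).
./sbin/start-master.sh

# On each worker VM: register a Worker with the Master's spark:// URL.
# The hostname must resolve to the same routable address on every machine.
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://spark-master:7077

# Attach a shell to the cluster (Spark 0.9 reads the MASTER env var).
MASTER=spark://spark-master:7077 ./bin/spark-shell
```

If the Worker log shows Akka association failures, the URL it was given usually does not match the host/port the Master actually bound to.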

7 Answers
  •  醉梦人生
    2020-12-16 00:06

    I faced the same issue. You can resolve it as follows: first, open the /etc/hosts file and comment out the 127.0.1.1 entry; then go to the spark/sbin directory and start the cluster with this command:

    ./start-all.sh 
    

    Or you can run ./start-master.sh and ./start-slave.sh separately for the same effect. Now, if you run spark-shell, pyspark, or any other Spark component, it will automatically create the SparkContext object sc for you.
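To make the /etc/hosts step concrete: on Debian/Ubuntu systems, the hostname is often aliased to 127.0.1.1, so the Master binds and advertises a loopback address that Workers on other machines cannot reach. A sketch of the fix (hostnames and IPs are placeholders, not from the answer):

```shell
# /etc/hosts on every node: comment out the loopback alias for the hostname
#   127.0.1.1   spark-master      <- comment this line out
# and map the hostname to the machine's real, routable IP instead:
#   192.168.1.10  spark-master

# Then restart the cluster from the spark/sbin directory:
./stop-all.sh
./start-all.sh
```

After the restart, the Master's web UI should show the Workers as registered, and the Akka connection errors should stop.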
