ERROR SparkContext: Error initializing SparkContext. java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed

Backend · Unresolved · 4 answers · 1864 views
被撕碎了的回忆 · 2021-02-07 03:43

I have installed the following setup: Hadoop 1.0.3, Java "1.7.0_67", Scala 2.11.7, Spark 2.1.1.

I am getting the error shown in the title. Can anyone help?

4 Answers
  •  你的背包 · 2021-02-07 04:24

    There are a few different solutions:

    1. Get your hostname

      $ hostname
      

      then try to reassign your hostname (a quick resolution check is sketched after this list):

      $ sudo hostname -s 127.0.0.1
      

      Start spark-shell.

    2. Add your hostname to your /etc/hosts file (if not present)

      127.0.0.1      your_hostname
      
    3. Add the environment variable to load-spark-env.sh (a one-shot alternative that avoids editing any file is sketched after this list):

      export SPARK_LOCAL_IP="127.0.0.1"
      
    4. The steps above solved my problem, but you can also try adding

      export SPARK_LOCAL_IP=127.0.0.1

      under the comment for the local IP in the template file spark-env.sh.template (/usr/local/Cellar/apache-spark/2.1.0/libexec/conf/), and then:

      cp spark-env.sh.template spark-env.sh
      spark-shell
      
    5. If none of the above works, check your firewall and enable it if it is not already enabled. A configuration-flag alternative is also sketched below.
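
    As a sanity check for steps 1 and 2, the sketch below (my addition, not part of the original answer; it assumes a Unix-like shell) prints the current hostname and confirms that it resolves to a reachable address. If the ping fails, or resolves to an address the machine no longer holds, the 'Cannot assign requested address' error is the expected symptom:

      # Print the machine's hostname, then check what it resolves to.
      hostname
      # A single ping is enough; a resolution failure here usually
      # explains why the Spark driver cannot bind.
      ping -c 1 "$(hostname)"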
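
    If you would rather not edit load-spark-env.sh (step 3), the same effect can be had for a single session. This is a minimal sketch, assuming a Unix-like shell and spark-shell on the PATH:

      # Set SPARK_LOCAL_IP for this invocation only, leaving
      # load-spark-env.sh and spark-env.sh untouched.
      SPARK_LOCAL_IP=127.0.0.1 spark-shell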
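
    Alternatively, and not covered in the original answer: since Spark 2.1 the driver's bind address can be passed as a configuration property, which matches the asker's Spark 2.1.1:

      # spark.driver.bindAddress (Spark 2.1+) sets the address the driver
      # binds to, independent of what the hostname resolves to.
      spark-shell --conf spark.driver.bindAddress=127.0.0.1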
