ERROR SparkContext: Error initializing SparkContext. java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed

Backend · Unresolved · 4 replies · 1863 views
被撕碎了的回忆 2021-02-07 03:43

I have installed the setup below: Hadoop version 1.0.3, Java version "1.7.0_67", Scala version 2.11.7, Spark version 2.1.1.

Getting the error below; can anyone help?

4 answers
  • 2021-02-07 04:17

    Add SPARK_LOCAL_IP in load-spark-env.sh as

    export SPARK_LOCAL_IP="127.0.0.1"
    

    The load-spark-env.sh file is located in the spark/bin directory.

    Or you can add your hostname to the /etc/hosts file as

    127.0.0.1   hostname 
    

    You can get your hostname by typing hostname in a terminal.

    Hope this solves the issue!
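
    If you would rather not touch the environment files, the same effect can be set programmatically. This is only a minimal sketch, assuming Spark 2.1+ and that you build the session yourself; the app name and master below are placeholders:

        import org.apache.spark.sql.SparkSession

        // Force the driver to bind to the loopback address, the programmatic
        // equivalent of exporting SPARK_LOCAL_IP=127.0.0.1.
        // spark.driver.bindAddress is available from Spark 2.1 onward.
        val spark = SparkSession.builder()
          .appName("bind-address-example")              // placeholder app name
          .master("local[*]")                           // placeholder master
          .config("spark.driver.bindAddress", "127.0.0.1")
          .config("spark.driver.host", "127.0.0.1")
          .getOrCreate()

        println(spark.sparkContext.uiWebUrl)            // sanity check: the context started
        spark.stop()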

  • 2021-02-07 04:24

    There are a few different solutions:

    1. Get your hostname

      $ hostname
      

      then try to reassign your hostname

      $ sudo hostname -s 127.0.0.1
      

      Start spark-shell.

    2. Add your hostname to your /etc/hosts file (if not present)

      127.0.0.1      your_hostname
      
    3. Add the environment variable to load-spark-env.sh:

      export SPARK_LOCAL_IP="127.0.0.1"
      
    4. The above steps solved my problem, but you can also try adding

      export SPARK_LOCAL_IP=127.0.0.1 
      

      under the comment for the local IP in the template file spark-env.sh.template (/usr/local/Cellar/apache-spark/2.1.0/libexec/conf/)

      and then

      cp spark-env.sh.template spark-env.sh
      spark-shell
      
    5. If none of the above fixes it, check your firewall and enable it if it is not already enabled. (A small diagnostic sketch follows this list.)
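
    Most of these fixes work because the driver derives its bind address from the machine's hostname. The following is a small diagnostic sketch using only the JDK (no Spark required); it mimics that lookup, so if it fails, or prints an address that cannot be bound on this machine, spark-shell will fail with the BindException above:

      import java.net.{InetAddress, UnknownHostException}

      // Mimic the hostname -> address lookup the Spark driver relies on.
      object CheckDriverAddress {
        def main(args: Array[String]): Unit = {
          try {
            val local = InetAddress.getLocalHost        // resolves this machine's hostname
            println(s"hostname = ${local.getHostName}")
            println(s"address  = ${local.getHostAddress}")
          } catch {
            case e: UnknownHostException =>
              // The hostname has no entry in DNS or /etc/hosts.
              println(s"hostname does not resolve: ${e.getMessage}")
              println("add it to /etc/hosts or set SPARK_LOCAL_IP")
          }
        }
      }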

  • 2021-02-07 04:38
    • Had a similar issue in IntelliJ.

      Reason: I was on the Cisco AnyConnect VPN.

      Fix: after disconnecting from the VPN, the issue did not appear.

  • 2021-02-07 04:39
    1. In your terminal, type hostname to have a look at your current hostname.
    2. Edit /etc/hosts (e.g. with vim) and map the hostname you just got to your exact IP or 127.0.0.1. A quick resolution check follows below.
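
    To confirm the /etc/hosts change took effect, here is a quick resolution check in the Scala REPL (replace "your_hostname", a placeholder, with the name the hostname command printed):

      import java.net.InetAddress

      // Resolve the hostname you just mapped in /etc/hosts.
      val addr = InetAddress.getByName("your_hostname")  // placeholder hostname
      println(addr.getHostAddress)                        // should print the IP you mapped, e.g. 127.0.0.1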