ERROR SparkContext: Error initializing SparkContext. java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed

被撕碎了的回忆 2021-02-07 03:43

I have installed the following setup: Hadoop 1.0.3, Java "1.7.0_67", Scala 2.11.7, and Spark 2.1.1.

I am getting the error below. Can anyone help?

4 answers
  •  清歌不尽
    2021-02-07 04:17

    Add SPARK_LOCAL_IP to load-spark-env.sh:

    export SPARK_LOCAL_IP="127.0.0.1"
    

    The load-spark-env.sh file is located in the spark/bin directory.

    Alternatively, map your hostname to the loopback address in the /etc/hosts file:

    127.0.0.1   hostname 
    

    You can find your hostname by running the hostname command in a terminal.

    Hope this solves the issue!
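    As a lighter-weight variant of the edit above, the same variable can simply be exported in the current shell before launching Spark, which avoids modifying load-spark-env.sh permanently. A minimal sketch (it assumes SPARK_HOME points at your Spark installation):

    ```shell
    # Bind the Spark driver to loopback for this session only,
    # instead of editing load-spark-env.sh.
    export SPARK_LOCAL_IP="127.0.0.1"

    # Verify the value Spark will pick up from the environment:
    echo "$SPARK_LOCAL_IP"

    # Then launch as usual, e.g. (assuming SPARK_HOME is set):
    #   "$SPARK_HOME/bin/spark-shell"
    ```

    Exporting the variable is handy for testing whether the bind address is really the problem; once confirmed, the load-spark-env.sh or /etc/hosts change makes the fix permanent.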
