Mac spark-shell Error initializing SparkContext

礼貌的吻别 2020-12-07 09:26

I tried to start Spark 1.6.0 (spark-1.6.0-bin-hadoop2.4) on Mac OS Yosemite 10.10.5 using

    ./bin/spark-shell

It fails with the error below.

12 answers
  • 2020-12-07 09:56

    I always get this error when switching between networks. This solves it:

    $ sudo hostname -s 127.0.0.1
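    As a minimal sketch of why this helps (macOS/BSD `hostname`; requires sudo): the error typically appears when the machine's hostname no longer resolves after a network switch, so remapping it to loopback lets the Spark driver bind an address.

```shell
# Show the current hostname; if it no longer resolves after a network
# switch, SparkContext fails to initialize.
hostname
# Temporarily remap the short hostname to loopback (reverts on reboot).
sudo hostname -s 127.0.0.1
```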

  • 2020-12-07 09:57

    Just set spark.driver.host to localhost if you run Spark from an IDE:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    // Run locally with two threads and bind the driver to localhost.
    SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("AnyName").set("spark.driver.host", "localhost");
    JavaSparkContext sc = new JavaSparkContext(conf);
    
  • 2020-12-07 09:59
    JavaSparkContext sparkContext = new JavaSparkContext("local[4]", "Appname");
    

    export SPARK_LOCAL_IP=127.0.0.1
    

    Just doing the above worked for me.
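    Putting the environment-variable step together with launching the shell, a hypothetical terminal session (assuming the current directory is the Spark install root) could be:

```shell
# Bind the Spark driver to loopback for this shell session, then launch.
export SPARK_LOCAL_IP=127.0.0.1
./bin/spark-shell   # assumption: run from the spark-1.6.0-bin-hadoop2.4 directory
```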

  • 2020-12-07 10:01

    On a Mac, check your IP in System Preferences -> Network -> click the Wi-Fi you are connected to (it should show a green icon) -> the IP appears just above your network name.

    Make the following entries in ../conf/spark-env.sh :

    SPARK_MASTER_HOST=<<your-ip>>
    SPARK_LOCAL_IP=<<your-ip>>
    

    and then try spark-shell. Making the above changes worked for me.
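    A sketch of making those entries from the shell (the IP below is a placeholder I chose for illustration, not a value from this thread):

```shell
# Placeholder: substitute the IP shown in System Preferences -> Network.
MY_IP=192.168.1.10
# Append both settings to conf/spark-env.sh (unquoted EOF expands $MY_IP).
cat >> conf/spark-env.sh <<EOF
SPARK_MASTER_HOST=$MY_IP
SPARK_LOCAL_IP=$MY_IP
EOF
```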

  • 2020-12-07 10:06

    I built it from the current master branch with version 2.0.0-SNAPSHOT. After adding export SPARK_LOCAL_IP="127.0.0.1" to load-spark-env.sh it worked for me. I'm using macOS 10.10.5. So it could be a version issue?

  • 2020-12-07 10:08

    export SPARK_LOCAL_IP=127.0.0.1 (on a Mac, add this to .bash_profile)
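    A sketch of persisting this across sessions (assumes bash is the login shell, which was the macOS default at the time, so ~/.bash_profile is read on login):

```shell
# Append the export to ~/.bash_profile, then load it into the current shell.
echo 'export SPARK_LOCAL_IP=127.0.0.1' >> ~/.bash_profile
source ~/.bash_profile
```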
