I tried to start Spark 1.6.0 (spark-1.6.0-bin-hadoop2.4) on Mac OS Yosemite 10.10.5 using
"./bin/spark-shell".
It fails with the error below.
I always get that when switching between networks. This solves it:
$ sudo hostname -s 127.0.0.1
If you run Spark from an IDE, just set spark.driver.host
to localhost:
SparkConf conf = new SparkConf()
        .setMaster("local[2]")
        .setAppName("AnyName")
        .set("spark.driver.host", "localhost");
JavaSparkContext sc = new JavaSparkContext(conf);

Or, with the two-argument constructor:

JavaSparkContext sparkContext = new JavaSparkContext("local[4]", "Appname");
export SPARK_LOCAL_IP=127.0.0.1
Just doing the above worked for me.
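As a sketch of that workflow (SPARK_LOCAL_IP is Spark's own variable; the echo is just a sanity check I added): export the loopback address so the driver binds locally, then confirm the variable is set in the environment spark-shell will inherit.

```shell
# Force Spark to bind to loopback instead of the (possibly unresolvable)
# machine hostname, then verify the setting before launching spark-shell.
export SPARK_LOCAL_IP=127.0.0.1
echo "SPARK_LOCAL_IP=$SPARK_LOCAL_IP"
```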
On a Mac, check the IP in System Preferences -> Network -> click the Wi-Fi you are connected to (it should show a green icon) -> the IP is shown just above your network name.
Make the following entries in ../conf/spark-env.sh:
SPARK_MASTER_HOST=<<your-ip>>
SPARK_LOCAL_IP=<<your-ip>>
and then try spark-shell. Making the above changes worked for me.
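The manual lookup above can also be scripted. This is a sketch under two assumptions: your Wi-Fi is on interface en0 (ipconfig getifaddr is a macOS-only command), and spark-env.sh lives in the usual conf/ directory.

```shell
# Look up the current Wi-Fi address instead of copying it from
# System Preferences (assumes the Wi-Fi interface is en0, macOS only).
MYIP="$(ipconfig getifaddr en0)"
# Append both settings to Spark's environment file.
echo "SPARK_MASTER_HOST=$MYIP" >> conf/spark-env.sh
echo "SPARK_LOCAL_IP=$MYIP"    >> conf/spark-env.sh
```

Note you would have to redo this after switching networks, since the address changes; the SPARK_LOCAL_IP=127.0.0.1 answers above avoid that.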
I've built it from the current master branch, version 2.0.0-SNAPSHOT. After adding
export SPARK_LOCAL_IP="127.0.0.1"
to load-spark-env.sh, it worked for me. I'm using Mac OS 10.10.5, so could it be a version issue?
export SPARK_LOCAL_IP=127.0.0.1 (on a Mac, add this to .bash_profile)