I have installed the setup below with these versions: Hadoop 1.0.3, Java 1.7.0_67, Scala 2.11.7, Spark 2.1.1.
I am getting the error below. Can anyone help?
There are a few different solutions:
Get your hostname
$ hostname
Then try to assign your hostname:
$ sudo hostname -s 127.0.0.1
Start spark-shell
Add your hostname to your /etc/hosts file (if not present)
127.0.0.1 your_hostname
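To confirm the /etc/hosts entry took effect, a quick sanity check (not part of the original steps, just standard commands):
$ grep "$(hostname)" /etc/hosts    # should print the 127.0.0.1 line you added
$ ping -c 1 "$(hostname)"          # should resolve to 127.0.0.1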
Add the environment variable to load-spark-env.sh:
export SPARK_LOCAL_IP="127.0.0.1"
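If you are unsure where that file lives, here is a minimal sketch for appending the line, assuming a Homebrew install under the Cellar path mentioned later in this answer (load-spark-env.sh normally sits in Spark's bin/ directory; adjust SPARK_HOME to your own installation and version):
# Assumed install location; adjust for your setup
SPARK_HOME=/usr/local/Cellar/apache-spark/2.1.0/libexec
echo 'export SPARK_LOCAL_IP="127.0.0.1"' >> "$SPARK_HOME/bin/load-spark-env.sh"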
The above steps solved my problem, but you can also try adding
export SPARK_LOCAL_IP=127.0.0.1
under the comment for the local IP in the template file spark-env.sh.template
(in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/; see the sketch after these commands)
and then
cp spark-env.sh.template spark-env.sh
spark-shell
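After the copy, the relevant part of conf/spark-env.sh would look roughly like this (the comment wording is paraphrased from the template, not quoted exactly):
# Options read when launching programs locally with ./bin/run-example or ./bin/spark-submit
# - SPARK_LOCAL_IP, to set the IP address Spark binds to on this node
export SPARK_LOCAL_IP=127.0.0.1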
If none of the above fixes it, check your firewall and enable it if it is not already enabled.
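On macOS (which the Homebrew path above suggests), one way to check is the built-in application firewall tool; treat the exact path and flags below as an assumption to verify on your system:
$ /usr/libexec/ApplicationFirewall/socketfilterfw --getglobalstate          # shows whether the firewall is enabled
$ sudo /usr/libexec/ApplicationFirewall/socketfilterfw --setglobalstate on  # turn it on if it is off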