I have installed the following setup: Hadoop 1.0.3, Java 1.7.0_67, Scala 2.11.7, Spark 2.1.1.
I am getting the error below. Can anyone help?
Add SPARK_LOCAL_IP in load-spark-env.sh as:
export SPARK_LOCAL_IP="127.0.0.1"
The load-spark-env.sh file is located in the spark/bin directory.
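As a minimal sketch, you can append the line from the command line, assuming Spark is installed under /usr/local/spark (that path is an assumption; adjust it to your install):
# Append SPARK_LOCAL_IP to load-spark-env.sh (assumed install path /usr/local/spark)
echo 'export SPARK_LOCAL_IP="127.0.0.1"' >> /usr/local/spark/bin/load-spark-env.sh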
Or you can add your hostname to the /etc/hosts file as:
127.0.0.1 hostname
You can get your hostname by typing hostname in the terminal.
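For example, the resulting /etc/hosts entry might look like this (my-machine is a hypothetical hostname; use the output of the hostname command instead):
# /etc/hosts — map the machine's hostname to the loopback address
127.0.0.1   localhost
127.0.0.1   my-machine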
Hope this solves the issue!
There are a few different solutions; a consolidated sketch of the whole sequence follows the steps below.
1. Get your hostname:
$ hostname
then try to assign your hostname to the loopback address:
$ sudo hostname -s 127.0.0.1
and start spark-shell.
2. Add your hostname to your /etc/hosts file (if not present):
127.0.0.1 your_hostname
3. Add the env variable
export SPARK_LOCAL_IP="127.0.0.1"
to load-spark-env.sh.
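Here is the sequence in one place as a hedged sketch, assuming a POSIX shell and that load-spark-env.sh lives under $SPARK_HOME/bin ($SPARK_HOME is an assumption about your layout):
# 1. Check the current hostname
hostname
# 2. Point the hostname at the loopback address for this session
sudo hostname -s 127.0.0.1
# 3. Make SPARK_LOCAL_IP explicit for Spark's startup scripts
echo 'export SPARK_LOCAL_IP="127.0.0.1"' >> "$SPARK_HOME/bin/load-spark-env.sh"
# 4. Start the shell and see whether the binding error is gone
spark-shell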
The steps above solved my problem, but you can also try adding
export SPARK_LOCAL_IP=127.0.0.1
under the comment for the local IP in the template file spark-env.sh.template (in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/), and then:
cp spark-env.sh.template spark-env.sh
spark-shell
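As one sketch of that, using the Homebrew path from above (the echo line is just an alternative to editing the copied file in an editor):
cd /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/
cp spark-env.sh.template spark-env.sh
# Set the local IP explicitly instead of opening the file by hand
echo 'export SPARK_LOCAL_IP=127.0.0.1' >> spark-env.sh
spark-shell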
If none of the above fixes it, check your firewall and enable it if it is not already enabled.
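For example, on macOS (which the Homebrew path above suggests) you can check the application firewall state with the built-in tool below; on Linux the equivalent check depends on your distribution:
# Print whether the macOS application firewall is on or off
/usr/libexec/ApplicationFirewall/socketfilterfw --getglobalstate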
I had a similar issue in IntelliJ.
Reason: I was on a Cisco AnyConnect VPN.
Fix: after disconnecting from the VPN, the issue did not appear.
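A quick way to see why a VPN can trigger this (a hedged sketch; works in any POSIX shell) is to check which address your hostname resolves to while connected:
# Shows the IP the hostname currently resolves to; a VPN often remaps this
ping -c 1 "$(hostname)"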
Run hostname to have a look at your current hostname, then open /etc/hosts (for example with vim /etc/hosts) and map the hostname you just got to your exact IP or to 127.0.0.1.
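To confirm the mapping took effect, a minimal check (assuming a Linux system with getent; on macOS you can use ping instead):
# Resolve the hostname through the system resolver, which includes /etc/hosts
getent hosts "$(hostname)"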