I want to install Spark in Standalone mode on a cluster with my two virtual machines. With spark-0.9.1-bin-hadoop1, I can run spark-shell successfully on each VM.
I had faced the same issue. You can resolve it with the following procedure.

First, go to the /etc/hosts file and comment out the 127.0.1.1 entry.
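On Ubuntu-style systems that entry usually looks like the sketch below; the hostname shown is just a placeholder for whatever your VM is called. If it is left in place, the Spark master may bind to the loopback address and the worker on the other VM cannot reach it:

    127.0.0.1       localhost
    # comment out this loopback alias for the machine's own hostname
    # 127.0.1.1     my-vm-hostname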
Then go to the spark/sbin directory and start the standalone daemons with

    ./start-all.sh

or use ./start-master.sh and ./start-slave.sh for the same result.
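If you start the daemons separately on the two VMs, a minimal sketch (run from spark/sbin) looks like this; master-vm and port 7077 are placeholders, and the real master URL is printed in the master's log and on its web UI (port 8080 by default). Depending on your Spark version, start-slave.sh may also expect a worker number as its first argument:

    # on the VM chosen as the master
    ./start-master.sh

    # on the other VM, attach a worker to that master
    ./start-slave.sh spark://master-vm:7077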
Now, if you run spark-shell, pyspark, or any other Spark component, it will automatically create the SparkContext object sc for you.
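For example, to point spark-shell at the standalone cluster instead of local mode (the master URL is again a placeholder; with spark-0.9.1 the master is chosen through the MASTER environment variable, while newer releases use the --master flag):

    # run from the Spark home directory on either VM
    MASTER=spark://master-vm:7077 ./bin/spark-shell

    # once the shell is up, sc already points at the cluster, e.g.
    #   scala> sc.parallelize(1 to 1000).count()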