With a fresh install of Spark 2.1, I am getting an error when executing the pyspark command.
Traceback (most recent call last):
File \"/usr/local/spark/pytho
I was getting this error trying to run pyspark and spark-shell when my HDFS wasn't started.
I was getting the same error in a Windows environment, and the trick below worked for me.
In shell.py, the Spark session is defined with .enableHiveSupport():

spark = SparkSession.builder\
    .enableHiveSupport()\
    .getOrCreate()
Remove the Hive support and redefine the Spark session as below:

spark = SparkSession.builder\
    .getOrCreate()
You can find shell.py in your Spark installation folder; for me it is in "C:\spark-2.1.1-bin-hadoop2.7\python\pyspark".

Hope this helps.
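If you want to confirm the fix before (or instead of) patching shell.py, you can try the same session definition from a standalone script. This is only a minimal sketch; the application name "no-hive-check" and the trivial query are just illustrative:

from pyspark.sql import SparkSession

# Build the session the same way the patched shell.py does: no .enableHiveSupport().
# "no-hive-check" is only an illustrative application name.
spark = SparkSession.builder\
    .appName("no-hive-check")\
    .getOrCreate()

spark.range(5).show()   # trivial query to confirm the session is usable
spark.stop()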
You are missing the spark-hive jar.
For example, if you are running Spark 2.1 built against Scala 2.11, you can use this jar:
https://mvnrepository.com/artifact/org.apache.spark/spark-hive_2.11/2.1.0
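Instead of copying the jar by hand, you can also let Spark resolve it for you. The snippet below is only a sketch: it assumes network access to Maven Central, and it relies on spark.jars.packages being read when the context is first created, so it only helps a fresh application:

from pyspark.sql import SparkSession

# Sketch only: resolve the spark-hive artifact at startup instead of placing
# the jar manually. The coordinates match the Scala 2.11 / Spark 2.1.0 artifact
# linked above. spark.jars.packages is only applied when the context is created.
spark = SparkSession.builder\
    .config("spark.jars.packages", "org.apache.spark:spark-hive_2.11:2.1.0")\
    .enableHiveSupport()\
    .getOrCreate()

For the interactive shells, the equivalent is passing --packages org.apache.spark:spark-hive_2.11:2.1.0 on the pyspark or spark-shell command line.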
Project location and file permissions can also be the issue. I observed this error happening in spite of changes to my pom file. I then moved my project to a directory under my user folder, where I have full permissions, and this solved my issue.
I removed ".enableHiveSupport()\" from the shell.py file and it is working perfectly.

Before:
spark = SparkSession.builder\
    .enableHiveSupport()\
    .getOrCreate()

After:
spark = SparkSession.builder\
    .getOrCreate()
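To check which catalog the running session actually ended up with, a small hedged check is shown below; it assumes the spark.sql.catalogImplementation setting can be read back from the session, and the "in-memory" argument is just a fallback for the lookup:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
# "hive" means Hive support is active; "in-memory" means the plain catalog is in use.
print(spark.conf.get("spark.sql.catalogImplementation", "in-memory"))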
I saw this error on a new (2018) Mac, which came with Java 10. The fix was to set JAVA_HOME to Java 8:

export JAVA_HOME=`/usr/libexec/java_home -v 1.8`