Spark 2.1 - Error While instantiating HiveSessionState

借酒劲吻你 2020-12-06 06:38

With a fresh install of Spark 2.1, I am getting an error when executing the pyspark command.

Traceback (most recent call last):
File \"/usr/local/spark/pytho         


        
10 Answers
  • 2020-12-06 07:09

    I was getting this error trying to run pyspark and spark-shell when my HDFS wasn't started.

  • 2020-12-06 07:13

    I was getting the same error in a Windows environment, and the trick below worked for me.

    In shell.py, the Spark session is defined with .enableHiveSupport():

     spark = SparkSession.builder\
                .enableHiveSupport()\
                .getOrCreate()
    

    Remove the Hive support and redefine the Spark session as below:

    spark = SparkSession.builder\
            .getOrCreate()
    

    You can find shell.py in your Spark installation folder; for me it's in "C:\spark-2.1.1-bin-hadoop2.7\python\pyspark".
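
    If you would rather not patch shell.py, the same idea works in your own script. Here is a minimal sketch, assuming only that you build the session yourself (the app name is a placeholder):

        from pyspark.sql import SparkSession

        # Build a session without Hive support, so HiveSessionState
        # is never instantiated.
        spark = SparkSession.builder \
            .appName("no-hive-session") \
            .getOrCreate()

        spark.range(5).show()  # quick sanity check that the session works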

    Hope this helps

  • 2020-12-06 07:13

    You are missing the spark-hive jar.

    For example, if you are running on Scala 2.11 with Spark 2.1, you can use this jar:

    https://mvnrepository.com/artifact/org.apache.spark/spark-hive_2.11/2.1.0
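
    As a sketch of one way to pull that jar in from PySpark without editing the classpath by hand, you can pass the Maven coordinates from the link above through Spark's spark.jars.packages property. Note this only takes effect if set before the driver JVM starts, so it suits a standalone script rather than an already-running shell:

        from pyspark.sql import SparkSession

        # Ask Spark to resolve the Hive support jar from Maven at startup.
        spark = SparkSession.builder \
            .config("spark.jars.packages",
                    "org.apache.spark:spark-hive_2.11:2.1.0") \
            .enableHiveSupport() \
            .getOrCreate()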

  • 2020-12-06 07:16

    Project location and file permissions can be the issue. I observed this error happening in spite of changes to my pom file. Then I moved my project to a user directory where I have full permissions, and that solved my issue.

  • 2020-12-06 07:19

    I removed ".enableHiveSupport()\" from the shell.py file and it's working perfectly.

    Before:

        spark = SparkSession.builder\
                .enableHiveSupport()\
                .getOrCreate()

    After:

        spark = SparkSession.builder\
                .getOrCreate()

  • 2020-12-06 07:21

    I saw this error on a new (2018) Mac, which came with Java 10. The fix was to set JAVA_HOME to Java 8:

    export JAVA_HOME=`/usr/libexec/java_home -v 1.8`
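
    To confirm which Java version the driver JVM actually picked up, one rough check is to ask the JVM behind an active session for its java.version property. This goes through py4j's _jvm gateway, an internal API, so treat it as a debugging aid only:

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.getOrCreate()
        # java.version should report 1.8.x once JAVA_HOME points at Java 8.
        print(spark.sparkContext._jvm.java.lang.System.getProperty("java.version"))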
