PySpark "does not exist in the JVM" error when initializing SparkContext

Backend · Unresolved · 10 answers · 1626 views
一向 · 2021-01-07 22:32

I am using Spark on EMR and writing a PySpark script. I am getting an error when trying to run:

from pyspark import SparkContext
sc = SparkContext()

10 Answers
  •  逝去的感伤 · 2021-01-07 22:45

    Try installing Spark 2.4.5 and setting SPARK_HOME to that version's installation path. I hit the same issue; after changing the version, it was resolved for me.
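    The "does not exist in the JVM" error usually means the pip-installed pyspark package and the Spark installation it talks to are different versions, which is why pinning both to 2.4.5 and pointing SPARK_HOME at that install can fix it. Below is a minimal sketch of that setup; the path /usr/lib/spark-2.4.5 is an assumption (use wherever Spark 2.4.5 is actually unpacked), and it relies on the findspark helper package (pip install findspark):

    import os

    # Point SPARK_HOME at the Spark 2.4.5 installation.
    # NOTE: hypothetical path -- substitute your actual install location.
    os.environ["SPARK_HOME"] = "/usr/lib/spark-2.4.5"

    # findspark prepends SPARK_HOME's python/ directories to sys.path,
    # so the pyspark package version matches the JVM side.
    import findspark
    findspark.init()

    from pyspark import SparkContext

    sc = SparkContext()
    print(sc.version)  # should report 2.4.5 if SPARK_HOME was picked up
    sc.stop()

    A quick way to confirm a mismatch before changing anything is to compare pyspark.__version__ in your script against the version reported by spark-submit --version on the cluster.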
