PySpark "does not exist in the JVM" error when initializing SparkContext

一向 · 2021-01-07 22:32

I am using Spark on EMR and writing a PySpark script. I get an error when I try to run:

from pyspark import SparkContext
sc = SparkContext()
10 Answers
• 小蘑菇 (OP) · 2021-01-07 22:50

To put it simply: Python and Java can't talk to each other because the medium they speak through (py4j) is mismatched between the two sides. That's it. I had the same issue, and all the answers above are valid and will work if you apply them correctly. Either define an environment variable that tells both sides which py4j to use, or uninstall and reinstall so that everything ends up on the same version.
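As a concrete illustration of the environment-variable route, here is a minimal sketch. It assumes Spark lives under /usr/lib/spark (the usual location on EMR) and that the bundled py4j zip sits under $SPARK_HOME/python/lib; adjust both to match your installation.

import glob
import os
import sys

# Point at the cluster's Spark installation; /usr/lib/spark is an assumption
# based on the default EMR layout -- change it if yours differs.
spark_home = os.environ.get("SPARK_HOME", "/usr/lib/spark")
os.environ["SPARK_HOME"] = spark_home

# Put Spark's own python/ directory and its bundled py4j zip first on
# sys.path, so the pyspark we import matches what the JVM side expects.
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.extend(glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip")))

from pyspark import SparkContext

sc = SparkContext()
print(sc.version)  # should now start without the "does not exist in the JVM" error
sc.stop()

The third-party findspark package (pip install findspark, then findspark.init() before importing pyspark) does the same path wiring automatically. The reinstall route amounts to pinning the pip-installed pyspark to the same version that spark-submit --version reports on the cluster.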
