PySpark "does not exist in the JVM" error when initializing SparkContext

一向 2021-01-07 22:32

I am using Spark on EMR and writing a PySpark script. I am getting an error when trying to run:

from pyspark import SparkContext
sc = SparkContext()
10 Answers
  •  逝去的感伤
    2021-01-07 22:57

    When I downloaded a new version via pip install from the Anaconda command prompt, I got the same issue.

    Putting this at the top of the code file solved my problem:

    import findspark
    findspark.init(r"c:\spark")  # raw string so the backslash in the Windows path is not treated as an escape

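    For context, a minimal sketch of the full workaround, assuming Spark is installed at c:\spark (adjust the path to your own installation). The key point is that findspark.init must run before any pyspark import, so Python can locate the Spark libraries the JVM gateway needs:

    import findspark
    findspark.init(r"c:\spark")  # must run BEFORE importing pyspark

    from pyspark import SparkContext

    sc = SparkContext()
    print(sc.version)  # quick sanity check that the JVM gateway started
    sc.stop()

    The "does not exist in the JVM" error typically means pyspark could not reach a matching Spark/py4j installation, which is why pointing findspark at the Spark home directory resolves it.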
