I am using Spark on EMR and writing a PySpark script, and I am getting an error when trying to run:
from pyspark import SparkContext
sc = SparkContext()
When I install a newer version with pip from the Anaconda command prompt, I get the same issue.
When I put the following at the top of the code file:

import findspark
findspark.init("c:\spark")

this code solved my problem.
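For reference, here is a minimal sketch of how the fix fits together at the top of the script. The c:\spark path is just my local install location; adjust it to wherever Spark is extracted on your machine.

import findspark
findspark.init(r"c:\spark")   # point findspark at the local Spark install (raw string avoids backslash escapes)

from pyspark import SparkContext

sc = SparkContext()           # the context now starts without the import error
print(sc.version)             # quick sanity check that the context works
sc.stop()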