I am using Spark on EMR and writing a PySpark script. I am getting an error when trying to:
from pyspark import SparkContext
sc = SparkContext()
Instead of editing the environment variables, you can simply ensure that the Python environment (the one with pyspark) has the same py4j version as the zip file present in the \python\lib\ directory within your Spark folder, e.g., d:\Programs\Spark\python\lib\py4j-0.10.7-src.zip on my system, for Spark 2.3.2. That is the py4j version shipped as part of the Spark archive file.
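To find out which py4j version your Spark distribution bundles, you can inspect that directory programmatically. A small sketch (the helper name is mine; it assumes the standard `python/lib/py4j-<version>-src.zip` layout described above):

```python
import glob
import os
import re


def bundled_py4j_version(spark_home):
    """Return the py4j version shipped with a Spark install (e.g. '0.10.7'),
    or None if no py4j zip is found under <spark_home>/python/lib/."""
    zips = glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip"))
    if not zips:
        return None
    match = re.search(r"py4j-(.+)-src\.zip", os.path.basename(zips[0]))
    return match.group(1) if match else None
```

Once you know the version, install the matching package into the same environment, e.g. `pip install py4j==0.10.7` for Spark 2.3.2.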