I just installed PySpark 2.2.0 using conda (with Python v3.6 on Windows 7 64-bit, Java v1.8):
$conda install pyspark
It downloaded and seemed to install correctly.
export PYSPARK_PYTHON=python3.5
This worked for me when I was having PATH issues. Hope it helps. If not, check out your config files.
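Note that export is a Linux/macOS shell command; on Windows the equivalent in the command prompt is set PYSPARK_PYTHON=python3.5. If you prefer not to touch the system environment at all, here is a rough Python-side sketch of the same idea; the interpreter path C:\Python36\python.exe is only an illustrative assumption, so substitute the python.exe you actually use:

import os

# Point both the driver and the workers at the same interpreter.
# The path below is just an example; replace it with your own python.exe.
os.environ["PYSPARK_PYTHON"] = r"C:\Python36\python.exe"
os.environ["PYSPARK_DRIVER_PYTHON"] = r"C:\Python36\python.exe"

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("python-env-check").getOrCreate()
print(spark.version)
spark.stop()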
PySpark from PyPI (i.e. installed with pip or conda) does not contain the full PySpark functionality; it is only intended for use with a Spark installation in an already existing cluster, in which case you might want to avoid downloading the whole Spark distribution. From the docs:
The Python packaging for Spark is not intended to replace all of the other use cases. This Python packaged version of Spark is suitable for interacting with an existing cluster (be it Spark standalone, YARN, or Mesos) - but does not contain the tools required to setup your own standalone Spark cluster. You can download the full version of Spark from the Apache Spark downloads page.
If you intend to work in the PySpark shell, I suggest you download Spark as described above (PySpark is an integral part of it).
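Either way, a quick local-mode smoke test tells you whether your installation (pip/conda package or full distribution) is usable at all. This is only a sketch and assumes Java and the pyspark package are already set up on the machine:

from pyspark.sql import SparkSession

# Start a local Spark session; no cluster is needed for this check.
spark = SparkSession.builder.master("local[*]").appName("smoke-test").getOrCreate()

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.show()

spark.stop()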
In my case, the problem was caused by a duplicate path. Remove the Spark path from your environment variables, then reinstall:
pip uninstall pyspark
pip install pyspark
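After the reinstall you can confirm that Python resolves a single copy of the package, for example:

import pyspark

# Both lines should point at exactly one installation, e.g. your conda env's site-packages.
print(pyspark.__version__)
print(pyspark.__file__)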
It seems to be a Java path problem.
I had the same issue and got exactly the same error output; in my case the JAVA_HOME path was not set.
Check this page and see Moustafa Mahmoud's answer. I only had to take care of suggestion 1 (the JAVA_HOME variable).
See this link as well.
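If you just want to verify quickly that JAVA_HOME is the culprit, you can set it for the current Python process before starting Spark. This is only a sketch; the JDK path below is an assumption and must be replaced with the directory where your Java 1.8 is actually installed:

import os

# Hypothetical JDK 1.8 location; replace it with your real Java install directory.
os.environ["JAVA_HOME"] = r"C:\Program Files\Java\jdk1.8.0_144"
os.environ["PATH"] = os.path.join(os.environ["JAVA_HOME"], "bin") + os.pathsep + os.environ["PATH"]

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("java-home-check").getOrCreate()
print(spark.version)
spark.stop()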