I just installed pyspark 2.2.0 with conda (Python 3.6 on Windows 7 64-bit, Java 1.8):
$conda install pyspark
It downloaded and seemed to install correctly.
In my case, the problem was caused by a duplicate Spark path: remove the Spark path from your environment variables.
$pip uninstall pyspark
$pip install pyspark