I'm trying to install Spark on my Windows desktop. Everything should work fine, but I get the error "'cmd' is not recognized as an internal or external command..."
My colleague solved the problem. Although Java seemed to work OK (see the picture), the Java path Spark was trying to read was incorrect, with an extra \bin at the end. Once that was removed, Spark started working! @gonbe, thank you so much for your efforts to help!
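If you hit the same problem, a quick way to check is from a Command Prompt. The JDK path below is only an illustration; yours will differ:

    rem Print the current value; it should stop at the JDK root, not end in \bin
    echo %JAVA_HOME%

    rem Correct shape (illustrative path, adjust to your JDK install);
    rem "set" only affects this window - use the System Properties dialog or setx for a permanent change
    set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_201"

    rem Wrong shape: the trailing \bin makes Spark's launcher look for ...\bin\bin\java
    rem set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_201\bin"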
Check that your Java JDK version and Scala version are compatible with your Spark version; each Spark release documents the Java and Scala versions it supports.
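A quick way to see which versions you actually have on the PATH (the commands are standard; the output will vary with your install):

    rem JDK version in use
    java -version
    rem Scala version, if Scala is installed separately from Spark
    scala -version
    rem Spark build banner; it also prints the Scala version Spark was compiled against
    spark-submit --version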
In my case I had a similar issue and had to fix a couple of things.
1- Check that JAVA_HOME is correct in both places (see the check after this list);
2- Then I had to change the following lines in the spark-2.1.1-bin-hadoop2.7\bin folder.
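For point 1, here is a minimal check from a Command Prompt, assuming "both places" means the user-level and the system-level environment variables (the registry keys below are the standard locations Windows stores them in):

    rem Value seen by the current session
    echo %JAVA_HOME%
    rem Value stored for the current user
    reg query HKCU\Environment /v JAVA_HOME
    rem Value stored system-wide
    reg query "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Environment" /v JAVA_HOME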
All my variables were OK, so I decided to debug the scripts. In "spark-class2.cmd" I added another pair of quotation marks around "%RUNNER%".

BEFORE:

    "%RUNNER%" -Xmx128m -cp "%LAUNCH_CLASSPATH%" ...

AFTER:

    ""%RUNNER%"" -Xmx128m -cp "%LAUNCH_CLASSPATH%" ...
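If you want to try the same edit, this sketch locates the relevant line and verifies the result afterwards; the install path is hypothetical, adjust it to wherever you unpacked Spark:

    rem Hypothetical install location - change it to your own Spark directory
    cd /d C:\spark-2.1.1-bin-hadoop2.7\bin
    rem Show the launcher line(s) that reference RUNNER in spark-class2.cmd
    findstr /n "RUNNER" spark-class2.cmd
    rem After adding the extra quotes, check that Spark starts
    spark-shell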