I'm trying to install Spark on my Windows desktop. Everything should work fine, but I get the error "'cmd' is not recognized as an internal or external command..."
I was getting the same error while executing spark-shell in the command prompt.
I tried everything mentioned above but was not able to resolve the issue.
So, at last, I added "C:\Windows\System32" to the 'PATH' system variable, and it worked.
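The same fix can be applied from the command line instead of the System Properties dialog. This is a sketch: `setx` here modifies the *user* PATH (the system PATH needs `setx /M` from an elevated prompt), and it only affects prompts opened afterwards:

```shell
:: Append C:\Windows\System32 to the user PATH
:: (note: setx changes take effect only in NEW command prompt windows)
setx PATH "%PATH%;C:\Windows\System32"
```

Close the current prompt and open a new one before retrying spark-shell.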
Check the value of JAVA_HOME and make sure it points to the correct location. Add %JAVA_HOME%\bin to the PATH value. After the modification, close the command prompt and restart it. Type spark-shell and it will run.
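To verify the setup described above, you can run a couple of quick checks in a fresh command prompt; these are standard diagnostics, not Spark-specific commands:

```shell
:: Should print the JDK root directory, without a trailing \bin
echo %JAVA_HOME%

:: Should succeed once %JAVA_HOME%\bin is on PATH
java -version
```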
I had a similar error. I fixed it after making the following changes:
Now it works.
Thanks guys.
These are the detailed steps to resolve all these issues on Windows. You need three things:
Spark binaries
Winutils
Java
It can be inside a folder with a space, like C:\Program Files\AdoptOpenJDK\jdk-8.0.262.10-hotspot. Set JAVA_HOME to C:\Program Files\AdoptOpenJDK\jdk-8.0.262.10-hotspot. Please note that "bin" is not part of the path.
Set "path" for the user and system to include %JAVA_HOME%\bin.
With the above setup, a user should be able to just type "spark-shell" and the Spark shell should start.
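The Java steps above can be sketched as two `setx` calls; the JDK path is the example one from this answer and will likely differ on your machine, and `setx` without `/M` writes the user-level variables only:

```shell
:: Point JAVA_HOME at the JDK root (no \bin suffix); example path only
setx JAVA_HOME "C:\Program Files\AdoptOpenJDK\jdk-8.0.262.10-hotspot"

:: Put the JDK's bin directory on the user PATH
setx PATH "%PATH%;%JAVA_HOME%\bin"

:: Open a NEW command prompt for the changes to take effect,
:: then run: spark-shell
```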
(I'm not a Windows Spark user.) The spark-shell.cmd source for Windows expects the "cmd" command to be available in PATH:
https://github.com/apache/spark/blob/master/bin/spark-shell.cmd
Would you try adding the directory that contains "cmd.exe" to the PATH environment variable? The directory location is shown in the title bar in your screenshot, and environment variables can be set via the Control Panel.
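A quick way to confirm this diagnosis before touching anything: ask Windows where it resolves cmd from. This is a standard check, not Spark-specific:

```shell
:: Prints the resolved location (normally C:\Windows\System32\cmd.exe);
:: if this fails with "not recognized", cmd.exe is indeed missing from PATH
where cmd
```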
I had the same issue (launched spark-shell and got "the system cannot find the path"). After following the above process (changing SPARK_HOME to exclude the /bin), it worked fine. Thanks for sharing, guys.