Spark: Trying to run spark-shell, but get 'cmd' is not recognized as an internal or external command

盖世英雄少女心 2021-01-11 22:55

I'm trying to install Spark on my Windows desktop. Everything should work fine, but I get the error "'cmd' is not recognized as an internal or external command..."

10 answers
  • 2021-01-11 23:32

    My colleague solved the problem. Although Java seemed to work ok (ref. picture), the Java path Spark was trying to read was incorrect with an extra \bin at the end. When that was removed, Spark started working! @gonbe, thank you so much for your efforts to help!
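
    A minimal sketch of that fix (the JDK path below is hypothetical; use your own install location). JAVA_HOME should point at the JDK root, not at its bin subfolder:

        rem Wrong:  set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_231\bin
        rem Right:
        set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_231
        echo %JAVA_HOME%
        "%JAVA_HOME%\bin\java" -version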

  • 2021-01-11 23:32

    Check the Java JDK version and Scala version against the version compatibility table below:
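
    As a quick check (a sketch; it assumes java and scala are already on your PATH), you can print the installed versions from a command prompt:

        java -version
        scala -version
        rem spark-shell also prints its Spark and Scala versions in the startup banner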

  • 2021-01-11 23:34

    In my case I had a similar issue and had to fix a couple of things.

    1- Check that JAVA_HOME is correct in both places;

    2- Then I had to change the following lines in the spark-2.1.1-bin-hadoop2.7\bin folder:

    • Add extra quotation marks around "%RUNNER%", so it becomes ""%RUNNER%"" (see the sketch after this list).
    • Then execute .\spark-shell.cmd again.
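
    For reference, a rough sketch of that quoting change in spark-class2.cmd (the surrounding arguments are abridged and approximate):

        rem before
        "%RUNNER%" -Xmx128m -cp "%LAUNCH_CLASSPATH%" org.apache.spark.launcher.Main %*
        rem after: doubled quotes around %RUNNER%
        ""%RUNNER%"" -Xmx128m -cp "%LAUNCH_CLASSPATH%" org.apache.spark.launcher.Main %*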

  • 2021-01-11 23:35

    All my variables were OK, so I decided to debug the scripts. In "spark-class2.cmd" I found the line that needed another pair of quotation marks around "%RUNNER%".

    BEFORE: "%RUNNER%" -Xmx128m -cp "%LAUNCH_CLASSPATH%" ....

    AFTER: ""%RUNNER%"" -Xmx128m -cp "%LAUNCH_CLASSPATH%" ....
