Spark: Trying to run spark-shell, but get 'cmd' is not recognized as an internal or external command

盖世英雄少女心 2021-01-11 22:55

I'm trying to install Spark on my Windows desktop. Everything should work fine, but I get an error "'cmd' is not recognized as an internal or external command... "

10 Answers
  •  囚心锁ツ
    2021-01-11 23:34

    In my case I had a similar issue, and I had to fix a couple of things:

    1- Check that JAVA_HOME is set correctly in both places;

    2- Then change the following lines in the spark-2.1.1-bin-hadoop2.7\bin folder:

    • Add extra quotation marks around "%RUNNER%", so it becomes ""%RUNNER%"".
    • Then execute .\spark-shell.cmd again.
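
    The quoting fix above typically matters when JAVA_HOME (and hence %RUNNER%) points to a path containing spaces, such as one under Program Files. A minimal POSIX-shell analogy (the path here is hypothetical, for illustration only, not Spark's actual script) shows why the unquoted form breaks:

    ```shell
    # Hypothetical illustration: why an unquoted runner path containing a
    # space fails with a "not recognized as a command" style error.
    RUNNER="/tmp/java home/bin/java"      # hypothetical path with a space
    mkdir -p "/tmp/java home/bin"
    printf '#!/bin/sh\necho ok\n' > "$RUNNER"
    chmod +x "$RUNNER"
    "$RUNNER"                             # quoted: runs the script, prints "ok"
    $RUNNER 2>/dev/null || echo "unquoted call failed"   # splits into two words
    ```

    The unquoted `$RUNNER` is split at the space into `/tmp/java` and `home/bin/java`, so the shell tries to run a program that does not exist; the same word-splitting happens in cmd.exe when %RUNNER% expands unquoted, which is what the extra quotation marks prevent.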
