Spark: Trying to run spark-shell, but get 'cmd' is not recognized as an internal or

盖世英雄少女心 2021-01-11 22:55

I'm trying to install Spark on my Windows desktop. Everything should work fine, but I get the error "'cmd' is not recognized as an internal or external command..."

10 Answers
  •  不知归路
    2021-01-11 23:28

    (I'm not a Windows Spark user.) The spark-shell.cmd launcher script for Windows expects the "cmd" command to be available on PATH.

    https://github.com/apache/spark/blob/master/bin/spark-shell.cmd

    Would you try adding the directory that contains "cmd.exe" to the PATH environment variable? The directory location is shown in the title bar of your screenshot, and environment variables can be set via the Control Panel.
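
    Before editing PATH, it can help to confirm whether the command is actually resolvable. A minimal, hedged sketch (not from the original answer) using Python's `shutil.which`, which mimics the lookup the shell performs; the `C:\Windows\System32` location mentioned in the comment is only the usual Windows default, not something confirmed by the question:

    ```python
    import shutil

    def command_on_path(name):
        """Return the resolved path of `name` if it is found on PATH, else None."""
        return shutil.which(name)

    # On Windows, cmd.exe normally lives in C:\Windows\System32 (an assumption;
    # confirm the actual location, e.g. from the title bar of your console window).
    # If this prints None, that directory is missing from PATH and spark-shell.cmd
    # will fail with "'cmd' is not recognized...".
    print(command_on_path("cmd"))
    ```

    If the check returns None, add the directory containing cmd.exe to PATH (System Properties → Environment Variables) and open a new console so the change takes effect.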
