Question
Getting this error while trying to execute any Hadoop-related cmd operations:

```
Error: JAVA_HOME is incorrectly set.
       Please update C:\Hadoop\hadoop-2.7.2\conf\hadoop-env.cmd
'-Xmx512m' is not recognized as an internal or external command,
operable program or batch file.
```
My JAVA_HOME is set to C:\Program Files (x86)\Java\jdk1.8.0_91 in the environment variables. I've also edited C:\Hadoop\hadoop-2.7.2\etc\hadoop-env.sh and set JAVA_HOME to the same value there.
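For context, the root cause (as the answers below point out) is the space and parentheses in "Program Files (x86)": the Hadoop batch scripts expand %JAVA_HOME% without quotes, so the path gets split apart when the java command line is built. A quick diagnostic sketch from a cmd window:

```cmd
:: Show what the Hadoop scripts will actually see.
echo %JAVA_HOME%

:: Verify the JDK itself is reachable. The quotes make THIS call safe,
:: but Hadoop's own .cmd scripts expand the variable unquoted.
"%JAVA_HOME%\bin\java" -version
```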
Answer 1:
Spacing is the problem here. Install Java under C:\java\jdk instead of C:\Program Files (x86)\Java\jdk1.8.0_91. This worked for me on Windows 8.1.
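If you move the JDK, the environment variable has to follow it. A minimal sketch, assuming the JDK now lives at C:\java\jdk (adjust to your actual install path):

```cmd
:: Point JAVA_HOME at the space-free JDK location for the current user.
:: setx writes the variable permanently, but it does NOT update the
:: session it runs in - open a new cmd window afterwards.
setx JAVA_HOME "C:\java\jdk"

:: Confirm from the new window:
echo %JAVA_HOME%
```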
Answer 2:
Please try editing hadoop-env.cmd (the Windows command script) instead of the hadoop-env.sh file, and set JAVA_HOME there.

Explanation:
- Make sure your JDK path contains no space (for example, C:\Program Files\java fails because "Program Files" contains a space, which cannot be parsed by Hadoop). If there is a space in the JDK path, install Java to some other path with no space in the path name.
- Right-click hadoop-env.cmd and edit it with Notepad.
- Set JAVA_HOME (for example: set JAVA_HOME=C:\java), as in the sketch below.
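A minimal sketch of the relevant edit, assuming Hadoop 2.7.2's default layout and a JDK installed at C:\java (both paths are illustrative):

```cmd
@rem In %HADOOP_HOME%\etc\hadoop\hadoop-env.cmd (the error message above
@rem references a conf\ copy - edit whichever one your distribution uses).

@rem The JDK path must contain no spaces or parentheses, because the
@rem Hadoop batch scripts expand this variable without quotes.
set JAVA_HOME=C:\java
```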
This worked for me. I've installed Apache Spark on Windows with Scala and the Scala IDE (Eclipse with a Maven project as the Scala IDE), solved the winutils error, and finally solved this error to make Spark work on my Windows machine. Please feel free to ask about any doubts regarding these steps.
Answer 3:
Set JAVA_HOME in hadoop-env.cmd (on Windows 7) to the JDK location, with no spaces in the path. I faced this issue too.

Initially the JDK path was: C:\Program Files\Java\jdk1.8.0_144
Replaced with: C:\Java\jdk1.8.0_144

Now Hadoop starts properly from CMD.
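If reinstalling the JDK is not an option, a common workaround (my addition, not part of this answer) is to use the 8.3 short name of the Program Files directory, which contains no space. A sketch, reusing the jdk1.8.0_144 path from this answer:

```cmd
:: List the 8.3 short names to confirm what "Program Files" maps to on
:: your machine (usually PROGRA~1; requires 8.3 name generation enabled).
dir /x C:\

:: Then, in hadoop-env.cmd, use the short form instead of moving the JDK.
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_144
```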
Source: https://stackoverflow.com/questions/38077006/error-while-installing-hadoop-2-7-2-on-windows-10