Error while installing Hadoop 2.7.2 on Windows 10

Posted by 落爺英雄遲暮 on 2019-12-23 01:52:56

Question


Getting this error while trying to execute any hadoop related cmd operations:

```
Error: JAVA_HOME is incorrectly set.
Please update C:\Hadoop\hadoop-2.7.2\conf\hadoop-env.cmd
'-Xmx512m' is not recognized as an internal or external command,
operable program or batch file.
```

My JAVA_HOME is set to C:\Program Files (x86)\Java\jdk1.8.0_91 in the environment variables. I've also changed C:\Hadoop\hadoop-2.7.2\etc\hadoop-env.sh and set JAVA_HOME to the same value there.


Answer 1:


The spacing is the problem here. Install Java to C:\java\jdk instead of C:\Program Files (x86)\Java\jdk1.8.0_91. This worked for me on Windows 8.1.
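If reinstalling Java is not practical, Windows 8.3 short names offer an alternative way to get a space-free path. This is a sketch; PROGRA~1 / PROGRA~2 are the typical defaults for the two Program Files directories, but the actual short names on a given machine must be confirmed with `dir /x`:

```
:: List the 8.3 short names of the top-level directories:
dir /x C:\

:: Typically "Program Files" maps to PROGRA~1 and
:: "Program Files (x86)" maps to PROGRA~2, so a JDK under
:: "Program Files (x86)" can be referenced without spaces:
set JAVA_HOME=C:\PROGRA~2\Java\jdk1.8.0_91
```

Because the short name contains no spaces, Hadoop's batch scripts can parse it even though the JDK still lives under Program Files (x86).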




Answer 2:


Please try editing hadoop-env.cmd (the Windows command script) instead of the hadoop-env.sh file, and set the JAVA_HOME path there.

Explanation:

- Ensure your JDK path does not contain a space (for example, C:\Program Files\java: here "Program Files" contains a space, which Hadoop cannot parse). If there is a space in the JDK path, install Java in some other path with no space in its name.
- Right-click hadoop-env.cmd and edit it with Notepad.
- Set JAVA_HOME (for example: set JAVA_HOME=C:\java).
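Concretely, the edit amounts to replacing the JAVA_HOME line in hadoop-env.cmd. A minimal sketch, assuming Java was installed to C:\java (a hypothetical space-free location):

```
@rem In %HADOOP_HOME%\etc\hadoop\hadoop-env.cmd, replace the default
@rem "set JAVA_HOME=%JAVA_HOME%" line with an explicit, space-free path:
set JAVA_HOME=C:\java
```

Note that cmd `set` syntax takes no spaces around `=` and no quotes; adding either is a common source of the same "incorrectly set" error.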

This worked for me. I've installed Apache Spark on Windows with Scala and the Scala IDE (using an Eclipse Maven project as the Scala IDE), solved the winutils error, and finally solved this error to make Spark work on my Windows machine. Please feel free to ask about any doubts regarding these steps.




Answer 3:


Set JAVA_HOME in hadoop-env.cmd (on Windows 7) to a Java JDK location with no spaces in the path. I faced this issue too.

Initially the JDK path was: C:\Program Files\Java\jdk1.8.0_144

Replaced it with: C:\Java\jdk1.8.0_144

Now Hadoop starts properly from CMD.
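A quick way to confirm the fix took effect is to open a fresh CMD window (so the updated environment is picked up), echo the variable, and ask Hadoop for its version (assuming %HADOOP_HOME%\bin is on PATH):

```
:: Should print the space-free JDK path, e.g. C:\Java\jdk1.8.0_144
echo %JAVA_HOME%

:: Should print the Hadoop version banner instead of the JAVA_HOME error
hadoop version
```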



Source: https://stackoverflow.com/questions/38077006/error-while-installing-hadoop-2-7-2-on-windows-10
