Working With Hadoop: localhost: Error: JAVA_HOME is not set

Krishna

I am using Hadoop 1.1 and faced the same problem.

I solved it by changing the JAVA_HOME variable in /etc/hadoop/hadoop-env.sh:

export JAVA_HOME=/usr/lib/jvm/<jdk folder>

The way to solve this problem is to export the JAVA_HOME variable inside the conf/hadoop-env.sh file.

It doesn't matter if you already exported that variable in ~/.bashrc; it will still show the error.

So edit conf/hadoop-env.sh, uncomment the "export JAVA_HOME" line, and set it to a proper filesystem path, i.e. the path to your Java JDK.

# The Java implementation to use. Required.
export JAVA_HOME="/path/to/java/JDK/"

The way to debug this is to put an "echo $JAVA_HOME" in start-all.sh. Are you running your hadoop environment under a different username, or as yourself? If the former, it's very likely that the JAVA_HOME environment variable is not set for that user.
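For example, a temporary debug line near the top of bin/start-all.sh (the exact file layout varies by Hadoop version) could be:

echo "JAVA_HOME is: $JAVA_HOME"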

The other potential problem is that you have specified JAVA_HOME incorrectly, and the value that you have provided doesn't point to a JDK/JRE. Note that "which java" and "java -version" will both work, even if JAVA_HOME is set incorrectly.
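To illustrate: both of the following resolve java through PATH rather than JAVA_HOME, so they can succeed even when JAVA_HOME is wrong; checking the variable directly is more telling (a quick sketch, not part of the original answer):

which java
java -version
# check JAVA_HOME itself instead:
echo "$JAVA_HOME"
ls "$JAVA_HOME/bin/java"    # should exist if JAVA_HOME points at a JDK/JRE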

An extract from etc/hadoop/hadoop-env.sh:

The only required environment variable is JAVA_HOME. All others are optional. When running a distributed configuration it is best to set JAVA_HOME in this file, so that it is correctly defined on remote nodes.

This means it's better (and advised) to set JAVA_HOME here, even though the existing definition reads the JAVA_HOME variable. Perhaps it doesn't pick up the value of JAVA_HOME set elsewhere; the standard Apache manual doesn't mention this :( :(

This error is coming from Line 180

if [[ -z $JAVA_HOME ]]; then
   echo "Error: JAVA_HOME is not set and could not be found." 1>&2
   exit 1
fi

in libexec/hadoop-config.sh.

Try echo $JAVA_HOME in that script. If it doesn't print a valid path, find your JAVA_HOME using this:

readlink -f /usr/bin/javac | sed "s:/bin/javac::"

and replace the line export JAVA_HOME=${JAVA_HOME} in /etc/hadoop/hadoop-env.sh with the JAVA_HOME you got from the command above.
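Putting the two steps together, a rough sketch (the JDK path in the comment is only an example; use whatever the readlink command prints on your machine):

JDK_DIR=$(readlink -f /usr/bin/javac | sed "s:/bin/javac::")
echo "Detected JDK at: $JDK_DIR"
# then in /etc/hadoop/hadoop-env.sh set, for example:
#   export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64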

codeguru

I also faced a similar problem in Hadoop 1.1. I had not noticed that JAVA_HOME was commented out in hadoop/conf/hadoop-env.sh.

It was

#JAVA_HOME=/usr/lib/jvm/java-6-oracle

Had to change it to

JAVA_HOME=/usr/lib/jvm/java-6-oracle

Regardless of Debian or any other Linux flavor, just know that ~/.bash_profile belongs to a specific user and is not system-wide. In a pseudo-distributed environment, Hadoop works on localhost, so the $JAVA_HOME in .bash_profile is of no use anymore.

Just export JAVA_HOME in ~/.bashrc and use it system-wide.
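For example (the JDK path is an assumption; substitute your own):

echo 'export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64' >> ~/.bashrc
source ~/.bashrc
echo "$JAVA_HOME"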

Ran into the same issue on Ubuntu 16.04 LTS. Running bash -vx ./bin/hadoop showed it tested whether java was a directory. So I changed JAVA_HOME to a folder and it worked.

++ [[ ! -d /usr/bin/java ]]
++ hadoop_error 'ERROR: JAVA_HOME /usr/bin/java does not exist.'
++ echo 'ERROR: JAVA_HOME /usr/bin/java does not exist.'
ERROR: JAVA_HOME /usr/bin/java does not exist.

So I changed JAVA_HOME in ./etc/hadoop/hadoop-env.sh to

export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre/

and hadoop starts fine. This is also mentioned in this article.
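A quick sanity check that mirrors the directory test in the trace above (using the path from this answer):

[[ -d /usr/lib/jvm/java-8-oracle/jre/ ]] && echo "is a directory" || echo "not a directory"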

Check whether your alternatives setup is pointing to the right installation; you might actually be pointing to a different version and trying to alter the hadoop-env.sh of another installed version.

alternatives --install /etc/hadoop/conf [generic_name] [your correct path] [priority]

(For further details, check the man page of alternatives.)

To set alternatives manually:

alternatives --set [generic_name] [your current path]
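For example, to see what alternatives currently resolves to (the generic name hadoop-conf is an assumption; use whatever name you registered):

alternatives --display hadoop-conf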

Raja Sekaran

Change the JAVA_HOME variable in conf/hadoop-env.sh

export JAVA_HOME=/etc/local/java/<jdk folder>
echo "export JAVA_HOME=/usr/lib/java" >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh

Notice: do not use export JAVA_HOME=${JAVA_HOME}!
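To confirm the change took effect (assuming $HADOOP_HOME is set in your shell):

grep JAVA_HOME $HADOOP_HOME/etc/hadoop/hadoop-env.sh
$HADOOP_HOME/bin/hadoop version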
