Question
I'm working with Ubuntu 12.04 LTS.
I'm going through the Hadoop quickstart manual to set up a pseudo-distributed operation. It seems simple and straightforward (easy!).
However, when I try to run start-all.sh
I get:
localhost: Error: JAVA_HOME is not set.
I've read all the other advice on stackoverflow for this issue and have done the following to ensure JAVA_HOME
is set:
In /etc/hadoop/conf/hadoop-env.sh
I have set
JAVA_HOME=/usr/lib/jvm/java-6-oracle
export JAVA_HOME
In /etc/bash.bashrc
I have set
JAVA_HOME=/usr/lib/jvm/java-6-oracle
export JAVA_HOME
PATH=$PATH:$JAVA_HOME/bin
export PATH
which java
returns:
/usr/bin/java
java -version
works
echo $JAVA_HOME
returns:
/usr/lib/jvm/java-6-oracle
I've even tried becoming root and explicitly writing the following in the terminal:
$ JAVA_HOME=/usr/lib/jvm/java-6-oracle
$ export JAVA_HOME
$ start-all.sh
If you could show me how to resolve this error it would be greatly appreciated.
I'm thinking that my JAVA_HOME
is being overridden somehow. If that is the case, could you explain to me how to make my exports global?
Answer 1:
I am using Hadoop 1.1 and faced the same problem.
I solved it by changing the JAVA_HOME variable in /etc/hadoop/hadoop-env.sh to:
export JAVA_HOME=/usr/lib/jvm/<jdk folder>
Answer 2:
The way to solve this problem is to export the JAVA_HOME variable inside the conf/hadoop-env.sh file.
It doesn't matter if you already exported that variable in ~/.bashrc; it will still show the error.
So edit conf/hadoop-env.sh, uncomment the line "export JAVA_HOME", and set it to a proper filesystem path, i.e. the path to your Java JDK.
# The Java implementation to use. Required.
export JAVA_HOME="/path/to/java/JDK/"
Answer 3:
The way to debug this is to put an "echo $JAVA_HOME" in start-all.sh. Are you running your hadoop environment under a different username, or as yourself? If the former, it's very likely that the JAVA_HOME environment variable is not set for that user.
The other potential problem is that you have specified JAVA_HOME incorrectly, and the value that you have provided doesn't point to a JDK/JRE. Note that "which java" and "java -version" will both work, even if JAVA_HOME is set incorrectly.
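As a quick sanity check (a sketch, assuming a bash shell and the path from the question), verify that JAVA_HOME itself points at a real JDK, since which java and java -version only exercise the java binary on your PATH:
echo "$JAVA_HOME"          # should print the JDK directory, e.g. /usr/lib/jvm/java-6-oracle
ls "$JAVA_HOME/bin/java"   # must exist if JAVA_HOME really points to a JDK/JRE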
Answer 4:
Extract from etc/hadoop/hadoop-env.sh:
The only required environment variable is JAVA_HOME. All others are optional. When running a distributed configuration it is best to set JAVA_HOME in this file, so that it is correctly defined on remote nodes.
This means it is better, and advised, to set JAVA_HOME here, even though the existing line reads the JAVA_HOME variable from the environment. Apparently it does not pick up the previously set value; the standard Apache manual does not mention this. :(
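A minimal sketch of what that edit looks like in etc/hadoop/hadoop-env.sh, reusing the path from the question as the example value:
# export JAVA_HOME=${JAVA_HOME}                # default: relies on the caller's environment
export JAVA_HOME=/usr/lib/jvm/java-6-oracle    # explicit path, so daemons started on remote nodes see it too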
Answer 5:
This error is coming from Line 180
if [[ -z $JAVA_HOME ]]; then
  echo "Error: JAVA_HOME is not set and could not be found." 1>&2
  exit 1
fi
in libexec/hadoop-config.sh.
Try echo $JAVA_HOME in that script. If it does not print a valid path, find your JAVA_HOME with:
readlink -f /usr/bin/javac | sed "s:/bin/javac::"
and replace the line
export JAVA_HOME=${JAVA_HOME}
in /etc/hadoop/hadoop-env.sh with the JAVA_HOME you got from the above command.
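Alternatively (a sketch, assuming javac is installed and /usr/bin/javac is the usual alternatives-managed symlink), the command substitution can be placed directly in /etc/hadoop/hadoop-env.sh so the path is resolved each time the daemons start:
export JAVA_HOME=$(readlink -f /usr/bin/javac | sed "s:/bin/javac::")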
Answer 6:
I also faced a similar problem in Hadoop 1.1.
I had not noticed that JAVA_HOME was commented out in hadoop/conf/hadoop-env.sh.
It was:
#JAVA_HOME=/usr/lib/jvm/java-6-oracle
I had to change it to:
JAVA_HOME=/usr/lib/jvm/java-6-oracle
Answer 7:
Regardless of Debian or any other Linux flavor, just know that ~/.bash_profile belongs to a specific user and is not system-wide.
In a pseudo-distributed environment Hadoop works on localhost, so the $JAVA_HOME in .bash_profile is of no use anymore.
Just export JAVA_HOME in ~/.bashrc and use it system-wide.
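A sketch of that, reusing the path from the question (adjust it for your JDK):
echo 'export JAVA_HOME=/usr/lib/jvm/java-6-oracle' >> ~/.bashrc
source ~/.bashrc   # reload so the current shell picks up the new value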
Answer 8:
Ran into the same issue on Ubuntu 16.04 LTS. Running bash -vx ./bin/hadoop
showed that it tests whether JAVA_HOME is a directory, so I changed JAVA_HOME to a folder and it worked.
++ [[ ! -d /usr/bin/java ]]
++ hadoop_error 'ERROR: JAVA_HOME /usr/bin/java does not exist.'
++ echo 'ERROR: JAVA_HOME /usr/bin/java does not exist.'
ERROR: JAVA_HOME /usr/bin/java does not exist.
So I changed JAVA_HOME in ./etc/hadoop/hadoop-env.sh
to
export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre/
and hadoop starts fine. This is also mentioned in this article.
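Since the check that failed in the trace is a plain directory test, a candidate value can be verified the same way before putting it into hadoop-env.sh (path taken from this answer; adjust it to your install):
[[ -d /usr/lib/jvm/java-8-oracle/jre/ ]] && echo "JAVA_HOME candidate is a directory" || echo "not a directory"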
Answer 9:
Check whether your alternatives link is pointing to the right installation; you might actually be pointing to a different version and altering the hadoop-env.sh of another installed version.
alternatives --install /etc/hadoop/conf [generic_name] [your correct path] priority   (for further details, check the man page of alternatives)
To set an alternative manually:
alternatives --set [generic_name] [your current path]
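A hypothetical example of those two commands with concrete values filled in (the alternative name and configuration path here are purely illustrative, not taken from the original post):
alternatives --install /etc/hadoop/conf hadoop-conf /etc/hadoop/conf.my_cluster 50
alternatives --set hadoop-conf /etc/hadoop/conf.my_cluster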
Answer 10:
Change the JAVA_HOME
variable in conf/hadoop-env.sh
export JAVA_HOME=/etc/local/java/<jdk folder>
Answer 11:
echo "export JAVA_HOME=/usr/lib/java" >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
Notice: Do not use export JAVA_HOME=${JAVA_HOME}!
Source: https://stackoverflow.com/questions/14325594/working-with-hadoop-localhost-error-java-home-is-not-set