I am trying to set up the latest Hadoop 2.2 single-node cluster on Ubuntu 13.10 64-bit. The OS is a fresh installation, and I have tried using both java-6 (64-bit) and java-7 (64-bit).
After following the steps from this link, and then, after that failed, from this link, I am not able to start the nodemanager with either of:
sbin/yarn-daemon.sh start nodemanager
sudo sbin/yarn-daemon.sh start nodemanager
or the resourcemanager with either of:
sbin/yarn-daemon.sh start resourcemanager
sudo sbin/yarn-daemon.sh start resourcemanager
Both fail with this error:
starting nodemanager, logging to /home/hduser/yarn/hadoop-2.2.0/logs/yarn-hduser-nodemanager-ubuntu.out
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/service/CompositeService
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:788)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:447)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
The resourcemanager fails with a similar error: NoClassDefFoundError.
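A rough way to check whether the jar that ships org.apache.hadoop.service.CompositeService is present and visible to the daemons (the install path below is assumed from the log line above, and unzip could be swapped for jar tf):
cd /home/hduser/yarn/hadoop-2.2.0
# print every jar under share/ that actually contains the missing class
for jar in share/hadoop/common/*.jar share/hadoop/yarn/*.jar; do
  unzip -l "$jar" 2>/dev/null | grep -q 'org/apache/hadoop/service/CompositeService.class' && echo "$jar"
done
# print what the scripts will put on the classpath, one entry per line
bin/hadoop classpath | tr ':' '\n'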
I have been trying this for many hours and searching Google, but nothing has worked. Please let me know what I have missed. This and this link, which I found while searching for a solution, didn't work either.
I have tried using both java-6 and java-7 64 bit, with no success.
Edit
The accepted answer managed to get rid of the exception and all the daemons now start, but there is still an exception while running jobs, mentioned in this question.
Those instructions are stale and seem to reflect one of the very early alpha releases. Make this change: YARN_HOME -> HADOOP_YARN_HOME. The environment variable was renamed a while back. This should fix it for you.
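For example, wherever the tutorial had you export YARN_HOME (typically ~/.bashrc or etc/hadoop/yarn-env.sh), the change is just the rename; the install path below is assumed from your logs, so adjust it to your setup:
# old name from the stale instructions:
# export YARN_HOME=/home/hduser/yarn/hadoop-2.2.0
# name expected by Hadoop 2.2:
export HADOOP_YARN_HOME=/home/hduser/yarn/hadoop-2.2.0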
You can use Apache Ambari 1.4.1, which eases the installation of Hadoop and many of its ecosystem components. See http://docs.hortonworks.com/#2.0 for how to install using Ambari.
You should see this solution: add $HADOOP_HOME/share/ and its sub-directories to the classpath. http://www.srccodes.com/p/article/46/noclassdeffounderror-org-apache-hadoop-service-compositeservice-shell-exitcodeexception-classnotfoundexception
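A minimal sketch of what that can look like in the environment that starts the daemons; HADOOP_HOME, the loop, and the use of HADOOP_CLASSPATH are assumptions here rather than an exact quote of the linked article (the trailing * wildcards are understood by Java 6+ classpaths):
export HADOOP_HOME=/home/hduser/yarn/hadoop-2.2.0
# append every share/hadoop sub-directory (jars and their lib/ folders) to the classpath
for d in common common/lib hdfs hdfs/lib yarn yarn/lib mapreduce mapreduce/lib; do
  HADOOP_CLASSPATH="${HADOOP_CLASSPATH:+$HADOOP_CLASSPATH:}$HADOOP_HOME/share/hadoop/$d/*"
done
export HADOOP_CLASSPATH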
Source: https://stackoverflow.com/questions/19552209/hadoop-nodemanager-and-resourcemanager-not-starting