Environment variable loading issues when running an HBase application jar with the hadoop command

Submitted by 寵の児 on 2019-11-29 13:50:03

# Problem Description

  • When running an HBase application jar with the hadoop command, the following error is reported:

    [hadoop@breath ~]$ hadoop jar ~/HbaseTest-0.1.jar Test.HtableCreate
    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
            at Test.HtableCreate.main(HtableCreate.java:21)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
            at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
            at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
            at java.security.AccessController.doPrivileged(Native Method)
            at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
            at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
            ... 7 more
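
A NoClassDefFoundError like this usually means the class is missing from the runtime classpath rather than from the application itself. A quick check (a sketch, not from the original post; it assumes the jar path `~/HbaseTest-0.1.jar` shown above and the JDK's `jar` tool on the PATH) confirms that the HBase classes were not bundled into the jar and therefore must be supplied at launch time:

    # List the jar's contents and search for the missing class;
    # an empty result means HBaseConfiguration is not packaged inside
    # and has to come from the classpath the hadoop command builds.
    jar tf ~/HbaseTest-0.1.jar | grep HBaseConfiguration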

# Problem Analysis

  • Inspecting hadoop's classpath shows that it contains none of the HBase dependency jars:

    [hadoop@breath ~]$ hadoop classpath
    /opt/beh/core/hadoop/etc/hadoop:/opt/beh/core/hadoop/share/hadoop/common/lib/*:/opt/beh/core/hadoop/share/hadoop/common/*:/opt/beh/core/hadoop/share/hadoop/hdfs:/opt/beh/core/hadoop/share/hadoop/hdfs/lib/*:/opt/beh/core/hadoop/share/hadoop/hdfs/*:/opt/beh/core/hadoop/share/hadoop/yarn/lib/*:/opt/beh/core/hadoop/share/hadoop/yarn/*:/opt/beh/core/hadoop/share/hadoop/mapreduce/lib/*:/opt/beh/core/hadoop/share/hadoop/mapreduce/*:/opt/beh/core/hadoop/contrib/capacity-scheduler/*.jar

  • Adding $HBASE_HOME/lib/* to the shell's CLASSPATH environment variable has no effect: the hadoop launcher script assembles its own classpath and only honors entries from HADOOP_CLASSPATH, so a CLASSPATH set in the shell is effectively ignored.

  • The HBase reference guide, under "HBase and MapReduce" → "HBase, MapReduce, and the CLASSPATH", explains in detail how jobs that use HBase are expected to get the HBase jars onto the classpath; a quick comparison is sketched after this list.
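
As the documentation points out, `hbase classpath` already prints everything HBase needs at runtime. The following comparison is a minimal sketch (not from the original post; it assumes $HBASE_HOME points at the HBase installation, and the exact counts depend on the cluster layout):

    # Count classpath entries that mention hbase:
    # the plain hadoop classpath has none, while `hbase classpath`
    # pulls in every jar under $HBASE_HOME/lib.
    hadoop classpath | tr ':' '\n' | grep -ci hbase
    ${HBASE_HOME}/bin/hbase classpath | tr ':' '\n' | grep -ci hbase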

For a one-off run, the environment variable can be set on the command line, as the HBase guide shows:

    HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/lib/hbase-examples-VERSION.jar
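
Applied to the jar from the problem description, that looks like the following (a sketch; adjust the jar path and main class to your own build):

    # Prepend HBase's full classpath for this single invocation only.
    HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar ~/HbaseTest-0.1.jar Test.HtableCreate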

To make the change permanent, set HADOOP_CLASSPATH in hadoop-env.sh instead:

    # Extra Java CLASSPATH elements.  Automatically insert capacity-scheduler.
    # Append HBase's lib directory to HADOOP_CLASSPATH only when HBASE_HOME is set.
    if [ -z "$HBASE_HOME" ]; then
       export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}
    else
       export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:${HBASE_HOME}/lib'/*'
    fi
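
After editing hadoop-env.sh, the fix can be verified in a new shell (again a sketch, assuming the same $HBASE_HOME layout as above):

    # The HBase lib wildcard should now appear in the classpath...
    hadoop classpath | tr ':' '\n' | grep hbase
    # ...and the original command should run without NoClassDefFoundError.
    hadoop jar ~/HbaseTest-0.1.jar Test.HtableCreate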