WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `./in': No such file or directory
Root cause investigation:
[root@db96 hadoop]# file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0: ELF 32-bit LSB shared object,
Intel 80386, version 1 (SYSV), dynamically linked, not stripped
This is a 32-bit Hadoop build installed on a 64-bit Linux system; the native library was compiled for a different architecture, so it cannot be loaded.
Bad luck.
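The same diagnosis can be reproduced on any box by comparing the library's word size against the system's. A minimal sketch, using the path from the error above (adjust it to your install):

```shell
# Compare the native library's architecture with the OS architecture.
# The library path matches the one from the `file` check above.
lib=/usr/local/hadoop/lib/native/libhadoop.so.1.0.0
echo "system arch: $(uname -m)"   # x86_64 on a 64-bit install
if [ -f "$lib" ]; then
  file "$lib"                     # "ELF 32-bit" on a 64-bit host means a mismatch
else
  echo "library not found at $lib"
fi
```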
Solution: recompile Hadoop from source.
1. Install Maven: download and unpack it.
http://maven.apache.org/download.cgi // download the matching archive
apache-maven-3.2.1-bin.tar.gz
[root@db99 ~]# tar -zxvf apache-maven-3.2.1-bin.tar.gz -C /usr/local
[root@db99 ~]# ln -s /usr/local/apache-maven-3.2.1/ /usr/local/maven
[root@db99 local]# vim /etc/profile // add to the environment variables
export MAVEN_HOME=/usr/local/maven
export PATH=$MAVEN_HOME/bin:$PATH
[root@db99 local]# source /etc/profile
[root@db99 local]# mvn -version
2. Install protobuf
https://code.google.com/p/protobuf/downloads/detail?name=protobuf-2.5.0.tar.gz
Download protobuf-2.5.0.tar.gz and unpack it:
[root@db99 protobuf-2.5.0]# pwd
/root/protobuf-2.5.0
[root@db99 protobuf-2.5.0]# ./configure --prefix=/usr/local/protoc/
[root@db99 protobuf-2.5.0]# make
[root@db99 protobuf-2.5.0]# make check
[root@db99 protobuf-2.5.0]# make install
[root@db99 protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
Installation succeeded.
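Because the configure prefix above was /usr/local/protoc/, the protoc binary lands outside the default PATH. A sketch of making it visible, mirroring the Maven profile edit (the PROTOC_HOME name is just a convenience, nothing requires it):

```shell
# protoc was installed under /usr/local/protoc (the --prefix above),
# so its bin directory must be added to PATH, e.g. in /etc/profile.
# PROTOC_HOME is an illustrative variable name, not required by the build.
export PROTOC_HOME=/usr/local/protoc
export PATH=$PROTOC_HOME/bin:$PATH
# afterwards `protoc --version` should print: libprotoc 2.5.0
```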
3. Install Ant
First download Ant:
apache-ant-1.9.4-bin.tar.gz
[root@db99 local]# tar -zxvf apache-ant-1.9.4-bin.tar.gz
[root@db99 local]# ln -s /usr/local/apache-ant-1.9.4/ /usr/local/ant
[root@db99 local]# vim /etc/profile // add to the environment variables
export ANT_HOME=/usr/local/ant
export PATH=$ANT_HOME/bin:$PATH
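With Maven, protobuf, and Ant set up, a quick preflight check can confirm every tool is reachable before starting the long build. A sketch (the tool list includes cmake, which the native build also needs, as the troubleshooting section below shows):

```shell
# Verify each build prerequisite is on the PATH before kicking off
# the roughly one-hour Maven build.
for tool in mvn protoc cmake ant; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: OK"
  else
    echo "$tool: MISSING"
  fi
done
```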
The preparation above is mostly done, so we can finally start. Remember to cd into the src directory, then run the following command:
mvn package -Pdist,native -DskipTests -Dtar
The build takes quite a while, roughly one hour.
When it finishes, the directory ~/hadoop-2.5.0-src/hadoop-dist/target contains the file:
hadoop-2.5.0.tar.gz
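To confirm the rebuild actually fixed the original problem, the new native library can be inspected the same way as before. A sketch, assuming the standard hadoop-dist layout under the source tree:

```shell
# Inspect the freshly built native library; on a 64-bit host it should
# now report "ELF 64-bit" instead of the "ELF 32-bit" seen earlier.
# The path assumes the default hadoop-dist output layout.
lib=~/hadoop-2.5.0-src/hadoop-dist/target/hadoop-2.5.0/lib/native/libhadoop.so.1.0.0
if [ -f "$lib" ]; then
  file "$lib"
else
  echo "library not found at $lib (did the build finish?)"
fi
```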
Problem summary:
A few additional notes:
1. Error 1: CMake was not installed
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/wyf/hadoop-2.0.2-alpha-src/hadoop-common-project/hadoop-common/target/native"): java.io.IOException: error=2, No such file or directory -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Fix: install CMake.
- sudo yum install cmake
or alternatively:
- sudo apt-get install cmake
2. Error 2: Ant was not installed
[ERROR] Failed to execute goal org.codehaus.mojo.jspc:jspc-maven-plugin:2.0-alpha-3:compile (hdfs) on project hadoop-hdfs: Execution hdfs of goal org.codehaus.mojo.jspc:jspc-maven-plugin:2.0-alpha-3:compile failed: Plugin org.codehaus.mojo.jspc:jspc-maven-plugin:2.0-alpha-3 or one of its dependencies could not be resolved: Could not transfer artifact ant:ant:jar:1.6.5 from/to central (http://repo.maven.apache.org/maven2): GET request of: ant/ant/1.6.5/ant-1.6.5.jar from central failed: Read timed out -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-hdfs
Fix:
tar zxvf apache-ant-1.9.4-bin.tar.gz
Configuration:
vi /etc/profile, apply it with source /etc/profile, then verify with ant -version
Source: oschina
Link: https://my.oschina.net/u/593529/blog/306425