Compiling hadoop-2.6.0-cdh5.15.1 from Source -- with Compression Support

Submitted by 我的未来我决定 on 2019-11-27 12:38:07

Reference posts
https://blog.csdn.net/SUDDEV/article/details/98223999
https://blog.csdn.net/yz972641975/article/details/98405720

Environment Requirements

[root@hadoop001 ~]# yum install gcc gcc-c++ make cmake
[root@hadoop001 ~]# yum install openssl openssl-devel svn ncurses-devel zlib-devel libtool bzip2 bzip2-devel
[root@hadoop001 ~]# yum install snappy snappy-devel lzo lzo-devel lzop lrzsz autoconf automake
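
Before moving on to the JDK and Maven, it can help to confirm that the toolchain and the compression development libraries actually installed; a quick check (not part of the original post):

# confirm compiler and cmake versions
[root@hadoop001 ~]# gcc --version | head -1
[root@hadoop001 ~]# cmake --version | head -1
# confirm the snappy/lzo/zlib/openssl development packages are present
[root@hadoop001 ~]# rpm -q snappy-devel lzo-devel zlib-devel openssl-devel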

Install JDK, Maven, and protobuf

  1. Extract the archives
[hadoop@hadoop001 ~]$ tar -zxvf ~/soft/jdk-7u80-linux-x64.tar.gz -C ~/app
[hadoop@hadoop001 ~]$ tar -zxvf ~/soft/apache-maven-3.6.1-bin.tar.gz -C ~/app
[hadoop@hadoop001 ~]$ tar -zxvf ~/soft/hadoop-2.6.0-cdh5.15.1.tar.gz -C ~/source
  2. Configure environment variables
[hadoop@hadoop001 ~]$ vim ~/.bash_profile
export JAVA_HOME=/home/hadoop/app/jdk1.7.0_80
export PATH=$JAVA_HOME/bin:$PATH

export MAVEN_HOME=/home/hadoop/app/apache-maven-3.6.1
export MAVEN_OPTS="-Xms1024m -Xmx1024m"
export PATH=$MAVEN_HOME/bin:$PATH

[hadoop@hadoop001 ~]$ source ~/.bash_profile
# Verify the Java installation
[hadoop@hadoop001 jdk1.7.0_80]$ java -version
java version "1.7.0_80"
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)
# Verify the Maven installation
[hadoop@hadoop001 ~]$ mvn -version
Apache Maven 3.6.1 (1edded0938998edf8bf061f1ceb3cfdeccf443fe; 2018-06-18T02:33:14+08:00)
Maven home: /home/hadoop/app/apache-maven-3.6.1
Java version: 1.7.0_80, vendor: Oracle Corporation, runtime: /home/hadoop/jdk1.7.0_80/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-957.el7.x86_64", arch: "amd64", family: "unix"
  3. Build and install protobuf
[hadoop@hadoop001 ~]$ tar -zxvf ~/soft/protobuf-2.5.0.tar.gz -C ~/app/
[hadoop@hadoop001 ~]$ cd ~/app/protobuf-2.5.0
[hadoop@hadoop001 protobuf-2.5.0]$ ./configure --prefix=/home/hadoop/app/protobuf-2.5.0
# Build and install
[hadoop@hadoop001 protobuf-2.5.0]$ make
[hadoop@hadoop001 protobuf-2.5.0]$ make install
[hadoop@hadoop001 protobuf-2.5.0]$ vim ~/.bash_profile

export PROTOBUF_HOME=/home/hadoop/app/protobuf-2.5.0
export PATH=$PROTOBUF_HOME/bin:$PATH

[hadoop@hadoop001 protobuf-2.5.0]$ source ~/.bash_profile 
# Verify the installation; if "libprotoc 2.5.0" is printed, it took effect
[hadoop@hadoop001 protobuf-2.5.0]$ protoc --version
libprotoc 2.5.0
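
The Hadoop 2.6 build expects protoc 2.5.0 specifically, so it is worth double-checking that the protoc resolved on the PATH is the one just installed under ~/app (a quick sanity check, not in the original post):

# confirm which protoc the shell picks up
[hadoop@hadoop001 ~]$ which protoc
/home/hadoop/app/protobuf-2.5.0/bin/protoc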
  4. Configure Maven
[hadoop@hadoop001 ~]$ vi app/apache-maven-3.6.1/conf/settings.xml
# Set the local repository path; the default is ~/.m2/repository
<localRepository>/home/hadoop/maven_repo</localRepository>

<mirror>
    <id>alimaven</id>
    <name>aliyun maven</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
    <mirrorOf>central</mirrorOf>
</mirror>
<mirror>
    <id>cloudera</id>
    <mirrorOf>*</mirrorOf>
    <name>cloudera Readable Name for this Mirror.</name>
    <url>http://repository.cloudera.com/artifactory/cloudera-repos/</url>
</mirror>

<profiles>
    <profile>
        <id>localRep</id>
        <repositories>
            <repository>
                <id>NEORepo</id>
                <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
                <snapshots>
                    <enabled>true</enabled>
                    <updatePolicy>always</updatePolicy>
                </snapshots>
            </repository>
            <repository>
                <id>internal</id>
                <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
                <snapshots>
                    <enabled>false</enabled>
                </snapshots>
            </repository>
        </repositories>
        <pluginRepositories>
            <pluginRepository>
                <id>NEORepo</id>
                <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
                <snapshots>
                    <enabled>true</enabled>
                    <updatePolicy>always</updatePolicy>
                </snapshots>
            </pluginRepository>
        </pluginRepositories>
    </profile>
</profiles>
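
Note that Maven only applies a profile from settings.xml once it is activated. To have the localRep profile above take effect on every build, an <activeProfiles> entry can be added to the same settings.xml (this step is not in the original post; alternatively the profile can be selected per build with -PlocalRep):

<!-- activate the localRep profile defined above -->
<activeProfiles>
    <activeProfile>localRep</activeProfile>
</activeProfiles>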

Compiling hadoop-2.6.0-cdh5.15.1

  • Download link for the pre-populated Maven repository files
    Link: https://pan.baidu.com/s/1xTi42emtIwpF-uMejFTXrg
    Extraction code: sif3
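
The downloaded repository archive is meant to pre-populate the local repository configured in settings.xml (/home/hadoop/maven_repo) so most dependencies resolve without long downloads. A sketch, assuming the archive is named maven_repo.tar.gz (hypothetical; use the actual file name from the download):

# hypothetical archive name; unpack into the localRepository path from settings.xml
[hadoop@hadoop001 ~]$ mkdir -p ~/maven_repo
[hadoop@hadoop001 ~]$ tar -zxvf ~/soft/maven_repo.tar.gz -C ~/maven_repo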
[hadoop@hadoop001 ~]$ cd ~/source/hadoop-2.6.0-cdh5.15.1/

# In hadoop-project/pom.xml, change the DynamoDBLocal dependency version to 1.11.477
<dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>DynamoDBLocal</artifactId>
        <version>1.11.477</version>
</dependency>
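
To locate the block that needs editing, a quick grep works (a sketch; the exact line numbers differ by release):

# find the DynamoDBLocal dependency declaration
[hadoop@hadoop001 hadoop-2.6.0-cdh5.15.1]$ grep -n -A 2 "DynamoDBLocal" hadoop-project/pom.xml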

# Start the build; the first run downloads a large number of dependency jars and takes a while
[hadoop@hadoop001 hadoop-2.6.0-cdh5.15.1]$ mvn clean package -Pdist,native -DskipTests -Dtar
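
Because the build runs for quite a while, it can be convenient to run it in the background and keep a full log (an optional variant, not from the original post):

# background the build and capture its output
[hadoop@hadoop001 hadoop-2.6.0-cdh5.15.1]$ nohup mvn clean package -Pdist,native -DskipTests -Dtar > build.log 2>&1 &
[hadoop@hadoop001 hadoop-2.6.0-cdh5.15.1]$ tail -f build.log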

The build succeeded when output like the following appears:

[INFO] Reactor Summary for Apache Hadoop Main 2.6.0-cdh5.15.1:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [  4.219 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  1.155 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  1.958 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  3.636 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.570 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  1.866 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  4.212 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  7.142 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  7.232 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  3.184 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:43 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  5.825 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 13.792 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.070 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [02:37 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 18.975 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [  5.569 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  4.565 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.052 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.110 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:30 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 23.249 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.087 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [  9.989 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 16.379 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  3.302 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  5.146 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 16.507 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  1.161 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  4.934 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.062 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  2.486 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  1.893 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.099 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [  4.609 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  3.871 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.165 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 18.647 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 15.247 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  3.782 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [  9.056 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  6.300 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [  5.448 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  2.183 s]
[INFO] hadoop-mapreduce-client-nativetask ................. SUCCESS [01:14 min]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  5.332 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  5.167 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  4.131 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [  9.136 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  1.980 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [  1.941 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  5.067 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  3.804 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  2.372 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.232 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  2.681 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  6.969 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  4.486 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 10.983 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  4.440 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  3.854 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.308 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  5.258 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [  3.734 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 15.415 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.046 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 41.309 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  13:31 min
[INFO] Finished at: 2019-08-16T11:28:53+08:00
[INFO] ------------------------------------------------------------------------
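
The finished distribution tarball is written under hadoop-dist/target/. After deploying that build, native compression support can be verified with hadoop checknative (quick checks; the codec list depends on which libraries were installed above):

# the packaged distribution produced by -Dtar
[hadoop@hadoop001 hadoop-2.6.0-cdh5.15.1]$ ls hadoop-dist/target/*.tar.gz
# once the new build is in use, confirm the native codecs (zlib, snappy, lz4, bzip2, openssl) are loaded
[hadoop@hadoop001 ~]$ hadoop checknative -a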
