hadoop 2.7.3 (hadoop2.x): building the Eclipse plugin hadoop-eclipse-plugin-2.7.3.jar with ant


  For MapReduce development you need Eclipse plus the matching Hadoop plugin, hadoop-eclipse-plugin-2.7.3.jar. First, a note: up through hadoop 1.x the official Hadoop distribution shipped with an Eclipse plugin. As Eclipse versions have multiplied and diverged, the plugin has to match the particular IDE and cannot be compatible with all of them, so current Hadoop distributions no longer bundle an Eclipse plugin at all. Everyone has to compile one for their own Eclipse.

1. Environment preparation

  To build your own Eclipse plugin with ant, here are my environment and tools (adjust the install paths to your own setup):

  OS: 64-bit Ubuntu 14.04 (the OS doesn't really matter; Windows works too, the steps are the same)

  JDK: jdk-7u80-linux-x64.tar.gz, installed at /usr/lib/jvm

  Eclipse: eclipse-jee-mars-2-linux-gtk-x86_64.tar.gz, installed at /home/hadoop/

  Hadoop: hadoop-2.7.3.tar.gz, installed at /usr/local

  ant (any installation method works, binary tarball or apt-get, as long as the environment variables are set). My ant version is 1.9.3; yours may be 1.9.7.

  export ANT_HOME=/usr/local/apache-ant-1.9.3
  export PATH=$PATH:$ANT_HOME/bin

  If ant complains that it cannot find ant-launcher.jar, add it to the classpath:

  export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/jre/lib:$JAVA_HOME/lib/tools.jar:$ANT_HOME/lib/ant-launcher.jar
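
  For example, to make these settings stick across shells they can be appended to ~/.bashrc and re-sourced. A minimal sketch, assuming the ant path above and guessing the JDK directory name as jdk1.7.0_80 (adjust to your actual install):

$ cat >> ~/.bashrc <<'EOF'
# environment for building the hadoop eclipse plugin with ant
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_80
export ANT_HOME=/usr/local/apache-ant-1.9.3
export PATH=$PATH:$ANT_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/jre/lib:$JAVA_HOME/lib/tools.jar:$ANT_HOME/lib/ant-launcher.jar
EOF
$ source ~/.bashrc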

$ ant -version
Apache Ant(TM) version 1.9.3 compiled on April 8 2014

  I used hadoop2x-eclipse-plugin from GitHub (https://github.com/winghc/hadoop2x-eclipse-plugin), downloaded as hadoop2x-eclipse-plugin-master.zip (this is the latest; it corresponds to hadoop2x-eclipse-plugin-2.6.0.zip).

  Download it as a zip and extract it to a suitable path. Make sure the path's permissions and ownership belong to the current user.

  The paths of the three build tools/resources are:

eclipse : /home/hadoop/eclipse
hadoop : /usr/local/hadoop-2.7.3
hadoop2x-eclipse-plugin-master : /home/hadoop/hadoop2x-eclipse-plugin-master
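
  Since ant needs to read from all three locations (and the plugin sources should be writable by the current user), a quick sanity check on the paths might look like:

$ ls -ld /home/hadoop/eclipse /usr/local/hadoop-2.7.3 /home/hadoop/hadoop2x-eclipse-plugin-master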

2. Building hadoop-eclipse-plugin-2.7.3.jar

  Since my Hadoop version is 2.7.3, I have to build the plugin myself. Fortunately the GitHub project explains how to build it.

  Extract the downloaded hadoop2x-eclipse-plugin and work inside the directory hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/. The GitHub README describes the build procedure as follows:

eclipse plugin for hadoop 2.x.x
How to build  
[hdpusr@demo hadoop2x-eclipse-plugin]$ cd src/contrib/eclipse-plugin
# Assume hadoop installation directory is /usr/share/hadoop 
[hdpusr@apclt eclipse-plugin]$ ant jar -Dversion=2.4.1 -Dhadoop.version=2.4.1 -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/share/hadoop
final jar will be generated at directory
${hadoop2x-eclipse-plugin}/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.4.1.jar

2.1 Modify the relevant files. There are two: hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/build.xml and hadoop2x-eclipse-plugin-master/ivy/libraries.properties

  What I need is the Eclipse plugin for Hadoop 2.7.3, but the hadoop2x-eclipse-plugin downloaded from GitHub is configured for a Hadoop 2.6 build, so before running ant I have to modify ant's build.xml configuration file and the related files.

  The first file, hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/build.xml, is shown below:

  1 <?xml version="1.0" encoding="UTF-8" standalone="no"?>
  2 
  3 <!--
  4    Licensed to the Apache Software Foundation (ASF) under one or more
  5    contributor license agreements.  See the NOTICE file distributed with
  6    this work for additional information regarding copyright ownership.
  7    The ASF licenses this file to You under the Apache License, Version 2.0
  8    (the "License"); you may not use this file except in compliance with
  9    the License.  You may obtain a copy of the License at
 10 
 11        http://www.apache.org/licenses/LICENSE-2.0
 12 
 13    Unless required by applicable law or agreed to in writing, software
 14    distributed under the License is distributed on an "AS IS" BASIS,
 15    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 16    See the License for the specific language governing permissions and
 17    limitations under the License.
 18 -->
 19 
 20 <project default="jar" name="eclipse-plugin">
 21 
 22   <import file="../build-contrib.xml"/>
 23 
 24   <path id="eclipse-sdk-jars">
 25     <fileset dir="${eclipse.home}/plugins/">
 26       <include name="org.eclipse.ui*.jar"/>
 27       <include name="org.eclipse.jdt*.jar"/>
 28       <include name="org.eclipse.core*.jar"/>
 29       <include name="org.eclipse.equinox*.jar"/>
 30       <include name="org.eclipse.debug*.jar"/>
 31       <include name="org.eclipse.osgi*.jar"/>
 32       <include name="org.eclipse.swt*.jar"/>
 33       <include name="org.eclipse.jface*.jar"/>
 34 
 35       <include name="org.eclipse.team.cvs.ssh2*.jar"/>
 36       <include name="com.jcraft.jsch*.jar"/>
 37     </fileset> 
 38   </path>
 39 
 40   <path id="hadoop-sdk-jars">
 41     <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
 42       <include name="hadoop*.jar"/>
 43     </fileset> 
 44     <fileset dir="${hadoop.home}/share/hadoop/hdfs">
 45       <include name="hadoop*.jar"/>
 46     </fileset> 
 47     <fileset dir="${hadoop.home}/share/hadoop/common">
 48       <include name="hadoop*.jar"/>
 49     </fileset> 
 50   </path>
 51 
 52 
 53 
 54   <!-- Override classpath to include Eclipse SDK jars -->
 55   <path id="classpath">
 56     <pathelement location="${build.classes}"/>
 57     <!--pathelement location="${hadoop.root}/build/classes"/-->
 58     <path refid="eclipse-sdk-jars"/>
 59     <path refid="hadoop-sdk-jars"/>
 60   </path>
 61 
 62   <!-- Skip building if eclipse.home is unset. -->
 63   <target name="check-contrib" unless="eclipse.home">
 64     <property name="skip.contrib" value="yes"/>
 65     <echo message="eclipse.home unset: skipping eclipse plugin"/>
 66   </target>
 67 
 68  <target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
 69     <echo message="contrib: ${name}"/>
 70     <javac
 71      encoding="${build.encoding}"
 72      srcdir="${src.dir}"
 73      includes="**/*.java"
 74      destdir="${build.classes}"
 75      debug="${javac.debug}"
 76      deprecation="${javac.deprecation}">
 77      <classpath refid="classpath"/>
 78     </javac>
 79   </target>
 80 
 81   <!-- Override jar target to specify manifest -->
 82   <target name="jar" depends="compile" unless="skip.contrib">
 83     <mkdir dir="${build.dir}/lib"/>
 84     <copy  todir="${build.dir}/lib/" verbose="true">
 85           <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
 86            <include name="hadoop*.jar"/>
 87           </fileset>
 88     </copy>
 89     <copy  todir="${build.dir}/lib/" verbose="true">
 90           <fileset dir="${hadoop.home}/share/hadoop/common">
 91            <include name="hadoop*.jar"/>
 92           </fileset>
 93     </copy>
 94     <copy  todir="${build.dir}/lib/" verbose="true">
 95           <fileset dir="${hadoop.home}/share/hadoop/hdfs">
 96            <include name="hadoop*.jar"/>
 97           </fileset>
 98     </copy>
 99     <copy  todir="${build.dir}/lib/" verbose="true">
100           <fileset dir="${hadoop.home}/share/hadoop/yarn">
101            <include name="hadoop*.jar"/>
102           </fileset>
103     </copy>
104 
105     <copy  todir="${build.dir}/classes" verbose="true">
106           <fileset dir="${root}/src/java">
107            <include name="*.xml"/>
108           </fileset>
109     </copy>
110 
111 
112 
113     <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
114     <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
115     <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
116     <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration-${commons-configuration.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
117     <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang-${commons-lang.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
118     <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar"  todir="${build.dir}/lib" verbose="true"/>  
119     <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
120     <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
121     <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
122     <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
123     <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
124     <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
125     <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
126     <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
127     <copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-${htrace.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
128 
129     <jar
130       jarfile="${build.dir}/hadoop-${name}-${hadoop.version}.jar"
131       manifest="${root}/META-INF/MANIFEST.MF">
132       <manifest>
133    <attribute name="Bundle-ClassPath" 
134     value="classes/, 
135  lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
136  lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
137  lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
138  lib/hadoop-auth-${hadoop.version}.jar,
139  lib/hadoop-common-${hadoop.version}.jar,
140  lib/hadoop-hdfs-${hadoop.version}.jar,
141  lib/protobuf-java-${protobuf.version}.jar,
142  lib/log4j-${log4j.version}.jar,
143  lib/commons-cli-${commons-cli.version}.jar,
144  lib/commons-configuration-${commons-configuration.version}.jar,
145  lib/commons-httpclient-${commons-httpclient.version}.jar,
146  lib/commons-lang-${commons-lang.version}.jar,  
147  lib/commons-collections-${commons-collections.version}.jar,  
148  lib/jackson-core-asl-${jackson.version}.jar,
149  lib/jackson-mapper-asl-${jackson.version}.jar,
150  lib/slf4j-log4j12-${slf4j-log4j12.version}.jar,
151  lib/slf4j-api-${slf4j-api.version}.jar,
152  lib/guava-${guava.version}.jar,
153  lib/netty-${netty.version}.jar,
154  lib/htrace-core-${htrace.version}.jar"/>
155    </manifest>
156       <fileset dir="${build.dir}" includes="classes/ lib/"/>
157       <!--fileset dir="${build.dir}" includes="*.xml"/-->
158       <fileset dir="${root}" includes="resources/ plugin.xml"/>
159     </jar>
160   </target>
161 
162 </project>

  At line 81, find <!-- Override jar target to specify manifest -->, and at line 82 find the <target name="jar" depends="compile" unless="skip.contrib"> element. Add and modify the copy sub-elements below line 127 as follows: delete line 127,

<copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-${htrace.version}.jar"  todir="${build.dir}/lib" verbose="true"/>

  and add the following 3 lines:

<copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-${htrace.version}-incubating.jar"  todir="${build.dir}/lib" verbose="true"/>  
<copy file="${hadoop.home}/share/hadoop/common/lib/servlet-api-${servlet-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>  
<copy file="${hadoop.home}/share/hadoop/common/lib/commons-io-${commons-io.version}.jar"  todir="${build.dir}/lib" verbose="true"/> 

  Then find the <attribute name="Bundle-ClassPath" element (line 133 of the unmodified build.xml) and make the corresponding additions and changes to the libs in its value list: delete line 154, lib/htrace-core-${htrace.version}.jar, and add the following 3 lines (the last one closes the attribute):

lib/servlet-api-${servlet-api.version}.jar,  
lib/commons-io-${commons-io.version}.jar,  
lib/htrace-core-${htrace.version}-incubating.jar"/> 

  Save and exit. Note that if you skip these changes, the jar may still build, but once you drop it into Eclipse the connection configuration will throw errors. The modified result looks like this:

  <!-- Override jar target to specify manifest -->
  <target name="jar" depends="compile" unless="skip.contrib">
    <mkdir dir="${build.dir}/lib"/>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/common">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/hdfs">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/yarn">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>

    <copy  todir="${build.dir}/classes" verbose="true">
          <fileset dir="${root}/src/java">
           <include name="*.xml"/>
          </fileset>
    </copy>



    <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration-${commons-configuration.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang-${commons-lang.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar"  todir="${build.dir}/lib" verbose="true"/>  
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar"  todir="${build.dir}/lib" verbose="true"/>

    <!--my added, 3 lines-->
    <copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-${htrace.version}-incubating.jar"  todir="${build.dir}/lib" verbose="true"/>  
    <copy file="${hadoop.home}/share/hadoop/common/lib/servlet-api-${servlet-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>  
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-io-${commons-io.version}.jar"  todir="${build.dir}/lib" verbose="true"/> 

    <jar
      jarfile="${build.dir}/hadoop-${name}-${hadoop.version}.jar"
      manifest="${root}/META-INF/MANIFEST.MF">
      <manifest>
   <attribute name="Bundle-ClassPath" 
    value="classes/, 
 lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/hadoop-common-${hadoop.version}.jar,
 lib/hadoop-hdfs-${hadoop.version}.jar,
 lib/protobuf-java-${protobuf.version}.jar,
 lib/log4j-${log4j.version}.jar,
 lib/commons-cli-${commons-cli.version}.jar,
 lib/commons-configuration-${commons-configuration.version}.jar,
 lib/commons-httpclient-${commons-httpclient.version}.jar,
 lib/commons-lang-${commons-lang.version}.jar,  
 lib/commons-collections-${commons-collections.version}.jar,  
 lib/jackson-core-asl-${jackson.version}.jar,
 lib/jackson-mapper-asl-${jackson.version}.jar,
 lib/slf4j-log4j12-${slf4j-log4j12.version}.jar,
 lib/slf4j-api-${slf4j-api.version}.jar,
 lib/guava-${guava.version}.jar,
 lib/netty-${netty.version}.jar,
 lib/servlet-api-${servlet-api.version}.jar,  
 lib/commons-io-${commons-io.version}.jar,  
 lib/htrace-core-${htrace.version}-incubating.jar"/>
   </manifest>
      <fileset dir="${build.dir}" includes="classes/ lib/"/>
      <!--fileset dir="${build.dir}" includes="*.xml"/-->
      <fileset dir="${root}" includes="resources/ plugin.xml"/>
    </jar>
  </target>

  Adding and modifying these libs alone is not enough, though: many of the jar versions under share/hadoop/common/lib/ changed between Hadoop 2.6 and 2.7, so the corresponding version numbers must be updated as well. This cost me half a day, matching them one by one.

  These version settings live in the ivy directory at the root of hadoop2x-eclipse-plugin-master, i.e. hadoop2x-eclipse-plugin-master/ivy/libraries.properties. Each entry in libraries.properties must be matched to the version of the corresponding jar shipped with your Hadoop (under the various share/hadoop/ subdirectories).
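
  If you are unsure which version your Hadoop actually ships, you can simply list the lib directory and read the version off the jar file name, for example:

$ ls /usr/local/hadoop-2.7.3/share/hadoop/common/lib/ | grep commons-collections

  The jar name shown there (e.g. commons-collections-3.2.2.jar) gives the value to put into libraries.properties (commons-collections.version=3.2.2).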

  To save you the trouble, here is my copy; just overwrite the original file with it (the content below replaces the original configuration):

#   Licensed under the Apache License, Version 2.0 (the "License");
#   you may not use this file except in compliance with the License.
#   You may obtain a copy of the License at
#
#       http://www.apache.org/licenses/LICENSE-2.0
#
#   Unless required by applicable law or agreed to in writing, software
#   distributed under the License is distributed on an "AS IS" BASIS,
#   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#   See the License for the specific language governing permissions and
#   limitations under the License.

#This properties file lists the versions of the various artifacts used by hadoop and components.
#It drives ivy and the generation of a maven POM

# This is the version of hadoop we are generating
#hadoop.version=2.6.0    modify
hadoop.version=2.7.3
hadoop-gpl-compression.version=0.1.0

#These are the versions of our dependencies (in alphabetical order)
apacheant.version=1.7.0
ant-task.version=2.0.10

asm.version=3.2
aspectj.version=1.6.5
aspectj.version=1.6.11

checkstyle.version=4.2

commons-cli.version=1.2
commons-codec.version=1.4
#commons-collections.version=3.2.1    modify
commons-collections.version=3.2.2
commons-configuration.version=1.6
commons-daemon.version=1.0.13
#commons-httpclient.version=3.0.1    modify
commons-httpclient.version=3.1
commons-lang.version=2.6
#commons-logging.version=1.0.4        modify
commons-logging.version=1.1.3
#commons-logging-api.version=1.0.4    modify
commons-logging-api.version=1.1.3
#commons-math.version=2.1    modify
commons-math.version=3.1.1
commons-el.version=1.0
commons-fileupload.version=1.2
#commons-io.version=2.1        modify
commons-io.version=2.4
commons-net.version=3.1
core.version=3.1.1
coreplugin.version=1.3.2

#hsqldb.version=1.8.0.10    modify
hsqldb.version=2.0.0
#htrace.version=3.0.4    modify
htrace.version=3.1.0

ivy.version=2.1.0

jasper.version=5.5.12
jackson.version=1.9.13
#not able to figureout the version of jsp & jsp-api version to get it resolved throught ivy
# but still declared here as we are going to have a local copy from the lib folder
jsp.version=2.1
jsp-api.version=5.5.12
jsp-api-2.1.version=6.1.14
jsp-2.1.version=6.1.14
#jets3t.version=0.6.1    modify
jets3t.version=0.9.0
jetty.version=6.1.26
jetty-util.version=6.1.26
#jersey-core.version=1.8    modify
#jersey-json.version=1.8    modify
#jersey-server.version=1.8    modify
jersey-core.version=1.9
jersey-json.version=1.9
jersey-server.version=1.9
#junit.version=4.5    modify
junit.version=4.11
jdeb.version=0.8
jdiff.version=1.0.9
json.version=1.0

kfs.version=0.1

log4j.version=1.2.17
lucene-core.version=2.3.1

mockito-all.version=1.8.5
jsch.version=0.1.42

oro.version=2.0.8

rats-lib.version=0.5.1

servlet.version=4.0.6
servlet-api.version=2.5
#slf4j-api.version=1.7.5    modify
#slf4j-log4j12.version=1.7.5    modify
slf4j-api.version=1.7.10
slf4j-log4j12.version=1.7.10

wagon-http.version=1.0-beta-2
xmlenc.version=0.52
#xerces.version=1.4.4    modify
xerces.version=2.9.1

protobuf.version=2.5.0
guava.version=11.0.2
netty.version=3.6.2.Final
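
  As a rough consistency check after editing, you can verify that a few key entries in libraries.properties match jars actually shipped with your Hadoop. A small sketch (extend the property list as you like):

$ cd /home/hadoop/hadoop2x-eclipse-plugin-master
$ for p in commons-io.version slf4j-api.version htrace.version; do
>   v=$(grep "^${p}=" ivy/libraries.properties | cut -d= -f2)
>   ls /usr/local/hadoop-2.7.3/share/hadoop/common/lib/ | grep -q "$v" && echo "$p=$v OK" || echo "$p=$v NOT FOUND"
> done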

  With those modifications done, we're ready to run ant.

2.2 Run ant

  Change into src/contrib/eclipse-plugin/ and run the ant command below (if you run into permission problems, switch to root with su root):

$ cd /home/hadoop/hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/
$ su root
$ ant jar -Dhadoop.version=2.7.3 -Declipse.home=/home/hadoop/eclipse -Dhadoop.home=/usr/local/hadoop-2.7.3

  The first run is a bit slow; after that it's fast.

  When the output ends like this, the ant build succeeded:

Buildfile: /home/hadoop/hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/build.xml

check-contrib:

init:
     [echo] contrib: eclipse-plugin

init-contrib:

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = /home/hadoop/hadoop2x-eclipse-plugin-master/ivy/ivysettings.xml

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /home/hadoop/hadoop2x-eclipse-plugin-master/ivy/ivysettings.xml

compile:
     [echo] contrib: eclipse-plugin
    [javac] /home/hadoop/hadoop2x-eclipse-plugin-master/src/contrib/eclipse-plugin/build.xml:76: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
     [copy] Copying 1 file to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib
     [copy] Copying /usr/local/hadoop-2.7.3/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib/htrace-core-3.1.0-incubating.jar
     [copy] Copying 1 file to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib
     [copy] Copying /usr/local/hadoop-2.7.3/share/hadoop/common/lib/servlet-api-2.5.jar to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib/servlet-api-2.5.jar
     [copy] Copying 1 file to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib
     [copy] Copying /usr/local/hadoop-2.7.3/share/hadoop/common/lib/commons-io-2.4.jar to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/lib/commons-io-2.4.jar
      [jar] Building jar: /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.7.3.jar

BUILD SUCCESSFUL
Total time: 4 seconds

  You can then go to /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/, where the freshly built plugin hadoop-eclipse-plugin-2.7.3.jar is located.
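  Before installing it, you can peek inside the jar with the JDK's jar tool to confirm that plugin.xml and the bundled Hadoop libraries made it in:

$ cd /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/
$ jar tf hadoop-eclipse-plugin-2.7.3.jar | egrep 'plugin.xml|lib/hadoop-common|lib/htrace'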

3. Install the plugin into Eclipse's plugins directory and configure Hadoop-Eclipse-Plugin (this requires a pseudo-distributed or fully distributed Hadoop on this machine)
       You can refer to the tutorial 使用Eclipse编译运行MapReduce程序 (Compiling and running MapReduce programs with Eclipse, Hadoop 2.6.0 on Ubuntu/CentOS).

  Copy the plugin you just built into Eclipse's plugins directory.

  Then restart Eclipse, or relaunch it from the shell as shown below; starting it from the shell also lets you watch Eclipse's output and spot the cause quickly if something goes wrong.

$ cp /home/hadoop/hadoop2x-eclipse-plugin-master/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.7.3.jar /home/hadoop/eclipse/plugins/
$ /home/hadoop/eclipse/eclipse -clean 
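
  To confirm the plugin landed in the right place, you can list the plugins directory:

$ ls /home/hadoop/eclipse/plugins/ | grep hadoop-eclipse-plugin
hadoop-eclipse-plugin-2.7.3.jar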

  

 
