Setting up a Hadoop development environment on Windows 7: building the hadoop-eclipse-xxx.jar plugin

  1. Download the software

    Download hadoop-1.2.1.tar.gz; the source of the hadoop-eclipse plugin is included in this archive (https://archive.apache.org/dist/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz)

    Download apache-ant-1.9.6-bin.tar.gz, which is used to build the plugin (http://mirrors.cnnic.cn/apache//ant/binaries/apache-ant-1.9.6-bin.tar.gz)

    Download jdk-7u79-windows-x64.exe (http://125.39.35.144/files/41590000063450B0/download.oracle.com/otn-pub/java/jdk/7u79-b15/jdk-7u79-windows-x64.exe)

    Download eclipse-java-juno-SR2-win32-x86_64.zip (http://mirror.bit.edu.cn/eclipse/technology/epp/downloads/release/juno/SR2/eclipse-java-juno-SR2-win32-x86_64.zip)

  2. Install the software

    Only the JDK is installed by running its installer; Eclipse and Apache Ant just need to be unpacked and have their environment variables configured, and Hadoop only needs to be unpacked. The directories used in this article are:

    jdk:D:\Program Files\Java\jdk1.7.0_79

    hadoop:D:\hadoop-1.2.1\hadoop-1.2.1

    eclipse:D:\Program Files\eclipse-java-juno-SR2-win32-x86_64\eclipse

    apache-ant:D:\Program Files\apache-ant-1.9.6-bin\apache-ant-1.9.6

  3. Configure the environment variables

      Right-click "Computer" and choose "Properties"; the window shown below will appear

           [screenshot: the System window]

      Click "Advanced system settings"; the window shown below will appear

           [screenshot: the System Properties dialog]

       Click "Environment Variables"; the window shown below will appear

            [screenshot: the Environment Variables dialog]

       As shown above, create three new user variables (ANT_HOME  |  CLASSPATH  |  JAVA_HOME):

        ANT_HOME:D:\Program Files\apache-ant-1.9.6-bin\apache-ant-1.9.6

        CLASSPATH:.;%JAVA_HOME%\lib

        JAVA_HOME:D:\Program Files\Java\jdk1.7.0_79

       Edit the system variable "Path" (select the Path entry, click the "Edit" button below it, move the cursor to the beginning of the value, and prepend the following):

        .;D:\Program Files\apache-ant-1.9.6-bin\apache-ant-1.9.6\bin;D:\Program Files\Java\jdk1.7.0_79\bin;

       Save the changes.
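
       As a quick sanity check (these commands only assume the variables above are in effect; the exact version strings depend on your installation), open a new command prompt and run:

         java -version
         javac -version
         ant -version

       All three should report their versions; if any of them fails, re-check the variables and the Path entry.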

   4. Modify the build configuration files

        In the unpacked Hadoop directory, open src\contrib\eclipse-plugin\build.xml and change the following lines:

        Note: the additions relative to the stock build.xml are the hadoop-core-jar <path>, the <path refid="hadoop-core-jar"/> entry in the classpath, and the extra <copy> tasks for the dependency jars (the original post marked newly added lines in green and modified lines in light blue).

        <project default="jar" name="eclipse-plugin">
        <import file="../build-contrib.xml"/>
       <path id="eclipse-sdk-jars">
          <fileset dir="${eclipse.home}/plugins/">
          <include name="org.eclipse.ui*.jar"/>
          <include name="org.eclipse.jdt*.jar"/>
          <include name="org.eclipse.core*.jar"/>
          <include name="org.eclipse.equinox*.jar"/>
          <include name="org.eclipse.debug*.jar"/>
          <include name="org.eclipse.osgi*.jar"/>
          <include name="org.eclipse.swt*.jar"/>
          <include name="org.eclipse.jface*.jar"/>
          <include name="org.eclipse.team.cvs.ssh2*.jar"/>
          <include name="com.jcraft.jsch*.jar"/>
          </fileset> 
      </path>

      <path id="hadoop-core-jar">
        <fileset dir="${hadoop.root}/">
        <include name="hadoop*.jar"/>
        </fileset>
      </path>

      <!-- Override classpath to include Eclipse SDK jars -->
      <path id="classpath">
        <pathelement location="${build.classes}"/>
        <pathelement location="${hadoop.root}/build/classes"/>
        <path refid="eclipse-sdk-jars"/>

        <path refid="hadoop-core-jar"/>

      </path>

      <!-- Skip building if eclipse.home is unset. -->
      <target name="check-contrib" unless="eclipse.home">
            <property name="skip.contrib" value="yes"/>
           <echo message="eclipse.home unset: skipping eclipse plugin"/>
      </target>

     <target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
        <echo message="contrib: ${name}"/>
        <javac
         encoding="${build.encoding}"
         srcdir="${src.dir}"
         includes="**/*.java"
         destdir="${build.classes}"
         debug="${javac.debug}"
         deprecation="${javac.deprecation}">
         <classpath refid="classpath"/>
        </javac>
      </target>

      <!-- Override jar target to specify manifest -->
      <target name="jar" depends="compile" unless="skip.contrib">
        <mkdir dir="${build.dir}/lib"/>

        <copy file="${hadoop.root}/hadoop-core-${version}.jar" tofile="${build.dir}/lib/hadoop-core.jar" verbose="true"/>
        <copy file="${hadoop.root}/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>

        <copy file="${hadoop.root}/lib/commons-cli-1.2.jar"  todir="${build.dir}/lib" verbose="true"/>  
        <copy file="${hadoop.root}/lib/commons-lang-2.4.jar"  todir="${build.dir}/lib" verbose="true"/>  
        <copy file="${hadoop.root}/lib/commons-configuration-1.6.jar"  todir="${build.dir}/lib" verbose="true"/>  
        <copy file="${hadoop.root}/lib/jackson-mapper-asl-1.8.8.jar"  todir="${build.dir}/lib" verbose="true"/>  
        <copy file="${hadoop.root}/lib/jackson-core-asl-1.8.8.jar"  todir="${build.dir}/lib" verbose="true"/>  
        <copy file="${hadoop.root}/lib/commons-httpclient-3.0.1.jar"  todir="${build.dir}/lib" verbose="true"/>

         <jar
          jarfile="${build.dir}/hadoop-${name}-${version}.jar"
          manifest="${root}/META-INF/MANIFEST.MF">
          <fileset dir="${build.dir}" includes="classes/ lib/"/>
          <fileset dir="${root}" includes="resources/ plugin.xml"/>
        </jar>
      </target>

    </project>


        Edit src\contrib\build-contrib.xml so that it contains the following properties:

          <property name="version" value="1.2.1"/>
          <property name="ivy.version" value="2.1.0"/>

          <property name="eclipse.home" location="...ECLIPSE_HOME..."/>

        Note: replace ECLIPSE_HOME with your actual Eclipse installation directory; in this article that is D:\Program Files\eclipse-java-juno-SR2-win32-x86_64\eclipse
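
        With the directories used in this article, the property would read as follows (a sketch only; adjust the location to your own Eclipse directory, and note that Ant also accepts forward slashes in Windows paths):

          <property name="eclipse.home" location="D:/Program Files/eclipse-java-juno-SR2-win32-x86_64/eclipse"/>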

   5. Build the plugin

        Open a command prompt, change into the ...src\contrib\eclipse-plugin directory, and run ant. A successful build looks like the figure below

        [screenshot: ant build output ending in BUILD SUCCESSFUL]

        The built plugin jar is placed under {hadoop_home}\build\contrib\eclipse-plugin.
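
        With the directory layout from step 2, the build in a cmd window would look roughly like this (the paths are the ones used in this article; substitute your own):

          cd /d D:\hadoop-1.2.1\hadoop-1.2.1\src\contrib\eclipse-plugin
          ant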

    6. Modify the plugin

        1) Create a folder named alter_plugin on drive D (D:\alter_plugin)

        2) Copy the plugin jar built above into that folder

        3) Right-click the plugin jar and extract it with WinRAR

        4) Edit D:\alter_plugin\hadoop-eclipse-plugin-1.2.1\META-INF\MANIFEST.MF and change the following entries (in the actual manifest file, each wrapped continuation line of Bundle-ClassPath must begin with a single space):

            Eclipse-LazyStart: true
            Bundle-ClassPath: classes/,lib/hadoop-core.jar,lib/commons-cli-1.2.jar,
             lib/commons-configuration-1.6.jar,lib/commons-httpclient-3.0.1.jar,
             lib/commons-lang-2.4.jar,lib/jackson-core-asl-1.8.8.jar,lib/jackson-mapper-asl-1.8.8.jar
            Bundle-Vendor: Apache Hadoop

        5) In a cmd prompt, change into D:\alter_plugin and repackage the plugin jar:

            Copy D:\alter_plugin\hadoop-eclipse-plugin-1.2.1\META-INF\MANIFEST.MF to D:\alter_plugin\MANIFEST.MF

            Then run:  jar cvfm hadoop-eclipse-plugin-1.2.1.jar MANIFEST.MF -C hadoop-eclipse-plugin-1.2.1/ .

            This produces the finished hadoop-eclipse-plugin-1.2.1.jar in D:\alter_plugin.
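
            Put together, and assuming the folder layout above, step 5) amounts to the following cmd sequence (copy is simply the command-line form of the manual copy described above):

              cd /d D:\alter_plugin
              copy hadoop-eclipse-plugin-1.2.1\META-INF\MANIFEST.MF MANIFEST.MF
              jar cvfm hadoop-eclipse-plugin-1.2.1.jar MANIFEST.MF -C hadoop-eclipse-plugin-1.2.1/ .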

    7. Install the hadoop-eclipse plugin

        Copy the plugin jar into the plugins directory of the Eclipse installation

        In this article that is: D:\Program Files\eclipse-java-juno-SR2-win32-x86_64\eclipse\plugins
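
        If Eclipse was already running, restart it. Should the plugin not appear after a normal restart, starting Eclipse once with the -clean option (eclipse.exe -clean) forces the OSGi bundle cache to be rebuilt, which usually makes a newly dropped-in plugin visible.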

    8. Start Eclipse

        Window -> Open Perspective -> Other -> select Map/Reduce (the blue elephant icon)

        A new tab named "Map/Reduce Locations" appears next to the Eclipse console. Right-click in its empty area and choose "New Hadoop location...", as shown below:

            [screenshot: the Map/Reduce Locations view with the New Hadoop location... context menu]

        Location name (any name you like)
        Map/Reduce Master (fill in according to mapred.job.tracker in mapred-site.xml)
        DFS Master (fill in according to fs.default.name in core-site.xml)

        [screenshot: the New Hadoop location dialog with the Map/Reduce Master and DFS Master fields]
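
        For reference, the two Master entries simply mirror the cluster's own configuration. For a hypothetical single-node cluster on a host named master (the host name and ports below are only an illustration; use whatever your mapred-site.xml and core-site.xml actually contain), the relevant entries would be:

          mapred-site.xml:
            <property>
              <name>mapred.job.tracker</name>
              <value>master:9001</value>
            </property>

          core-site.xml:
            <property>
              <name>fs.default.name</name>
              <value>hdfs://master:9000</value>
            </property>

        In that case Map/Reduce Master is host master, port 9001, and DFS Master is host master, port 9000.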

        At this point the configured HDFS shows up under "Project Explorer" on the left; right-clicking it lets you create folders, delete folders, upload files, download files, and so on.

        Note: if the result of an operation does not show up in Eclipse right away, right-click and refresh.


    That completes building the hadoop-eclipse plugin and installing it into Eclipse. If you spot any mistakes in this article, corrections are welcome!

