I downloaded a few hadoop-eclipse-plugin-1.1.2.jar builds from the web, but none of them worked properly (the plugin failed to load in Eclipse), so I built the jar myself with Ant. The steps are as follows:
1. Edit the build-contrib.xml file in the ${hadoop.root}/src/contrib directory, adding the Eclipse path and the Hadoop version:

<property name="eclipse.home" location="D:/work/installfile/eclipse/eclipse" />
<property name="version" value="1.1.2" />
2. Edit the build.xml file in the ${hadoop.root}/src/contrib/eclipse-plugin directory, adding the hadoop-core jar to the path node whose id is classpath:

<!-- Override classpath to include Eclipse SDK jars -->
<path id="classpath">
  <pathelement location="${build.classes}"/>
  <pathelement location="${hadoop.root}/build/classes"/>
  <!-- add core jar -->
  <pathelement location="${hadoop.root}/hadoop-core-1.1.2.jar"/>
  <path refid="eclipse-sdk-jars"/>
</path>
3. In the same build.xml, find the target named jar and copy the required jars into the plugin's lib directory:

<!-- Override jar target to specify manifest -->
<target name="jar" depends="compile" unless="skip.contrib">
  <mkdir dir="${build.dir}/lib"/>
  <copy file="${hadoop.root}/hadoop-core-${version}.jar" tofile="${build.dir}/lib/hadoop-core.jar" verbose="true"/>
  <copy file="${hadoop.root}/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
  <!-- add the following jars -->
  <copy file="${hadoop.root}/lib/commons-lang-2.4.jar" todir="${build.dir}/lib" verbose="true"/>
  <copy file="${hadoop.root}/lib/commons-configuration-1.6.jar" todir="${build.dir}/lib" verbose="true"/>
  <copy file="${hadoop.root}/lib/jackson-mapper-asl-1.8.8.jar" todir="${build.dir}/lib" verbose="true"/>
  <copy file="${hadoop.root}/lib/jackson-core-asl-1.8.8.jar" todir="${build.dir}/lib" verbose="true"/>
  <copy file="${hadoop.root}/lib/commons-httpclient-3.0.1.jar" todir="${build.dir}/lib" verbose="true"/>
  <jar jarfile="${build.dir}/hadoop-${name}-${version}.jar" manifest="${root}/META-INF/MANIFEST.MF">
    <fileset dir="${build.dir}" includes="classes/ lib/"/>
    <fileset dir="${root}" includes="resources/ plugin.xml"/>
  </jar>
</target>
4. Edit the Bundle-ClassPath attribute in the plugin's META-INF/MANIFEST.MF file so that it lists every jar copied into lib above:
Bundle-ClassPath: classes/,lib/hadoop-core.jar,lib/commons-cli-1.2.jar,lib/commons-configuration-1.6.jar,lib/commons-httpclient-3.0.1.jar,lib/commons-lang-2.4.jar,lib/jackson-core-asl-1.8.8.jar,lib/jackson-mapper-asl-1.8.8.jar
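Note that the jar manifest format limits each physical line to 72 bytes, so a long attribute like this one may need to be wrapped onto continuation lines, each beginning with a single space. A sketch of the wrapped form (same jar list as above):

```
Bundle-ClassPath: classes/,lib/hadoop-core.jar,lib/commons-cli-1.2.jar,
 lib/commons-configuration-1.6.jar,lib/commons-httpclient-3.0.1.jar,
 lib/commons-lang-2.4.jar,lib/jackson-core-asl-1.8.8.jar,
 lib/jackson-mapper-asl-1.8.8.jar
```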
5. From the command line, change into the ${hadoop.root}/src/contrib/eclipse-plugin directory and run the ant command to build the package.
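The build step boils down to something like the following on a machine with a JDK and Ant installed; the exact paths are assumptions based on the layout described above:

```shell
# assumes ant and a JDK are on PATH, and the Hadoop 1.1.2 source tree
# has already been patched as described in steps 1-4
cd ${hadoop.root}/src/contrib/eclipse-plugin
ant    # runs the default target, which invokes the overridden jar target
```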
6. The packaged plugin is generated in the ${hadoop.root}/build/contrib/eclipse-plugin directory; copy the hadoop-eclipse-plugin-1.1.2.jar file into Eclipse's plugins directory.
7. Restart Eclipse. Under Window -> Preferences you should now see a Hadoop Map/Reduce entry; click it and set the Hadoop installation directory.
8. Select Window -> Show View -> Other -> MapReduce Tools -> Map/Reduce Locations to open the Map/Reduce Locations view. Right-click in the view, choose New Hadoop location, and in the dialog enter a location name, e.g. hadoop.
Under Map/Reduce Master, enter the JobTracker's IP address as the host and the JobTracker's port (default 9001).
Under DFS Master, enter the NameNode's IP address as the host and the NameNode's port (default 9000).
These two settings must match the addresses and ports configured in the cluster's mapred-site.xml and core-site.xml.
In the user name field, enter the user name you use to run jobs on the Hadoop cluster.
Click Finish.
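For reference, in Hadoop 1.x the JobTracker address is set by mapred.job.tracker in mapred-site.xml and the NameNode address by fs.default.name in core-site.xml. A sketch of the corresponding cluster configuration (the hostname "master" is a placeholder):

```xml
<!-- mapred-site.xml: the Map/Reduce Master in the plugin dialog -->
<property>
  <name>mapred.job.tracker</name>
  <value>master:9001</value>
</property>

<!-- core-site.xml: the DFS Master in the plugin dialog -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://master:9000</value>
</property>
```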
9. Click DFS Locations -> hadoop. If folders are listed with a count such as (N), where N is a number, the configuration is correct; if you see "Connection refused", check your configuration.
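If the connection is refused, a quick sanity check from the Eclipse machine is to confirm that the two ports are actually reachable; a sketch, with host and ports assumed from the configuration above:

```shell
# replace "master" with your cluster's hostname or IP
telnet master 9000   # NameNode port
telnet master 9001   # JobTracker port
# or, from a machine with the hadoop client configured, list HDFS:
hadoop fs -ls /
```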
This article draws on: http://www.cnblogs.com/chenying99/archive/2013/05/09/3069228.html
http://www.cnblogs.com/flyoung2008/archive/2011/12/09/2281400.html