Building the Hadoop 1.0.3 Eclipse plugin jar

Environment: Windows 7, 32-bit

1. Edit hadoop-1.0.3\src\contrib\build-contrib.xml and add <property name="version" value="1.0.3"/> and <property name="eclipse.home" value="d:\\eclipse"/>, as sketched below.
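For orientation, the two properties go directly inside the top-level <project> element of build-contrib.xml. This is only a sketch (the file's existing attributes and contents are left untouched), and eclipse.home must point at your own Eclipse install:

    <!-- hadoop-1.0.3/src/contrib/build-contrib.xml (sketch; rest of file unchanged) -->
    <project ...>
      <!-- added: tell the contrib builds the Hadoop version and where Eclipse lives -->
      <property name="version" value="1.0.3"/>
      <property name="eclipse.home" value="d:\\eclipse"/>
      <!-- ... everything else as shipped ... -->
    </project>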


2. Run ant compile -logfile error.log from the hadoop-1.0.3 directory; it fails with:

compile-hdfs-classes:
    [javac] D:\hadoop\hadoop-1.0.3\build.xml:576: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 1 source file to D:\hadoop\hadoop-1.0.3\build\classes
    [javac] D:\hadoop\hadoop-1.0.3\build\src\org\apache\hadoop\package-info.java:5: unclosed string literal
    [javac]                          user="jackdministrator
    [javac]                               ^
    [javac] D:\hadoop\hadoop-1.0.3\build\src\org\apache\hadoop\package-info.java:6: class, interface, or enum expected
    [javac] ", date="Sat Jul  7 21:00:12     2012", url="",
    [javac] ^
    [javac] D:\hadoop\hadoop-1.0.3\build\src\org\apache\hadoop\package-info.java:6: class, interface, or enum expected
    [javac] ", date="Sat Jul  7 21:00:12     2012", url="",
    [javac]          ^
    [javac] D:\hadoop\hadoop-1.0.3\build\src\org\apache\hadoop\package-info.java:6: unclosed string literal
    [javac] ", date="Sat Jul  7 21:00:12     2012", url="",
    [javac]                                              ^
    [javac] 4 errors

 

Fix: edit D:\hadoop\hadoop-1.0.3\src\saveVersion.sh and replace user=`whoami` with a hard-coded value such as user=hadoop (sketch below).
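A minimal sketch of the edit, assuming the line sits near the top of saveVersion.sh. The errors above are presumably caused by whoami on Windows returning DOMAIN\user, whose backslash mangles the generated package-info.java; hard-coding any name without a backslash avoids that. The backticks have to go as well, since user=`hadoop` would try to execute a command named hadoop instead of assigning a literal:

    # hadoop-1.0.3/src/saveVersion.sh -- original line:
    user=`whoami`
    # replaced with a fixed literal (no command substitution, no backslash):
    user=hadoop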

 

3. Run ant again:

D:\hadoop\hadoop-1.0.3\build.xml:618: Execute failed: java.io.IOException: Cannot run program "autoreconf" (in directory "D:\hadoop\hadoop-1.0.3\src\native"): CreateProcess error=2, The system cannot find the file specified
Fix: install autoconf, automake, and libtool in Cygwin (an unattended-install sketch follows).
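If you want to script that install, the Cygwin setup program can be run unattended; a rough sketch, run from wherever the installer was downloaded (the executable name varies by Cygwin version, so double-check the flags against setup.exe --help):

    # from a cmd or Cygwin shell in the download directory
    ./setup.exe -q -P autoconf,automake,libtool    # -q: unattended mode, -P: packages to add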

 

Even after installing those packages the same error kept coming back, so I decided to do the build in a real Linux environment instead.

 

 

 

Environment: CentOS 5.6 x86_64

 

1. Edit hadoop-1.0.3/src/contrib/build-contrib.xml and add <property name="version" value="1.0.3"/> and <property name="eclipse.home" value="/download/eclipse"/>

 

2. Run ant compile -logfile error.log from the hadoop-1.0.3 directory; the same error shows up:

D:\hadoop\hadoop-1.0.3\build.xml:618: Execute failed: java.io.IOException: Cannot run program "autoreconf" (in directory "D:\hadoop\hadoop-1.0.3\src\native"): CreateProcess error=2, The system cannot find the file specified


Fix: yum install autoconf automake libtool

 

3. Recompile; this time it goes through. Then switch to the hadoop-1.0.3/src/contrib/eclipse-plugin directory and run ant compile -logfile error.log, which also passes (full command sequence sketched below).
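The whole sequence on the Linux box, as a sketch; /download/hadoop-1.0.3 is just an assumed unpack location, substitute your own:

    cd /download/hadoop-1.0.3              # root of the source tree (assumed path)
    ant compile -logfile error.log         # top-level build; passes once autoconf/automake/libtool are installed
    cd src/contrib/eclipse-plugin
    ant compile -logfile error.log         # eclipse-plugin contrib build; also passes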

 

If the build runs into the following problem:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: /home/hdpusr/workspace/hadoop-1.0.1/ivy/ivy-2.1.0.jar
      [get] Error getting http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar to /home/hdpusr/workspace/hadoop-1.0.1/ivy/ivy-2.1.0.jar

BUILD FAILED
java.net.ConnectException: Connection timed out

 

you can use the command ant compile -Doffline=true instead.

 

Before running ant jar, the jars below have to be copied into the plugin's lib directory (the target listing is shown next, followed by a copy-command sketch); otherwise, once the plugin is dropped into Eclipse, it throws exceptions such as:

An internal error occurred during: "Map/Reduce location status updater".
org/codehaus/jackson/map/JsonMappingException

[hadoop@jack lib]$ pwd
/data/soft/hadoop/build/contrib/eclipse-plugin/lib
[hadoop@jack lib]$ ls
commons-cli-1.2.jar            commons-httpclient-3.0.1.jar  hadoop-core.jar             jackson-mapper-asl-1.8.8.jar
commons-configuration-1.6.jar  commons-lang-2.4.jar          jackson-core-asl-1.8.8.jar
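A sketch of the copy step that produces the listing above. It assumes /data/soft/hadoop is the hadoop-1.0.3 release/source tree, with hadoop-core-1.0.3.jar at its root and the dependency jars under lib/; adjust paths and versions to whatever your tree actually contains:

    HADOOP_HOME=/data/soft/hadoop                             # source tree root, as in the listing above
    PLUGIN_LIB=$HADOOP_HOME/build/contrib/eclipse-plugin/lib
    mkdir -p "$PLUGIN_LIB"
    # hadoop-core: the release jar, renamed to the name shown in the listing
    cp "$HADOOP_HOME/hadoop-core-1.0.3.jar" "$PLUGIN_LIB/hadoop-core.jar"
    # the remaining dependencies, taken from the release's lib directory
    for j in commons-cli-1.2 commons-configuration-1.6 commons-httpclient-3.0.1 \
             commons-lang-2.4 jackson-core-asl-1.8.8 jackson-mapper-asl-1.8.8; do
        cp "$HADOOP_HOME/lib/$j.jar" "$PLUGIN_LIB/"
    done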

 

Cannot connect to the Map/Reduce location: hadoop
java.io.IOException: Unknown protocol to DataNode: org.apache.hadoop.mapred.JobSubmissionProtocol
    at org.apache.hadoop.hdfs.server.datanode.DataNode.getProtocolVersion(DataNode.java:1759)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)

 

[Screenshot 1]
 
 

 

4. Copy hadoop-1.0.3/hadoop-core-1.0.3.jar into the hadoop-1.0.3/build directory, then run ant jar from the hadoop-1.0.3/src/contrib/eclipse-plugin directory (sketch below).
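As a sketch, relative to the top of the hadoop-1.0.3 tree:

    cp hadoop-core-1.0.3.jar build/        # the eclipse-plugin build expects hadoop-core under build/
    cd src/contrib/eclipse-plugin
    ant jar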

 

5. The compiled hadoop-eclipse-plugin-1.0.3.jar ends up in the hadoop-1.0.3/build/contrib/eclipse-plugin directory; see the attachment.

Check the configuration file inside hadoop-eclipse-plugin-1.0.3.jar; it must reference the jars copied earlier (a rough sketch follows).
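The configuration file in question is presumably the plugin's META-INF/MANIFEST.MF: its Bundle-ClassPath entry has to list every jar that was copied into lib/, roughly along these lines (a sketch, not the literal manifest):

    Bundle-ClassPath: classes/,lib/hadoop-core.jar,lib/commons-cli-1.2.jar,
     lib/commons-configuration-1.6.jar,lib/commons-httpclient-3.0.1.jar,
     lib/commons-lang-2.4.jar,lib/jackson-core-asl-1.8.8.jar,
     lib/jackson-mapper-asl-1.8.8.jar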


[Screenshot 2]
 

After copying the jar into Eclipse (its plugins directory), open the Eclipse menu
Window --> Preferences --> Hadoop Map/Reduce and set the Hadoop installation (home) directory there.

 

 
