Compiling Hadoop 2.6.5 from Source on CentOS 7 (Successful Build)

1. Build Environment

I'm working on a Mac, with CentOS 7 (a minimal install) running in VMware.

2. Required Packages

1) First, download the Hadoop source package, either from the official website or from my Baidu Cloud drive.
2) Once downloaded, extract it and open the BUILDING.txt file, which spells out the build prerequisites and caveats in detail:

Build instructions for Hadoop

----------------------------------------------------------------------------------
Requirements:

* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes )
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

----------------------------------------------------------------------------------

As the instructions show, building Hadoop needs the JDK, Maven, Findbugs, ProtocolBuffer, CMake, Zlib devel, and openssl devel. Each tool can be downloaded from its official website; for convenience, they are also available in my Baidu Cloud drive (though only in the versions my Hadoop build required).


3. Preparing the Build Environment

3.1 Installing the JDK

First, run java -version to check whether a JDK is already installed; if so, you can skip this step.
1. Upload jdk-7u45-linux-x64.tar.gz to the virtual machine via sftp, using put <local file path> <remote destination path>:

put /Users/mr.gcy/Desktop/jdk-7u45-linux-x64.tar.gz  /home/hadoop/software/
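
For completeness, a minimal sketch of the full sftp session (the host IP and username are placeholders; substitute your VM's values):

# Open an sftp session from the Mac to the VM (IP and user are hypothetical)
sftp hadoop@192.168.1.100

# Inside the session: upload the archive, then quit
put /Users/mr.gcy/Desktop/jdk-7u45-linux-x64.tar.gz /home/hadoop/software/
bye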

2. Extract it with tar -zxvf jdk-7u45-linux-x64.tar.gz -C <output directory>. (Archives ending in .tar.gz need the "z" flag; plain .tar archives don't: tar -xvf xxx.tar.)

tar -zxvf jdk-7u45-linux-x64.tar.gz -C /home/hadoop/app

3. Add it to the environment variables:

#1. Open the file
vim /etc/profile
#2. Add the configuration
export JAVA_HOME=/home/hadoop/app/jdk1.7.0_45
export PATH=$PATH:$JAVA_HOME/bin
#3. Reload the configuration
source /etc/profile

4. Test it.
Run java -version; output like the following means it worked:

java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)
3.2 Installing Maven

Same procedure as above:

put /Users/mr.gcy/Desktop/apache-maven-3.5.4-bin.tar  /home/hadoop/software/

tar -xvf apache-maven-3.5.4-bin.tar -C /home/hadoop/app

export MAVEN_HOME=/home/hadoop/app/apache-maven-3.5.4
export PATH=$PATH:$MAVEN_HOME/bin

source /etc/profile 

Test:
Verify with the mvn -v command:

Apache Maven 3.5.4 (1edded0938998edf8bf061f1ceb3cfdeccf443fe; 2018-06-17T14:33:14-04:00)
Maven home: /home/hadoop/app/apache-maven-3.5.4
Java version: 1.7.0_45, vendor: Oracle Corporation, runtime: /home/hadoop/app/jdk1.7.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-693.el7.x86_64", arch: "amd64", family: "unix"
3.3 Installing Ant
put /Users/mr.gcy/Desktop/apache-ant-1.10.5-bin.tar  /home/hadoop/software/

tar -xvf apache-ant-1.10.5-bin.tar -C /home/hadoop/app

export ANT_HOME=/home/hadoop/app/apache-ant-1.10.5
export PATH=$PATH:$ANT_HOME/bin

source /etc/profile 

Test:
Verify with the ant -version command.

Problem encountered

Running ant reported the following error:

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/tools/ant/launch/Launcher : Unsupported major.minor version 52.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)

Some searching online suggested that apache-ant-1.10.5 is too new for my JDK: Ant 1.10 is compiled for Java 8 (class file version 52.0), while my JDK is 1.7, hence the UnsupportedClassVersionError. I downloaded apache-ant-1.9.13-bin.tar from the official site instead and repeated the steps above.
Verifying again:

Apache Ant(TM) version 1.9.13 compiled on July 10 2018

That fixed it.

3.4 Installing Findbugs
put /Users/mr.gcy/Desktop/findbugs-3.0.1.tar  /home/hadoop/software/

tar -xvf findbugs-3.0.1.tar -C /home/hadoop/app

export FINDBUGS_HOME=/home/hadoop/app/findbugs-3.0.1
export PATH=$PATH:$FINDBUGS_HOME/bin

source /etc/profile 

Test:
Verify with the findbugs -version command:

3.0.1
3.5 Installing protobuf

protobuf ships as source, so it has to be compiled:

put /Users/mr.gcy/Desktop/protobuf-2.5.0.tar  /home/hadoop/software/

tar -xvf protobuf-2.5.0.tar -C /home/hadoop/app

cd protobuf-2.5.0

#Note: run as root
./configure

make 

make check

make install
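
One extra step worth knowing about (a common gotcha with protobuf installs on CentOS, not something specific to this build): make install places the libraries under /usr/local/lib, which is not on the dynamic linker's default search path on CentOS, so protoc may later fail with "error while loading shared libraries". A usual fix, run as root:

# Tell the dynamic linker about /usr/local/lib, then refresh its cache
echo "/usr/local/lib" > /etc/ld.so.conf.d/local.conf
ldconfig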

Errors may crop up along the way because some C and C++ libraries aren't installed; they can be added with yum.

Error 1

configure: error: C++ preprocessor "/lib/cpp" fails sanity check

Solution: this happens because the C++ compiler package isn't installed. Log in as root and run:
yum install gcc-c++
Error 2

configure: error: no acceptable C compiler found in $PATH

Solution: yum install gcc

Test:
Verify with the protoc --version command:

libprotoc 2.5.0
3.6 Installing Remaining Dependencies

Install the cmake, openssl-devel, and ncurses-devel packages (as root, with a working internet connection):

yum install cmake 

yum install openssl-devel 

yum install ncurses-devel
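
Before kicking off the build, a quick sanity check that every tool is on the PATH (a minimal sketch; the versions in the comments are the ones used above):

java -version      # 1.7.0_45
mvn -v             # Apache Maven 3.5.4
ant -version       # Apache Ant 1.9.13
findbugs -version  # 3.0.1
protoc --version   # libprotoc 2.5.0
cmake --version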

With that, the basic build environment is in place. Now we can compile the Hadoop source.

4. Compiling the Source

cd into the extracted hadoop-2.6.5-src directory.
A single command does the job:

mvn package -Pdist,native -DskipTests
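
If you also want a ready-to-ship tarball, Hadoop's BUILDING.txt additionally documents a -Dtar flag (I didn't use it in this build, so treat it as a pointer rather than a tested step):

mvn package -Pdist,native -DskipTests -Dtar
# Should produce hadoop-dist/target/hadoop-2.6.5.tar.gz alongside the unpacked dist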

The output of a successful run:

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /home/hadoop/software/hadoop-2.6.5-src/hadoop-dist/target/hadoop-dist-2.6.5-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main 2.6.5 ........................... SUCCESS [  7.019 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  4.556 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  2.726 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  7.971 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.597 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  1.873 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 12.132 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 13.992 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  9.930 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  4.390 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [02:05 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 16.985 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 13.119 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.065 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [05:21 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [04:10 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [01:18 min]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  5.528 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.050 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.064 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:44 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 38.664 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.107 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 16.402 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [01:47 min]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  6.420 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 10.122 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 30.504 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  9.103 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 11.551 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.111 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  4.423 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  2.741 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.074 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [  8.325 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  4.260 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.236 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 31.826 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 23.417 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  6.378 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 14.202 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 12.333 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 22.664 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  2.819 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  9.880 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  3.038 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 15.481 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 20.245 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  3.342 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  9.251 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  6.322 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  4.324 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.920 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  4.198 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  1.813 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  7.628 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [  7.970 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  7.313 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  2.070 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  7.857 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  6.261 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.041 s]
[INFO] Apache Hadoop Distribution 2.6.5 ................... SUCCESS [ 14.317 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:49 min
[INFO] Finished at: 2018-07-26T15:24:27-04:00
[INFO] ------------------------------------------------------------------------

The build artifacts then appear under /home/hadoop/software/hadoop-2.6.5-src/hadoop-dist/target/.
The contents of that directory after a successful build are shown below; hadoop-2.6.5 is the final distribution:

drwxr-xr-x. 2 root root        28 Jul 26 11:35 antrun
drwxr-xr-x. 3 root root        22 Jul 26 11:35 classes
-rw-r--r--. 1 root root      1878 Jul 26 15:24 dist-layout-stitching.sh
drwxr-xr-x. 9 root root       149 Jul 26 15:24 hadoop-2.6.5
-rw-r--r--. 1 root root     26207 Jul 26 15:24 hadoop-dist-2.6.5.jar
-rw-r--r--. 1 root root 184511395 Jul 26 15:24 hadoop-dist-2.6.5-javadoc.jar
-rw-r--r--. 1 root root     23761 Jul 26 15:24 hadoop-dist-2.6.5-sources.jar
-rw-r--r--. 1 root root     23761 Jul 26 15:24 hadoop-dist-2.6.5-test-sources.jar
drwxr-xr-x. 2 root root        51 Jul 26 15:24 javadoc-bundle-options
drwxr-xr-x. 2 root root        28 Jul 26 15:24 maven-archiver
drwxr-xr-x. 3 root root        22 Jul 26 11:35 maven-shared-archive-resources
drwxr-xr-x. 3 root root        22 Jul 26 11:35 test-classes
drwxr-xr-x. 2 root root         6 Jul 26 11:35 test-dir

5. Errors Encountered During Compilation

The build is unlikely to sail through on the first attempt; plenty of problems are waiting to be solved.

5.1 Error 1
[ERROR] Unresolveable build extension: Plugin org.apache.felix:maven-bundle-plugin:2.4.0 or one of its dependencies could not be resolved: The following artifacts could not be resolved: biz.aQute.bnd:bndlib:jar:2.1.0, org.osgi:org.osgi.core:jar:4.2.0, 
org.apache.felix:org.apache.felix.bundlerepository:jar:1.6.6, org.easymock:easymock:jar:2.4, org.codehaus.plexus:plexus-interpolation:jar:1.15, org.apache.maven.shared:maven-dependency-tree:jar:2.1, org.codehaus.plexus:plexus-component-annotations:jar:1.5.5, org.eclipse.aether:aether-util:jar:0.9.0.M2: Could not transfer artifact biz.aQute.bnd:bndlib:jar:2.1.0 from/to central (http://repo.maven.apache.org/maven2): Connection to http://repo.maven.apache.org refused: Connection timed out -> [Help 2]
repo.maven.apache.org timeout

This is because Maven's central repository is hosted overseas; without a proxy, the downloads time out. To work around it, add a mirror in /home/hadoop/app/apache-maven-3.5.4/conf/settings.xml.
I found two mirrors:


<mirror>
  <id>alimaven</id>
  <name>aliyun maven</name>
  <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
  <mirrorOf>central</mirrorOf>
</mirror>

<mirror>
  <id>UK</id>
  <name>UK Central</name>
  <url>http://uk.maven.org/maven2</url>
  <mirrorOf>central</mirrorOf>
</mirror>
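
For context, a minimal sketch of where these entries sit inside settings.xml (everything other than the mirror itself is the standard Maven settings skeleton):

<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <mirrors>
    <!-- paste one (or both) of the mirror entries above here -->
    <mirror>
      <id>alimaven</id>
      <name>aliyun maven</name>
      <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
      <mirrorOf>central</mirrorOf>
    </mirror>
  </mirrors>
</settings>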

I started with the UK mirror. Partway through the build something failed to download, and thinking it was the mirror I switched to the Aliyun one, but the same failure recurred. It worked out in the end: some of Hadoop's dependencies simply download very slowly and time out, so the answer is to retry a few times.

5.2 Error 2
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (create-testdirs) on project hadoop-project: Error executing ant tasks: /app/compile/hadoop-2.6.4-src/hadoop-project/target/antrun/build-main.xml (Permission denied) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.

This happened because the build wasn't run as root, so it couldn't write to the target directory. Switch to the root account and build again.
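
A minimal sketch of the fix (alternatively, chown-ing the source tree to the build user should work as well, though I didn't try it):

# Switch to root, return to the source tree, and rebuild
su -
cd /home/hadoop/software/hadoop-2.6.5-src
mvn package -Pdist,native -DskipTests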

5.3 Error 3
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-dist: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...... @ 38:108 in /home/hadoop/software/hadoop-2.6.5-src/hadoop-dist/target/antrun/build-main.xml
[ERROR] -> [Help 1]

This one surfaced while building Apache Hadoop KMS: the Tomcat download kept failing.
Solution:
1. Download Tomcat manually. The exact version the build wants, together with its download URL, is listed in /hadoop-2.6.5-src/hadoop-common-project/hadoop-kms/target/antrun/build-main.xml (see the sketch after this list).
2. Place the downloaded archive in the /hadoop-2.6.5-src/hadoop-common-project/hadoop-kms/downloads directory.
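
A sketch of those two steps from the shell (the Tomcat version and URL here are illustrative placeholders; use whatever the grep actually prints):

# From the top of the source tree: find the required Tomcat version and URL
grep -i tomcat hadoop-common-project/hadoop-kms/target/antrun/build-main.xml

# Fetch the matching archive into the downloads directory
# (6.0.44 is an example version, not a confirmed one)
mkdir -p hadoop-common-project/hadoop-kms/downloads
wget -P hadoop-common-project/hadoop-kms/downloads \
  https://archive.apache.org/dist/tomcat/tomcat-6/v6.0.44/bin/apache-tomcat-6.0.44.tar.gz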

5.4 Error 4

This kind of error shows up often:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-dist: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...... @ 38:108 in /home/hadoop/software/hadoop-2.6.5-src/hadoop-dist/target/antrun/build-main.xml
[ERROR] -> [Help 1]

It is most likely a network issue: some libraries failed to download mid-build. Just rerun the build a few more times.
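
If retrying by hand gets tedious, a crude retry loop works too (a sketch, assuming you are already in the source directory as root):

# Rerun the build until it succeeds; already-downloaded artifacts are
# cached locally, so each retry gets a little further
until mvn package -Pdist,native -DskipTests; do
  echo "Build failed, retrying in 10s..."
  sleep 10
done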

The whole process took a day and a half; it was rough going in places, but the final result was worth it.
References
https://blog.csdn.net/ilovemilk/article/details/44465487
https://blog.csdn.net/tototuzuoquan/article/details/72777190
