Compiling the Hadoop Source Code on Linux

1. Configure yum

2. Install the JDK

2.1 Upload the JDK archive

2.2 Unpack the JDK

# Create the target directory

mkdir /usr/java

# Unpack

tar -zxvf jdk-7u79-linux-i586.tar.gz -C /usr/java/   # note: this is the 32-bit JDK; use the x64 tarball on a 64-bit build host

2.3 Add Java to the environment variables

vim /etc/profile

# Append at the end of the file

export JAVA_HOME=/usr/java/jdk1.7.0_79

export PATH=$JAVA_HOME/bin:$PATH

# Reload the configuration

source /etc/profile
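The two profile lines above prepend the JDK's bin directory to PATH, which is why the new `java` binary wins over any preinstalled one. The effect can be checked in a throwaway shell (a minimal sketch; `/usr/java/jdk1.7.0_79` is simply the install path assumed in this guide):

```shell
# Simulate the profile additions and show that the JDK bin dir now leads PATH.
JAVA_HOME=/usr/java/jdk1.7.0_79
PATH=$JAVA_HOME/bin:$PATH
echo "$PATH" | cut -d: -f1    # prints /usr/java/jdk1.7.0_79/bin
```

Because the JDK directory is first, `which java` will resolve to it from now on.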


3. Build Hadoop 2.7.0 from source

3.1 Upload the Hadoop 2.7.0 source archive


3.2 Install the build dependencies

yum -y install svn ncurses-devel gcc* lzo-devel zlib-devel autoconf automake libtool openssl-devel


3.3 Install Ant and configure its environment variables

mkdir /usr/ant

tar -zxvf apache-ant-1.9.4-bin.tar.gz -C /usr/ant

vim /etc/profile

ANT_HOME=/usr/ant/apache-ant-1.9.4

export PATH=$PATH:$ANT_HOME/bin

source  /etc/profile

ant -version    # verify the installation


3.4 Install FindBugs and configure its environment variables

mkdir /usr/findbugs

tar -zxvf findbugs-3.0.0.tar.gz -C /usr/findbugs/

vim /etc/profile

FINDBUGS_HOME=/usr/findbugs/findbugs-3.0.0

export PATH=$PATH:$FINDBUGS_HOME/bin

source /etc/profile


3.5 Install protobuf

tar -zxvf protobuf-2.5.0.tar.gz

cd protobuf-2.5.0

./configure --prefix=/usr/local

make && make install

protoc --version    # verify the installation


3.6 Install Maven and configure its environment variables

mkdir /usr/maven

tar -zxvf apache-maven-3.2.3-bin.tar.gz -C /usr/maven/

vim /etc/profile

MAVEN_HOME=/usr/maven/apache-maven-3.2.3

export PATH=$PATH:$MAVEN_HOME/bin

source /etc/profile

mvn -version    # verify the installation
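Each of the tools above is verified with its own `-version` flag. A small reusable helper (hypothetical name `check`, not part of the original guide) that only tests whether a command resolves on PATH can be run after every install step instead:

```shell
# Print "<name>: ok" or "<name>: missing" depending on whether the command
# is found on PATH. Works for ant, findbugs, protoc, mvn alike.
check() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "$1: ok"
    else
        echo "$1: missing"
    fi
}

check sh     # prints "sh: ok" on any POSIX system
check mvn    # expected "mvn: ok" once this section is complete
```

`command -v` is the portable way to probe PATH; unlike `which`, it is specified by POSIX and its exit status is reliable across distributions.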



3.7 Install CMake

Download: http://www.cmake.org/cmake/resources/software.html

Prerequisites

g++ and ncurses-devel must already be installed on the system; if not, install them with the commands below:

[root@admin /]# yum install gcc-c++

[root@admin /]# yum install ncurses-devel


Upload cmake-2.8.10.2.tar.gz to /usr/local and run the following:

[root@admin local]# cd /usr/local

[root@admin local]# tar -zxv -f cmake-2.8.10.2.tar.gz   # unpack the archive

[root@admin local]# rm -f cmake-2.8.10.2.tar.gz   # remove the archive

[root@admin local]# cd cmake-2.8.10.2

[root@admin cmake-2.8.10.2]# ./configure

[root@admin cmake-2.8.10.2]# make

[root@admin cmake-2.8.10.2]# make install

[root@admin local]# mv cmake-2.8.10.2 cmake   # rename the directory


Add the environment variables

Use vi to add the variables to /etc/profile so they take effect permanently:

[root@admin local]# vi /etc/profile   # edit the environment variables


Append the following two lines at the end of the file:

CMAKE_HOME=/usr/local/cmake

export PATH=$CMAKE_HOME/bin:$PATH


Then run:

[root@admin local]# source /etc/profile   # apply the changes


Verify the CMake installation

[root@admin local]# cmake --version

cmake version 2.8.10.2


3.8 Build Hadoop 2.7.0

tar -zxvf hadoop-2.7.0-src.tar.gz -C /usr/

cd /usr/hadoop-2.7.0-src

# Set Maven's JVM memory before building (MaxPermSize is a JDK 7 option)

export MAVEN_OPTS="-Xmx512m -XX:MaxPermSize=128m"

# Then build

mvn clean package -Pdist,native -DskipTests -Dtar

When the build completes, Maven prints a reactor summary like the one below (this sample log happens to come from a 2.7.1 build):

[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---

[INFO] Building jar: /opt/hadoop2/hadoop-dist/target/hadoop-dist-2.7.1-javadoc.jar

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Summary:

[INFO] 

[INFO] Apache Hadoop Main ................................. SUCCESS [ 30.630 s]

[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 13.872 s]

[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 49.006 s]

[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  2.264 s]

[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  7.693 s]

[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 22.643 s]

[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 24.459 s]

[INFO] Apache Hadoop Auth ................................. SUCCESS [ 33.211 s]

[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 15.207 s]

[INFO] Apache Hadoop Common ............................... SUCCESS [11:49 min]

[INFO] Apache Hadoop NFS .................................. SUCCESS [ 40.371 s]

[INFO] Apache Hadoop KMS .................................. SUCCESS [01:00 min]

[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.229 s]

[INFO] Apache Hadoop HDFS ................................. SUCCESS [15:13 min]

[INFO] Apache Hadoop HttpFS ............................... SUCCESS [02:58 min]

[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 59.677 s]

[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 27.884 s]

[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.319 s]

[INFO] hadoop-yarn ........................................ SUCCESS [  0.288 s]

[INFO] hadoop-yarn-api .................................... SUCCESS [07:33 min]

[INFO] hadoop-yarn-common ................................. SUCCESS [04:53 min]

[INFO] hadoop-yarn-server ................................. SUCCESS [  2.889 s]

[INFO] hadoop-yarn-server-common .......................... SUCCESS [01:11 min]

[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [01:28 min]

[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 17.829 s]

[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 46.193 s]

[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [02:04 min]

[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 52.011 s]

[INFO] hadoop-yarn-client ................................. SUCCESS [ 36.440 s]

[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [ 17.792 s]

[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.328 s]

[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 15.388 s]

[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  9.539 s]

[INFO] hadoop-yarn-site ................................... SUCCESS [  0.274 s]

[INFO] hadoop-yarn-registry ............................... SUCCESS [ 34.365 s]

[INFO] hadoop-yarn-project ................................ SUCCESS [ 22.616 s]

[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.617 s]

[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [01:59 min]

[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [01:40 min]

[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 31.093 s]

[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 43.278 s]

[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 31.668 s]

[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 47.833 s]

[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  9.745 s]

[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 27.885 s]

[INFO] hadoop-mapreduce ................................... SUCCESS [ 14.473 s]

[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 33.683 s]

[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 48.133 s]

[INFO] Apache Hadoop Archives ............................. SUCCESS [ 11.727 s]

[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 32.375 s]

[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 18.502 s]

[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 14.527 s]

[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  6.478 s]

[INFO] Apache Hadoop Extras ............................... SUCCESS [ 15.328 s]

[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 32.670 s]

[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 22.951 s]

[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [02:10 min]

[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 37.904 s]

[INFO] Apache Hadoop Client ............................... SUCCESS [ 50.257 s]

[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.188 s]

[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 31.386 s]

[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 36.401 s]

[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.142 s]

[INFO] Apache Hadoop Distribution ......................... SUCCESS [03:02 min]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 01:17 h

[INFO] Finished at: 2015-11-11T16:41:59-08:00

[INFO] Final Memory: 110M/494M



The build output ends up under the source tree's hadoop-dist/target/ directory:

Because the build ran on a 64-bit system, the resulting package is 64-bit: either the hadoop-2.7.0 directory or hadoop-2.7.0.tar.gz under hadoop-dist/target can be copied to other 64-bit Linux machines to set up a Hadoop cluster.


Note: the official release does not ship a prebuilt 64-bit package, while production environments run almost exclusively on 64-bit machines, so the distribution must be built and published yourself.
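Whether the native libraries really came out 64-bit can be confirmed straight from the ELF header: byte 5 (EI_CLASS) is 1 for a 32-bit object and 2 for a 64-bit one. The sketch below demonstrates the check against a hand-written stand-in header; on the build machine, point it at lib/native/libhadoop.so.1.0.0 inside the dist directory instead:

```shell
# EI_CLASS is the 5th byte of an ELF file: 1 = 32-bit, 2 = 64-bit.
# /tmp/fake_elf is a stand-in written here for demonstration; replace it
# with the real path to libhadoop.so to check the actual build output.
printf '\177ELF\002' > /tmp/fake_elf
cls=$(od -An -j4 -N1 -tu1 /tmp/fake_elf | tr -d ' ')
if [ "$cls" = "2" ]; then
    echo "64-bit"
else
    echo "32-bit"
fi
```

The same information is reported by `file libhadoop.so.1.0.0`; the raw-byte version above only relies on `od`, which is available on any POSIX system.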

