Hadoop 2.6.4 64-bit Compilation

Table of Contents

  • 1. Problems encountered during compilation
  • 2. Installing the prerequisites
    • 2.1 Installing JDK 1.7
    • 2.2 Installing and configuring Maven
    • 2.3 Installing Ant
    • 2.4 Installing protobuf
    • 2.5 Installing dependency packages
    • 2.6 Downloading the Hadoop source
  • 3. Compiling Hadoop
    • 3.1 Compiling Hadoop
    • 3.2 Verifying the build

64-bit Compilation of Hadoop 2.X

This article uses the 64-bit compilation of Hadoop 2.6.4 as an example; the virtual machine runs CentOS. The goal is to get rid of the warning printed every time a hadoop command is executed:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable

1. Problems encountered during compilation

Let's start with the problems that came up along the way.

1. Since the system had JDK 1.8 installed, the build failed with:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar (module-javadocs) on project hadoop-annotations: MavenReportException: Error while creating archive:
[ERROR] Exit code: 1 - /app/compile/hadoop-2.6.4-src/hadoop-common-project/hadoop-annotations/src/main/java/org/apache/hadoop/classification/InterfaceStability.java:27: error: unexpected end tag: 
[ERROR] * 
[ERROR] ^
[ERROR] 
[ERROR] Command line was: /usr/local/jdk/jre/../bin/javadoc @options @packages
[ERROR] 
[ERROR] Refer to the generated Javadoc files in '/app/compile/hadoop-2.6.4-src/hadoop-common-project/hadoop-annotations/target' dir.

Still on JDK 1.8, I tried removing the offending end tag from org/apache/hadoop/classification/InterfaceStability.java (line 27) and rebuilding, but further errors followed:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:2.8.1:jar (module-javadocs) on project hadoop-nfs: MavenReportException: Error while creating archive:
[ERROR] Exit code: 1 - /app/compile/hadoop-2.6.4-src/hadoop-common-project/hadoop-nfs/src/main/java/org/apache/hadoop/mount/MountdBase.java:51: warning: no description for @param
[ERROR] * @param program
[ERROR] ^
[ERROR] /app/compile/hadoop-2.6.4-src/hadoop-common-project/hadoop-nfs/src/main/java/org/apache/hadoop/mount/MountdBase.java:52: warning: no description for @throws
[ERROR] * @throws IOException
[ERROR] ^

The JDK therefore has to be switched to 1.7: compiling Hadoop 2.6.4 should be done in a JDK 1.7 environment.

2. The build needs to be run as the root account; otherwise, permission errors such as the following appear:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (create-testdirs) on project hadoop-project: Error executing ant tasks: /app/compile/hadoop-2.6.4-src/hadoop-project/target/antrun/build-main.xml (Permission denied) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.

2. Installing the prerequisites

2.1 Installing JDK 1.7

This machine uses JDK 1.7.0_79; see "Installing JDK on Linux" for the detailed steps.
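
As a minimal sketch (the paths are assumptions: the Oracle JDK 7u79 tarball unpacked and renamed to /usr/local/jdk, which matches the Java home reported by mvn -version below):

tar -zxvf jdk-7u79-linux-x64.tar.gz -C /usr/local/
mv /usr/local/jdk1.7.0_79 /usr/local/jdk    # rename so JAVA_HOME does not carry the version number
vi /etc/profile
export JAVA_HOME=/usr/local/jdk
export PATH=$PATH:$JAVA_HOME/bin
source /etc/profile
java -version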

2.2 Installing and configuring Maven

Download a Maven binary package (version 3.0 or later is recommended); this machine uses the Maven 3.3.9 binary distribution.

tar -zxvf apache-maven-3.3.9-bin.tar.gz -C /usr/local/
vi /etc/profile
export MAVEN_HOME=/usr/local/apache-maven-3.3.9
export PATH=$PATH:$MAVEN_HOME/bin
source /etc/profile

[root@cyyun lib]# mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /usr/local/apache-maven-3.3.9
Java version: 1.7.0_79, vendor: Oracle Corporation
Java home: /usr/local/jdk/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"

Maven is now installed successfully.
Next, configure Maven's settings.xml (typically $MAVEN_HOME/conf/settings.xml or ~/.m2/settings.xml).
Pointing it at the Aliyun Maven mirror greatly speeds up dependency downloads, since the Central repository is hosted overseas; the mirror entry below goes inside the <mirrors> element:


<mirror>
	<id>nexus-aliyun</id>
	<mirrorOf>*</mirrorOf>
	<name>Nexus aliyun</name>
	<url>http://maven.aliyun.com/nexus/content/groups/public</url>
</mirror>

2.3 Installing Ant

Download Ant 1.9.8; the 1.10.x releases require JDK 1.8, so they are not used here.

tar -zxvf apache-ant-1.9.8-bin.tar.gz -C /usr/local/
vi /etc/profile
export ANT_HOME=/usr/local/apache-ant-1.9.8
export PATH=$PATH:$ANT_HOME/bin
source /etc/profile

[root@cyyun ~]# ant -version
Apache Ant(TM) version 1.9.8 compiled on December 25 2016

Ant is now installed successfully.

2.4 Installing protobuf

See "Downloading and installing protobuf-2.5.0" for reference.

Hadoop uses Protocol Buffers for communication, so protobuf-2.5.0.tar.gz must be downloaded, built, and installed. Since protobuf-2.5.0.tar.gz can no longer be downloaded from the official site, I have uploaded it to Baidu Cloud for anyone who needs it: http://pan.baidu.com/s/1pJlZubT

tar -zxvf protobuf-2.5.0.tar.gz -C /usr/local/
cd /usr/local/protobuf-2.5.0
./configure

make
make check
make install
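
This installs under /usr/local by default. If protoc later fails with "error while loading shared libraries" (a possible CentOS pitfall, not something that happened in this run), a common fix is to tell the dynamic linker about /usr/local/lib; the file name below is arbitrary:

echo "/usr/local/lib" > /etc/ld.so.conf.d/protobuf.conf   # arbitrary file name under ld.so.conf.d
ldconfig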

Verify the installation:

[root@cyyun ~]# protoc --version
libprotoc 2.5.0

protobuf is now installed successfully.

2.5 Installing dependency packages

yum install autoconf automake libtool cmake ncurses-devel openssl-devel gcc gcc-c++ svn

svn may not actually be required, but installing it does no harm.

2.6 Downloading the Hadoop source

Download the Hadoop 2.6.4 source package, hadoop-2.6.4-src.tar.gz, and extract it:

tar -zxvf hadoop-2.6.4-src.tar.gz -C /app/compile/
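
Note that tar -C requires the target directory to exist; if /app/compile has not been created yet, create it first:

mkdir -p /app/compile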

3. Compiling Hadoop

3.1 Compiling Hadoop

Run the following from the root of the Hadoop source tree:

cd /app/compile/hadoop-2.6.4-src/

mvn package -Pdist,native -DskipTests -Dtar
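
If the build aborts with an out-of-memory error (it did not here), the usual remedy, also suggested in Hadoop's BUILDING.txt, is to give Maven more heap before retrying:

export MAVEN_OPTS="-Xms256m -Xmx512m"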

After a long wait, the build completes successfully:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [ 11.888 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  5.357 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  7.229 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.324 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  4.416 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  7.496 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  6.621 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  7.012 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  4.873 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [03:20 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 27.053 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 18.815 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.546 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [08:35 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 59.205 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 23.495 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 13.669 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.240 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.295 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:59 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [01:09 min]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.533 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 31.774 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 57.158 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  8.371 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  9.341 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [01:13 min]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 16.459 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 12.375 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.122 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  3.845 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  2.000 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.112 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [  9.664 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  8.961 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.317 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 39.935 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 36.232 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 11.957 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 13.317 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 13.429 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 10.005 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  2.039 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 10.499 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 13.785 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 10.537 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 52.092 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  5.572 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  9.371 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  6.367 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  4.211 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.762 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  3.921 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  4.534 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  9.314 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 14.693 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 17.670 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  6.587 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 22.836 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 14.913 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.276 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:31 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 29:36 min
[INFO] Finished at: 2017-01-11T02:04:55+08:00
[INFO] Final Memory: 101M/237M
[INFO] ------------------------------------------------------------------------

A target directory is generated under /app/compile/hadoop-2.6.4-src/hadoop-dist; the compiled hadoop-2.6.4.tar.gz distribution package is in that directory.

3.2 Verifying the build

cd /app/compile/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native

[root@cyyun native]# file *
libhadoop.a:        current ar archive
libhadooppipes.a:   current ar archive
libhadoop.so:       symbolic link to `libhadoop.so.1.0.0'
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
libhadooputils.a:   current ar archive
libhdfs.a:          current ar archive
libhdfs.so:         symbolic link to `libhdfs.so.0.0.0'
libhdfs.so.0.0.0:   ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped

In this directory, check the attributes of libhadoop.so.1.0.0: "ELF 64-bit LSB" confirms that the library was successfully compiled as 64-bit.

If you already have a Hadoop installation that still uses the stock (non-recompiled) binaries, simply copy everything under the newly built /app/compile/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native directory into the corresponding lib/native directory of that installation, as sketched below.
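
A minimal sketch of the copy, assuming the existing installation lives at /usr/local/hadoop (a hypothetical path; adjust it to your actual install directory):

cp -r /app/compile/hadoop-2.6.4-src/hadoop-dist/target/hadoop-2.6.4/lib/native/* /usr/local/hadoop/lib/native/

Before the native libraries are replaced, running a hadoop command still prints the warning: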

[root@cyyun hadoop-2.6.4]# hadoop fs -ls /
17/01/11 03:52:30 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 9 items
-rw-r--r--   1 root supergroup         25 2016-12-20 16:44 /1.txt
-rw-r--r--   1 root supergroup       1366 2016-11-04 17:36 /README.txt
drwxr-xr-x   - root supergroup          0 2016-11-21 13:47 /flow
drwx------   - root supergroup          0 2016-11-21 11:53 /history
drwxr-xr-x   - root supergroup          0 2017-01-04 22:47 /input
drwxr-xr-x   - root supergroup          0 2017-01-04 23:00 /output
drwxr-xr-x   - root supergroup          0 2016-12-24 01:33 /test
drwxr-xr-x   - root supergroup          0 2017-01-09 16:50 /tmp
drwxr-xr-x   - root supergroup          0 2017-01-09 16:50 /user

After the files are copied in and replaced, the warning disappears:

[root@cyyun native]# hadoop fs -ls /
Found 9 items
-rw-r--r--   1 root supergroup         25 2016-12-20 16:44 /1.txt
-rw-r--r--   1 root supergroup       1366 2016-11-04 17:36 /README.txt
drwxr-xr-x   - root supergroup          0 2016-11-21 13:47 /flow
drwx------   - root supergroup          0 2016-11-21 11:53 /history
drwxr-xr-x   - root supergroup          0 2017-01-04 22:47 /input
drwxr-xr-x   - root supergroup          0 2017-01-04 23:00 /output
drwxr-xr-x   - root supergroup          0 2016-12-24 01:33 /test
drwxr-xr-x   - root supergroup          0 2017-01-09 16:50 /tmp
drwxr-xr-x   - root supergroup          0 2017-01-09 16:50 /user
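
As an additional check (optional, not part of the original run), hadoop checknative reports which native libraries the installation actually loads:

hadoop checknative -a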

References:
64-bit compilation of Hadoop 2.X
Hadoop 2.6.4 installation and compilation
Hadoop 2.6.4 compilation
Compiling Hadoop 2.2.0 on CentOS 6.4
