1. Download hadoop-2.7.1-src.tar.gz from the Apache Hadoop website
tar -zxvf hadoop-2.7.1-src.tar.gz
cd hadoop-2.7.1-src
vi BUILDING.txt
Requirements:
* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Jansson C XML parsing library (if compiling libwebhdfs)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
2. Install JDK 1.8.0_60
Download jdk-8u60-linux-x64.tar.gz, extract it, and move it to /opt:
tar -zxvf jdk-8u60-linux-x64.tar.gz
mv jdk1.8.0_60 /opt
Then open /etc/profile to configure the JDK environment variables:
vim /etc/profile
Press i to enter insert mode and append the following at the end of the file:
export JAVA_HOME=/opt/jdk1.8.0_60
export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin
export JRE_HOME=/opt/jdk1.8.0_60/jre
export PATH=$PATH:$JRE_HOME/bin
Press Esc, type :wq, and press Enter to save and exit. Then run source /etc/profile to apply the changes.
Run java -version to check the installation:
[root@hadoop1 opt]# java -version
java version "1.8.0_60"
Java(TM) SE Runtime Environment (build 1.8.0_60-b27)
Java HotSpot(TM) 64-Bit Server VM (build 25.60-b23, mixed mode)
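As an alternative to editing in vim, the same variables can be appended non-interactively with a heredoc. This is a sketch, not from the tutorial; it writes to a temp file for illustration, whereas the tutorial targets /etc/profile.

```shell
# Sketch: append the JDK variables to a profile file non-interactively.
# A temp file stands in for /etc/profile here so the snippet is safe to run.
profile=$(mktemp)
cat >> "$profile" <<'EOF'
export JAVA_HOME=/opt/jdk1.8.0_60
export JRE_HOME=/opt/jdk1.8.0_60/jre
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin
EOF
exports=$(grep -c '^export' "$profile")
echo "$exports"
```

Replace "$profile" with /etc/profile to apply it for real, then run source /etc/profile as above.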
3. Install required libraries
yum -y install svn ncurses-devel gcc*
yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel
4. Install protobuf-2.5.0.tar.gz (the version must be exactly 2.5.0)
Download protobuf-2.5.0.tar.gz and extract it:
tar zxvf protobuf-2.5.0.tar.gz
Enter the protobuf-2.5.0 directory and run, in order:
cd protobuf-2.5.0
./configure
make
make install
Verify the installation:
[root@hadoop1 protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
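Because the Hadoop build fails later (at hadoop-common) if protoc is not exactly 2.5.0, a pre-build guard can make the mismatch obvious up front. This is a minimal sketch; the helper function name is an invention for illustration.

```shell
# Sketch of a pre-build guard: Hadoop 2.7.1 pins protobuf to exactly 2.5.0,
# so compare the detected version string against the pinned one.
check_protoc_version() {
    required="2.5.0"
    actual="$1"    # in practice: protoc --version | awk '{print $2}'
    if [ "$actual" = "$required" ]; then
        echo "ok"
    else
        echo "mismatch: need $required, found ${actual:-none}"
    fi
}
check_protoc_version "2.5.0"
```

In practice feed it `$(protoc --version | awk '{print $2}')`; a newer protoc (e.g. 2.6.x) is not accepted by the 2.7.1 build.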
5. Install Maven
Download apache-maven-3.2.2-bin.tar.gz, extract it, and move it to /opt:
tar -zxvf apache-maven-3.2.2-bin.tar.gz
mv apache-maven-3.2.2 /opt
Configure the environment variables:
vi /etc/profile
Append at the end of the file:
export MAVEN_HOME=/opt/apache-maven-3.2.2
export MAVEN_OPTS="-Xms256m -Xmx512m"
export PATH=$PATH:$MAVEN_HOME/bin
Apply /etc/profile:
source /etc/profile
Check the installation:
[root@hadoop1 ~]# mvn -version
Apache Maven 3.2.2 (45f7c06d68e745d05611f7fd14efb6594181933e; 2014-06-17T21:51:42+08:00)
Maven home: /opt/apache-maven-3.2.2
Java version: 1.8.0_60, vendor: Oracle Corporation
Java home: /opt/jdk1.8.0_60/jre
Default locale: zh_CN, platform encoding: GB18030
OS name: "linux", version: "3.10.0-229.14.1.el7.x86_64", arch: "amd64", family: "unix"
6. Install Ant
Download apache-ant-1.9.4-bin.tar.gz and extract it:
tar -zxvf apache-ant-1.9.4-bin.tar.gz
Move it to /opt:
mv apache-ant-1.9.4 /opt
Configure the environment variables:
vi /etc/profile
Append at the end of the file:
export ANT_HOME=/opt/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin
Apply the changes:
source /etc/profile
Check the installation:
[root@hadoop1 ~]# ant -version
Apache Ant(TM) version 1.9.4 compiled on April 29 2014
7. Install FindBugs
Download findbugs-3.0.1.tar.gz and extract it:
tar -zxvf findbugs-3.0.1.tar.gz
Move it to /opt:
mv findbugs-3.0.1 /opt
Configure the environment variables, appending at the end of the file:
[root@hadoop1 ~]# vi /etc/profile
export FINDBUGS_HOME=/opt/findbugs-3.0.1
export PATH=$PATH:$FINDBUGS_HOME/bin
Apply the changes:
[root@hadoop1 ~]# source /etc/profile
Check the installation:
[root@hadoop1 ~]# findbugs -version
3.0.1
8. Compile hadoop-2.7.1-src
[root@hadoop1 ~]# cd hadoop-2.7.1-src
[root@hadoop1 hadoop-2.7.1-src]# mvn clean package -Pdist,native -DskipTests -Dtar
Here -Pdist,native builds the binary distribution including the native libraries, -DskipTests skips the unit tests, and -Dtar packages the result as a tarball.
Build output:
[INFO] Apache Hadoop Main ................................. SUCCESS [04:06 min]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [01:39 min]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 56.391 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.246 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 34.518 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [01:03 min]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [08:13 min]
[INFO] Apache Hadoop Auth ................................. SUCCESS [04:18 min]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 31.298 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [05:39 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 10.503 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [01:08 min]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.036 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [04:41 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [01:40 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [02:23 min]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  6.159 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.028 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.092 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [ 46.580 s]
[INFO] hadoop-yarn-common ................................. SUCCESS [03:15 min]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.082 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 15.964 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 22.764 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  3.983 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  8.750 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 25.165 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  5.843 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  7.708 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [  4.324 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.084 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  3.220 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  2.216 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.040 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [  6.727 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  6.327 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.033 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 29.054 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 20.872 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  5.443 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 11.221 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  6.744 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 37.598 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  2.118 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  6.785 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  2.837 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 14.136 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 27.078 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  2.565 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  6.942 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  5.631 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  3.247 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.675 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  3.461 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 10.270 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  5.841 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [02:23 min]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 23.729 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  7.584 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.093 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  6.164 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 11.370 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.020 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:41 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 53:39 min
[INFO] Finished at: 2015-10-26T00:48:40+08:00
[INFO] Final Memory: 140M/494M
The build finished successfully.
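With the -Pdist and -Dtar options used above, the built distribution lands under hadoop-dist/target/ inside the source tree. The snippet below only constructs the expected path for illustration; it does not check that the file exists.

```shell
# After `mvn clean package -Pdist,native -DskipTests -Dtar`, the binary
# distribution tarball is written under hadoop-dist/target/.
version="2.7.1"
tarball="hadoop-dist/target/hadoop-${version}.tar.gz"
echo "$tarball"
```

An unpacked copy of the same distribution sits next to it in hadoop-dist/target/hadoop-2.7.1/.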
9. Configure passwordless SSH login
CentOS does not enable passwordless SSH login by default. Remove the leading # from these two lines in /etc/ssh/sshd_config, then restart sshd (systemctl restart sshd):
vi /etc/ssh/sshd_config
#RSAAuthentication yes
#PubkeyAuthentication yes
Run ssh-keygen -t rsa to generate a key pair. Press Enter at every prompt and leave the passphrase empty; this creates the .ssh directory under /root. Do this on every server.
ssh-keygen -t rsa
Merge the public key into the authorized_keys file. On the Master server, enter /root/.ssh and run:
cd /root/.ssh
cat id_rsa.pub >> authorized_keys
Test the result:
[root@hadoop1 ~]# ssh localhost
Last login: Mon Oct 26 01:12:35 2015 from localhost
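If ssh localhost still prompts for a password, a common cause is file permissions: sshd ignores authorized_keys when ~/.ssh or the key file is writable by others. A sketch of the expected modes, demonstrated on a temporary directory rather than the real /root/.ssh:

```shell
# Sketch: sshd requires ~/.ssh to be mode 700 and authorized_keys mode 600;
# looser permissions make it fall back to password authentication.
sshdir=$(mktemp -d)
touch "$sshdir/authorized_keys"
chmod 700 "$sshdir"
chmod 600 "$sshdir/authorized_keys"
stat -c '%a' "$sshdir" "$sshdir/authorized_keys"
```

Apply the same chmod values to /root/.ssh and /root/.ssh/authorized_keys on each server.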
10. Install hadoop-2.7.1