Hadoop 3 single-node installation and configuration on CentOS

Reference: www.cnblogs.com/forbeat/p/8179877.html


Change the hostname:
    1. Add "192.168.56.102 hadoopmaster" to /etc/hosts
    2. In /etc/sysconfig/network, set HOSTNAME=hadoopmaster

Reboot CentOS: reboot
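If your system is CentOS 7, the same change can also be made from the shell with hostnamectl; a minimal sketch (run as root; on CentOS 6 the /etc/sysconfig/network edit above applies instead):

    echo "192.168.56.102 hadoopmaster" >> /etc/hosts    # same entry as step 1
    hostnamectl set-hostname hadoopmaster               # persists the hostname on CentOS 7
    hostname                                            # should print hadoopmaster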

Create the user: useradd hadoop
Set its password: passwd hadoop, then follow the prompts

Create a directory: mkdir /home/hadoop/program
Fix ownership: chown -R hadoop:hadoop /home/hadoop

Set up passwordless SSH (as the hadoop user):
ssh-keygen -t rsa
cd ~/.ssh
cp id_rsa.pub authorized_keys
Test the passwordless login: ssh localhost; success means no password prompt.
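If ssh localhost still asks for a password, the usual cause is overly open permissions on the key files; a quick sketch of the commonly required settings (run as the hadoop user):

    chmod 700 ~/.ssh
    chmod 600 ~/.ssh/authorized_keys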



1. JDK installation and configuration:

Download: wget http://download.oracle.com/otn-pub/java/jdk/8u151-b12/e758a0de34e24606bca991d704f6dcbf/jdk-8u151-linux-x64.tar.gz

Extract into /home/hadoop/program: tar -zxvf jdk-8u151-linux-x64.tar.gz

Configure environment variables (as root): append the following to the end of /etc/profile:
# set JAVA_HOME
export JAVA_HOME=/home/hadoop/program/jdk1.8.0_151   ## change this to the directory you extracted the JDK into
export JRE_HOME=${JAVA_HOME}/jre  
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib  
export PATH=${JAVA_HOME}/bin:$PATH
Apply the changes: source /etc/profile
Verify: java -version

2. Hadoop installation and configuration:

Download: wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-3.0.0/hadoop-3.0.0.tar.gz
Extract into /home/hadoop/program: tar -zxvf hadoop-3.0.0.tar.gz
Configure environment variables: add the following to /etc/profile:
# set HADOOP_HOME
export HADOOP_HOME=/home/hadoop/program/hadoop-3.0.0
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib"
Apply the changes: source /etc/profile
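A quick sanity check that the variables took effect (hadoop version should report the 3.0.0 release):

    echo $HADOOP_HOME
    hadoop version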
Configure the Hadoop-related files:
Go into /home/hadoop/program/hadoop-3.0.0/etc/hadoop (the configuration files live here):

In hadoop-env.sh, set JAVA_HOME:

    export JAVA_HOME=/home/hadoop/program/jdk1.8.0_151   ## change this to the directory you extracted the JDK into

In core-site.xml, configure the HDFS address and port, and the temporary file directory:

    <configuration>
        <property>
            <name>fs.default.name</name>
            <value>hdfs://hadoopmaster:9000</value>
        </property>
        <property>
            <name>hadoop.tmp.dir</name>
            <value>/home/hadoop/hadoop/tmp</value>
        </property>
    </configuration>
Edit hdfs-site.xml to set the replication factor and the NameNode/DataNode storage paths:

    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
        <property>
            <name>dfs.namenode.name.dir</name>
            <value>/home/hadoop/hadoop/hdfs/name</value>
        </property>
        <property>
            <name>dfs.datanode.data.dir</name>
            <value>/home/hadoop/hadoop/hdfs/data</value>
        </property>
    </configuration>
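The tmp, name, and data directories referenced above can be created ahead of time to avoid permission surprises; a small sketch (run as the hadoop user, paths matching the values configured above):

    mkdir -p /home/hadoop/hadoop/tmp
    mkdir -p /home/hadoop/hadoop/hdfs/name
    mkdir -p /home/hadoop/hadoop/hdfs/data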
    
Edit mapred-site.xml so that MapReduce jobs run on the YARN framework. Compared with earlier Hadoop versions, the second property below is an extra step; if mapreduce.application.classpath is not set, MapReduce jobs fail at runtime with:
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster

    <configuration>
        <property>
            <name>mapreduce.framework.name</name>
            <value>yarn</value>
        </property>
        <property>
            <name>mapreduce.application.classpath</name>
            <value>
                /home/hadoop/program/hadoop-3.0.0/etc/hadoop,
                /home/hadoop/program/hadoop-3.0.0/share/hadoop/common/*,
                /home/hadoop/program/hadoop-3.0.0/share/hadoop/common/lib/*,
                /home/hadoop/program/hadoop-3.0.0/share/hadoop/hdfs/*,
                /home/hadoop/program/hadoop-3.0.0/share/hadoop/hdfs/lib/*,
                /home/hadoop/program/hadoop-3.0.0/share/hadoop/mapreduce/*,
                /home/hadoop/program/hadoop-3.0.0/share/hadoop/mapreduce/lib/*,
                /home/hadoop/program/hadoop-3.0.0/share/hadoop/yarn/*,
                /home/hadoop/program/hadoop-3.0.0/share/hadoop/yarn/lib/*
            </value>
        </property>
    </configuration>

(The paths above use the install directory from this guide, /home/hadoop/program/hadoop-3.0.0; adjust them if yours differs.)
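Instead of hand-maintaining that list, you can generate it from the local installation; a sketch using the hadoop classpath command (its output reflects your own install path and can be pasted into the value above):

    hadoop classpath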
        
    

Edit yarn-site.xml:

    <configuration>
        <property>
            <name>yarn.resourcemanager.hostname</name>
            <value>hadoopmaster</value>
        </property>
        <property>
            <name>yarn.nodemanager.aux-services</name>
            <value>mapreduce_shuffle</value>
        </property>
    </configuration>

Add the hostname to the workers file: hadoopmaster
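A one-liner that writes the workers file (assuming the install path used in this guide):

    echo hadoopmaster > /home/hadoop/program/hadoop-3.0.0/etc/hadoop/workers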


Verify Hadoop:
1. Format the NameNode: hadoop namenode -format
2. Start Hadoop: start-all.sh
3. Run jps; it should list NameNode, DataNode, SecondaryNameNode, NodeManager, and ResourceManager
4. If any process is missing, check the corresponding log under /home/hadoop/program/hadoop-3.0.0/logs
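As a final smoke test, you can run the bundled WordCount example against the Hadoop config files; a sketch (the examples jar name matches the 3.0.0 release, adjust if your version differs):

    cd /home/hadoop/program/hadoop-3.0.0
    hdfs dfs -mkdir -p /user/hadoop/input
    hdfs dfs -put etc/hadoop/*.xml /user/hadoop/input
    hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.0.0.jar wordcount /user/hadoop/input /user/hadoop/output
    hdfs dfs -cat /user/hadoop/output/part-r-00000

In Hadoop 3 the NameNode web UI listens on port 9870 (http://hadoopmaster:9870) and the ResourceManager on port 8088 (http://hadoopmaster:8088), which is another quick way to confirm the daemons are up.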
