Setting up a Hadoop platform on CentOS

1. Configure a static IP address and gateway
Set the IP address: only needed if the virtual machine cannot reach the network; otherwise this step can be skipped.

vi /etc/sysconfig/network-scripts/ifcfg-eth0
Edit the following entries:
DEVICE=eth0
HWADDR=08:00:27:BD:9D:B5  # leave as-is
TYPE=Ethernet
UUID=53e4e4b6-9724-43ab-9da7-68792e611031 # leave as-is
ONBOOT=yes  # bring the interface up at boot
NM_CONTROLLED=yes

BOOTPROTO=static  # static IP
IPADDR=<host IP>  # IP address
NETMASK=255.255.255.0 # subnet mask

Configure the gateway:

vi /etc/sysconfig/network
Edit the following entries:
NETWORKING=yes
HOSTNAME=Hadoop.Master
GATEWAY=<VM gateway> # gateway address

Configure DNS

vi /etc/resolv.conf
Edit the following entries:
nameserver xxx.xxx.xxx.xxx # set according to your environment
nameserver 114.114.114.114 # multiple nameservers may be listed

Restart the network service

service network restart

Test the network connection

ping www.baidu.com 

If the ping fails, the cause is usually DNS.
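
To distinguish a DNS problem from a routing problem, a quick check (using the 114.114.114.114 nameserver from the example above) is:

ping 114.114.114.114   # succeeds: the gateway and route are fine, so name resolution is the culprit
ping www.baidu.com     # still fails: recheck the nameserver entries in /etc/resolv.conf
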
Map the hostname to the IP address

vi /etc/hosts

# Append the following line

<host IP> Hadoop.Master
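
To confirm the mapping works, ping the hostname; it should resolve to the address you just added:

ping -c 3 Hadoop.Master

Note that the HOSTNAME set in /etc/sysconfig/network only takes effect after a reboot (or after running hostname Hadoop.Master manually).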

2. Add a Hadoop user
Add the user group

groupadd hadoop

Add the user and assign it to the group

useradd -g hadoop hadoop

Set the user's password

passwd hadoop
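
To verify the account and its group assignment (the numeric IDs below are illustrative):

id hadoop
# uid=500(hadoop) gid=500(hadoop) groups=500(hadoop)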

3. Disable services
Disable the firewall

# CentOS 7 (firewalld):
systemctl stop firewalld             # stop the firewall service
systemctl disable firewalld.service  # disable the firewall at boot
# CentOS 6 (ip6tables):
service ip6tables stop
chkconfig ip6tables off

Disable SELinux

vi /etc/sysconfig/selinux

# Change the following line

SELINUX=enforcing -> SELINUX=disabled

# Then run

setenforce 0
getenforce
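
setenforce 0 switches SELinux to permissive mode for the current session only; the SELINUX=disabled setting in the file takes effect after a reboot. getenforce should therefore now print:

Permissive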

4. Passwordless SSH configuration
Check whether openssh and rsync are installed

rpm -qa|grep openssh
rpm -qa|grep rsync

Install openssh and rsync if they are missing

yum -y install openssh-server openssh-clients
yum -y install rsync

Switch to the hadoop user

su - hadoop

Generate an ssh key pair

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

Append id_rsa.pub to the authorized keys file

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

Set permissions on the authorized keys file

chmod 600 ~/.ssh/authorized_keys

# The permissions matter: if the file is too open, sshd will refuse to use the key and passwordless login will not work
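
The ~/.ssh directory itself must also be private, otherwise sshd may still reject key authentication. If in doubt, set it explicitly:

chmod 700 ~/.ssh
ls -ld ~/.ssh   # expect drwx------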

Test the ssh connection

ssh localhost

# If you can log in without a password prompt, the setup succeeded

5. Install Java
Switch to the root user

su -

Create the /usr/java directory

mkdir /usr/java

Upload the archive to the server with an FTP tool, placing it in the /home/hadoop directory
Note: I use FlashFXP here, connecting as the hadoop user

Extract the archive into /usr/java

tar zxvf /home/hadoop/jdk-7u80-linux-i586.tar.gz -C /usr/java/

Set the environment variables

vi /etc/profile

# Append the following

export JAVA_HOME=/usr/java/jdk1.7.0_80
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin

Make the environment variables take effect

source /etc/profile

Verify the Java installation

java -version
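
For the JDK 7u80 archive used above, the output should look roughly like this (build strings may vary):

java version "1.7.0_80"
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)
Java HotSpot(TM) Server VM (build 24.80-b11, mixed mode)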

6. Install and configure Hadoop
Install Hadoop
Upload the archive to the server with an FTP tool, placing it in the /home/hadoop directory

Extract the archive into /usr

tar zxvf /home/hadoop/hadoop-2.7.1.tar.gz -C /usr/

Rename the directory

mv /usr/hadoop-2.7.1/ /usr/hadoop

Create the Hadoop data directory

mkdir /usr/hadoop/tmp

Grant ownership of the Hadoop directory to the hadoop user

chown -R hadoop:hadoop /usr/hadoop/

Set the environment variables

vi /etc/profile

# Append the following

export HADOOP_HOME=/usr/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

Make the environment variables take effect

source /etc/profile

Verify the Hadoop installation

hadoop version
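
The command should print the version banner; abbreviated example (the checksum and jar-path lines will vary with your install location):

Hadoop 2.7.1
...
This command was run using /usr/hadoop/share/hadoop/common/hadoop-common-2.7.1.jar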

Configure HDFS
Switch to the hadoop user

su - hadoop

Edit hadoop-env.sh

cd /usr/hadoop/etc/hadoop/
vi hadoop-env.sh 

# Append the following

export JAVA_HOME=/usr/java/jdk1.7.0_80

Edit core-site.xml

vi core-site.xml

# Add the following

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://Hadoop.Master:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/usr/hadoop/tmp/</value>
        <description>A base for other temporary directories.</description>
    </property>
</configuration>
Edit hdfs-site.xml

vi hdfs-site.xml

# Add the following

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
Format HDFS

hdfs namenode -format

Note: the format succeeded if the output contains "Exiting with status 0"
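
For reference, a successful format ends with lines similar to these (timestamps omitted; illustrative, not verbatim):

INFO common.Storage: Storage directory /usr/hadoop/tmp/dfs/name has been successfully formatted.
INFO util.ExitUtil: Exiting with status 0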

Start HDFS

start-dfs.sh

# To stop it later: stop-dfs.sh
Note: the output will look similar to the following

15/09/21 18:09:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Hadoop.Master]
Hadoop.Master: starting namenode, logging to /usr/hadoop/logs/hadoop-hadoop-namenode-Hadoop.Master.out
Hadoop.Master: starting datanode, logging to /usr/hadoop/logs/hadoop-hadoop-datanode-Hadoop.Master.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
RSA key fingerprint is b5:96:b2:68:e6:63:1a:3c:7d:08:67:4b:ae:80:e2:e3.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (RSA) to the list of known hosts.
0.0.0.0: starting secondarynamenode, logging to /usr/hadoop/logs/hadoop-hadoop-secondarynamenode-Hadoop.Master.out
15/09/21 18:09:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Check the running Java processes

jps
Note: the output should list NameNode, DataNode, and SecondaryNameNode (plus Jps itself)
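
Example output (the PIDs are illustrative):

3088 NameNode
3195 DataNode
3358 SecondaryNameNode
3478 Jps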

View Hadoop's status in a web browser

http://<your server IP>:50070/
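
As a final smoke test, HDFS should accept basic filesystem commands from the hadoop user; the directory name here is just an example:

hdfs dfs -mkdir -p /user/hadoop   # create a home directory in HDFS
hdfs dfs -ls /                    # should list /user
hdfs dfsadmin -report             # should report one live datanode for this single-node setup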
