First of all: if you are on Hadoop 2.7.0+, your JDK must be 1.7 or later.
My setup:
jdk1.7.0_80
hadoop-2.6.0
Two Ubuntu machines (virtual machines installed on two Win7 hosts). Note: the username on both is shamrock, with the same password set on each.
172.16.134.13 ---- master
172.16.134.12 ---- slave
Install ssh
On each of the two hosts, open a shell and run the following to download and install SSH (network access required):
sudo apt-get install ssh
Once it is installed, log in to each host from Win7 with PuTTY; this creates the .ssh folder under the shamrock user (you can also create it yourself). Check that the folder exists:
ls -a ~    (or: ls -a /home/shamrock)
Set up passwordless login
1. cd ~/.ssh/
2. ssh-keygen -t rsa (press Enter at every prompt)
3. ssh-add id_rsa — if the system prints "Identity added: id_rsa (id_rsa)", the key was loaded successfully; if it prints "Could not open a connection to your authentication agent", run ssh-agent bash first and retry
4. On slave (172.16.134.12), copy the public key over to master (172.16.134.13):
   scp id_rsa.pub [email protected]:~/.ssh/id_rsa.pub_slave
5. On master (172.16.134.13), append master's own public key id_rsa.pub and the copied id_rsa.pub_slave to authorized_keys:
   cat id_rsa.pub >> ~/.ssh/authorized_keys
   cat id_rsa.pub_slave >> ~/.ssh/authorized_keys
6. Still on master (172.16.134.13), copy authorized_keys back to slave (172.16.134.12):
   scp authorized_keys [email protected]:~/.ssh/authorized_keys
7. On each host, run ssh localhost, then ssh to the other host's IP:
   ssh 172.16.134.13
   ssh 172.16.134.12
   If you get in without being asked for a password and see output similar to the following, the configuration succeeded!
The authenticity of host '172.16.134.12 (172.16.134.12)' can't be established.
ECDSA key fingerprint is ce:37:e0:78:9f:68:02:08:3a:05:6d:64:45:93:67:7a.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '172.16.134.12' (ECDSA) to the list of known hosts.
Welcome to Ubuntu 14.04.2 LTS (GNU/Linux 3.16.0-30-generic i686)
 * Documentation: https://help.ubuntu.com/
343 packages can be updated.
186 updates are security updates.
Last login: Tue Jul 21 10:36:33 2015 from 172.16.134.14
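Steps 2 and 5 above (generate a key pair, append the public key to authorized_keys) can be tried out safely against a throwaway directory instead of the real ~/.ssh; the path below is illustrative only:

```shell
# Generate a passphrase-less RSA key pair in a temporary directory,
# then append the public key to authorized_keys, as in steps 2 and 5.
KEYDIR=$(mktemp -d)
ssh-keygen -t rsa -N "" -f "$KEYDIR/id_rsa" >/dev/null
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
# sshd ignores authorized_keys if it is group/world-writable:
chmod 600 "$KEYDIR/authorized_keys"
```

On the real hosts the same applies: if passwordless login still asks for a password, check that ~/.ssh is 700 and authorized_keys is 600.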
I downloaded the packages jdk-7u80-linux-i586.tar.gz and hadoop-2.6.0.tar.gz on Win7 first, then uploaded them to both Ubuntu systems with the pscp command:
pscp E:\hadoop\jdk-7u80-linux-i586.tar.gz [email protected]:/home/shamrock
pscp E:\hadoop\jdk-7u80-linux-i586.tar.gz [email protected]:/home/shamrock
pscp E:\hadoop\hadoop-2.6.0.tar.gz [email protected]:/home/shamrock
pscp E:\hadoop\hadoop-2.6.0.tar.gz [email protected]:/home/shamrock
Install jdk1.7.0_80
Unpack the JDK archive:
tar -xvf /home/shamrock/jdk-7u80-linux-i586.tar.gz
This leaves a new folder, jdk1.7.0_80, in the current user's home directory.
Configure the JDK environment
vi /etc/profile (if the file opens as readonly, close it and reopen it with root privileges: sudo vi /etc/profile)
Append the following at the end:
#set Java Environment
export JAVA_HOME=/home/shamrock/jdk1.7.0_80
export CLASSPATH=".:$JAVA_HOME/lib:$CLASSPATH"
export PATH="$JAVA_HOME/bin:$PATH"
Then run source /etc/profile to make the changes take effect.
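One pitfall in the PATH line: the java binary lives in $JAVA_HOME/bin, so it is $JAVA_HOME/bin, not $JAVA_HOME itself, that must go on PATH. A toy demonstration using a fake, throwaway "JDK" directory (nothing here touches a real install):

```shell
# Create a fake JDK whose bin/ holds a stub java, then prepend bin/ to PATH.
JAVA_HOME=$(mktemp -d)
mkdir -p "$JAVA_HOME/bin"
printf '#!/bin/sh\necho ok\n' > "$JAVA_HOME/bin/java"
chmod +x "$JAVA_HOME/bin/java"
# Prepending only $JAVA_HOME (without /bin) would leave java unfindable:
PATH="$JAVA_HOME/bin:$PATH"
java   # prints: ok
```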
Then run these commands:
sudo update-alternatives --install /usr/bin/java java /home/shamrock/jdk1.7.0_80/bin/java 300
sudo update-alternatives --install /usr/bin/javac javac /home/shamrock/jdk1.7.0_80/bin/javac 300
sudo update-alternatives --config java
After that, java -version should print:
java version "1.7.0_80"
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)
Java HotSpot(TM) Server VM (build 24.80-b11, mixed mode)
Unpack the Hadoop archive:
tar -xvf /home/shamrock/hadoop-2.6.0.tar.gz
After extraction, the folder hadoop-2.6.0 appears in the home directory /home/shamrock.
Configure the Hadoop environment
vi /etc/profile
Append at the end:
#set hadoop Environment
export HADOOP_HOME=/home/shamrock/hadoop-2.6.0
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
Then run source /etc/profile again to make it take effect.
Configure the Hadoop cluster
Edit /etc/hosts with vi (again, if it opens as readonly, reopen it with root privileges as described earlier)
Add at the bottom:
172.16.134.13 master
172.16.134.12 slave
Edit /etc/hostname with vi
The file should contain just the machine's hostname:
master
(on the slave machine, 172.16.134.12, the file should read slave instead; the new name takes effect after a reboot)
Go into hadoop-2.6.0/etc/hadoop/
Configure hadoop-env.sh
Add at the bottom:
export JAVA_HOME=/home/shamrock/jdk1.7.0_80
Configure core-site.xml
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://master:9000</value>
    </property>
    <property>
        <name>hadoop.native.lib</name>
        <value>true</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/tmp</value>
    </property>
</configuration>
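Note: in Hadoop 2.x the property name fs.default.name is deprecated in favor of fs.defaultFS; the old name still works but logs a deprecation warning. The equivalent modern form of that property (same master:9000 address) would be:

```xml
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
</property>
```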
Configure hdfs-site.xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>2</value>
    </property>
</configuration>
Configure mapred-site.xml
<configuration>
    <property>
        <name>mapred.job.tracker</name>
        <value>master:9001</value>
    </property>
</configuration>
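Note: mapred.job.tracker is a Hadoop 1.x (JobTracker-era) property. On Hadoop 2.x, MapReduce jobs normally run on YARN, which is the component behind the ResourceManager UI at http://master:8088. If submitted jobs do not run with the configuration above, the commonly used 2.x settings are the following sketch (verify the property names against your distribution's defaults) — in mapred-site.xml:

```xml
<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>
```

and in yarn-site.xml:

```xml
<property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
</property>
```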
Configure the slaves file, which lists the hostnames of the slave nodes, one per line:
slave
Run on master
Go into hadoop-2.6.0 and execute:
bin/hadoop namenode -format
sbin/start-all.sh
If the commands above pause for confirmation, press Enter.
Then open http://master:8088 and http://master:50070 in a browser; if both pages load normally, you can watch the cluster running there.
That's everything. It took me a day or two of fiddling, and there may still be some problems in the text above; if an expert spots any, please point them out. Thanks!!!!