First, extract the downloaded Hadoop 0.20 tarball into /home/admin:
tar xzf hadoop-0.20.2.tar.gz
Configure the Hadoop environment variables:
export HADOOP_INSTALL=/home/admin/hadoop-0.20.2
export PATH=$PATH:$HADOOP_INSTALL/bin
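These exports only last for the current shell session. To make them persist across logins, they can also be appended to ~/.bashrc (a minimal sketch; adjust the path if you unpacked Hadoop somewhere other than /home/admin):

```shell
# Persist the Hadoop environment variables for future shell sessions.
echo 'export HADOOP_INSTALL=/home/admin/hadoop-0.20.2' >> ~/.bashrc
echo 'export PATH=$PATH:$HADOOP_INSTALL/bin' >> ~/.bashrc
```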
Test whether the installation succeeded:
hadoop version
Set up passwordless SSH login for the current user:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
Verify that you are no longer prompted for a password:
ssh localhost
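If ssh localhost still asks for a password, file permissions are a common culprit: sshd ignores authorized_keys when ~/.ssh or the key files are accessible to group or others. A quick fix:

```shell
# sshd refuses group/world-accessible key files, so restrict
# the SSH directory and authorized_keys to the owner only.
mkdir -p ~/.ssh
touch ~/.ssh/authorized_keys
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
```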
Next, configure Hadoop for pseudo-distributed operation by editing the three files below.

/home/admin/hadoop-0.20.2/conf/core-site.xml
===============================================================================
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost</value>
  </property>
</configuration>
/home/admin/hadoop-0.20.2/conf/hdfs-site.xml
===============================================================================
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
/home/admin/hadoop-0.20.2/conf/mapred-site.xml
===============================================================================
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
</configuration>
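A note on the core-site.xml value: hdfs://localhost omits the port, so the namenode listens on HDFS's default RPC port, 8020. Writing it out explicitly is equivalent:

```xml
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:8020</value>
</property>
```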
Format a new HDFS filesystem, then start the HDFS and MapReduce daemons:
hadoop namenode -format
start-dfs.sh
start-mapred.sh
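A quick way to confirm that the five daemons actually came up is to look for their JVMs in the process list (a sketch using pgrep; the JDK's jps tool gives the same information). On a machine where the daemons are not running, every line prints "NOT running":

```shell
# Each Hadoop daemon runs in its own JVM whose command line contains
# the daemon's class name, so pgrep -f can find it. Note that the
# NameNode pattern also matches a SecondaryNameNode process.
for daemon in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
  if pgrep -f "$daemon" > /dev/null; then
    echo "$daemon: running"
  else
    echo "$daemon: NOT running"
  fi
done
```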
When running the namenode startup script $HADOOP_HOME/bin/start-dfs.sh, the datanode reported an error:
Error: JAVA_HOME is not set
The cause is that JAVA_HOME is not defined in $HADOOP_HOME/conf/hadoop-env.sh; fix it by adding the following line to hadoop-env.sh:
export JAVA_HOME=/export/servers/jdk1.6.0_25/
Finally, exercise HDFS with a few filesystem commands:
hadoop fs -mkdir books
hadoop fs -ls .
hadoop fs -copyFromLocal NOTICE.txt hdfs://localhost/user/root/books/NOTICE.txt
Reference: "Hadoop: The Definitive Guide"