1. Set up the Java 1.6 environment
Edit /etc/apt/sources.list and append the following line at the end: deb http://archive.canonical.com/ lucid partner
sudo apt-get update ; refresh the package index
sudo apt-get install sun-java6-jdk ; install Java 1.6 and replace the existing Java environment
sudo update-java-alternatives -s java-6-sun ; this command reported an error, but it did not affect the configuration that follows
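To confirm which Java the system is actually using after the switch (a quick check, not part of the original notes):
java -version ; should report a 1.6.x Sun JVM if the switch succeeded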
2. Configure SSH and set up an RSA key for passwordless login
su - hadoop ; switch to the hadoop user
ssh-keygen -t rsa -P ""
cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
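A quick check that the key-based login works (assuming sshd is installed and running locally):
ssh localhost ; should log in without prompting for a password
exit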
3. Download and configure Hadoop
Download hadoop 0.20 from http://labs.renren.com/apache-mirror/hadoop/core/
Extract it as a directory named hadoop and place it under /home/panbook/workspace/
sudo chown -R panbook:panbook hadoop ; set the owner and group of the hadoop directory (user panbook, group panbook)
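The download and extraction can also be done from the command line; a minimal sketch, assuming a 0.20.2 tarball (the exact file name is hypothetical, check the mirror listing):
cd /home/panbook/workspace
wget http://labs.renren.com/apache-mirror/hadoop/core/hadoop-0.20.2/hadoop-0.20.2.tar.gz ; assumed file name
tar -xzf hadoop-0.20.2.tar.gz
mv hadoop-0.20.2 hadoop
sudo chown -R panbook:panbook hadoop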
In hadoop/conf/hadoop-env.sh, change
export JAVA_HOME=/usr/lib/j2sdk1.5-sun
to point to the local Java directory:
export JAVA_HOME=/usr/lib/jvm/java-6-sun
In hadoop/conf/core-site.xml, add (inside the <configuration> element):
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/panbook/workspace/hadoop/hadoop-datastore/hadoop-hadoop</value>
  <description>/home/panbook/workspace/hadoop/hadoop-datastore/ is the temporary directory I created for Hadoop; hadoop-hadoop is "hadoop-" followed by the name of the user running Hadoop</description>
</property>
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:54310</value>
  <description>default file system</description>
</property>
In hadoop/conf/mapred-site.xml, add (inside the <configuration> element):
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:54311</value>
  <description>host and port of the JobTracker</description>
</property>
In hadoop/conf/hdfs-site.xml, add (inside the <configuration> element):
<property>
  <name>dfs.replication</name>
  <value>1</value>
  <description>number of block replicas; 1 is sufficient for a single-node setup</description>
</property>
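4. Format HDFS and start Hadoop
Before testing, HDFS has to be formatted once and the daemons started. A minimal sketch, assuming the standard Hadoop 0.20 bin/ scripts, the hadoop user, and the directory layout used above:
cd /home/panbook/workspace/hadoop
bin/hadoop namenode -format ; format the HDFS filesystem (first run only)
bin/start-all.sh ; start the NameNode, DataNode, SecondaryNameNode, JobTracker, and TaskTracker
jps ; list the running Java processes to confirm the daemons are up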
5. Testing
Refer to File:Hadoop-eclipse.pdf for instructions on testing the Hadoop installation.