Installing Hadoop on a Single Machine

Note that this installation uses the binary package, not the source package.

Download the binary package from http://hadoop.apache.org, e.g. hadoop-1.0.2-bin.tar.gz.

Extract it into a hadoop directory, e.g.: hadoop/hadoop-1.0.2
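For reference, the download and extraction can be done from the command line. A minimal sketch, assuming wget is available and /home/baobei/Hadoop is the target directory (the archive URL and the paths here are examples; substitute your own):

# download the 1.0.2 binary tarball (URL points at the Apache archive; any mirror works)
wget http://archive.apache.org/dist/hadoop/core/hadoop-1.0.2/hadoop-1.0.2-bin.tar.gz
# unpack into the target directory, producing hadoop-1.0.2/
mkdir -p /home/baobei/Hadoop
tar -xzf hadoop-1.0.2-bin.tar.gz -C /home/baobei/Hadoop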

Set the Java environment variable:

In conf/hadoop-env.sh, find:

# The java implementation to use. Required.
#export JAVA_HOME=/opt/jdk1.6.0_30

and change it to:

# The java implementation to use. Required.
export JAVA_HOME=/opt/jdk1.6.0_30

In other words, just remove the leading # and set the value after = to your own JDK installation path.
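To double-check that the path you set really points at a JDK, you can run the java binary under it directly. A quick sketch (the path is the example value from above; use whatever you set JAVA_HOME to):

# should print the Java version if JAVA_HOME points at a valid JDK
/opt/jdk1.6.0_30/bin/java -version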

At this point the installation is done. Let's verify it:

1. Go into the bin directory and run: ./hadoop

You should see:

Usage: hadoop [--config confdir] COMMAND
where COMMAND is one of:
namenode -format format the DFS filesystem
secondarynamenode run the DFS secondary namenode
namenode run the DFS namenode
datanode run a DFS datanode
dfsadmin run a DFS admin client
mradmin run a Map-Reduce admin client
fsck run a DFS filesystem checking utility
fs run a generic filesystem user client
balancer run a cluster balancing utility
fetchdt fetch a delegation token from the NameNode
jobtracker run the MapReduce job Tracker node
pipes run a Pipes job
tasktracker run a MapReduce task Tracker node
historyserver run job history servers as a standalone daemon
job manipulate MapReduce jobs
queue get information regarding JobQueues
version print the version
jar <jar> run a jar file
distcp <srcurl> <desturl> copy file or directories recursively
archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
classpath prints the class path needed to get the
Hadoop jar and the required libraries
daemonlog get/set the log level for each daemon
or
CLASSNAME run the class named CLASSNAME
Most commands print help when invoked w/o parameters.
If you see this, the installation succeeded.
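As an optional further check, you can run one of the example jobs shipped with the binary package in local (standalone) mode. A sketch, assuming you are inside the hadoop-1.0.2 directory and the bundled hadoop-examples jar is present there:

# prepare some input files, run the grep example, then look at the output
mkdir input
cp conf/*.xml input
bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
cat output/*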

2. If you want to use the hadoop command directly, without the bin/ prefix, a little more configuration is needed:

export HADOOP_INSTALL=/home/baobei/Hadoop/hadoop-1.0.2

export PATH=$PATH:$HADOOP_INSTALL/bin

Then run: hadoop version

If you see:

Hadoop 1.0.2
Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0.2 -r 1304954
Compiled by hortonfo on Sat Mar 24 23:58:21 UTC 2012
From source with checksum c198b04303cfa626a38e13154d2765a9

then the configuration succeeded.
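Note that the two export lines above only affect the current shell session. To make them permanent, you could append them to your shell startup file, for example (assuming bash; adjust HADOOP_INSTALL to your own extraction path):

# persist the environment variables for future sessions
echo 'export HADOOP_INSTALL=/home/baobei/Hadoop/hadoop-1.0.2' >> ~/.bashrc
echo 'export PATH=$PATH:$HADOOP_INSTALL/bin' >> ~/.bashrc
# reload the startup file in the current shell
source ~/.bashrc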

