I. Check the JDK version
1. java -version
k-MacBook-Pro:~ $ java -version
java version "1.8.0_60"
Java(TM) SE Runtime Environment (build 1.8.0_60-b27)
Java HotSpot(TM) 64-Bit Server VM (build 25.60-b23, mixed mode)
Set JAVA_HOME to the installed JDK:
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_60.jdk/Contents/Home
II. Download Hadoop: http://hadoop.apache.org/releases.html
1. After downloading Hadoop, extract it to any working directory and set HADOOP_HOME:
export HADOOP_HOME=/Users/k/hadoop-2.7.2
2. Go to the Hadoop configuration directory:
/Users/k/hadoop-2.7.2/etc/hadoop
3. vim hadoop-env.sh (configure the Hadoop environment)
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_60.jdk/Contents/Home
export HADOOP_HEAPSIZE=2000
export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
4. vim core-site.xml (configure the NameNode hostname and port)
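A minimal core-site.xml sketch for a single-node (pseudo-distributed) setup is shown below; the port 9000 and the hadoop.tmp.dir path are assumptions, so adjust them to your environment:

<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <!-- Assumed local directory for Hadoop temporary and metadata files -->
        <value>/Users/k/hadoop-2.7.2/tmp</value>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <!-- NameNode host and port; hdfs://localhost:9000 is a common single-node choice -->
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>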
5. vim hdfs-site.xml (configure HDFS defaults, such as the replication factor)
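On a single machine there is only one DataNode, so the replication factor is usually lowered to 1; a minimal hdfs-site.xml sketch:

<configuration>
    <property>
        <name>dfs.replication</name>
        <!-- Only one DataNode on this machine, so keep a single replica -->
        <value>1</value>
    </property>
</configuration>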
6. vim mapred-site.xml (configure the JobTracker hostname and port)
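Note that Hadoop 2.7.2 ships only mapred-site.xml.template, so copy it to mapred-site.xml first. In Hadoop 2.x there is no standalone JobTracker; a common minimal configuration (an assumption here, not shown in the original) simply hands MapReduce job scheduling over to YARN:

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <!-- Run MapReduce jobs on YARN instead of the Hadoop 1.x JobTracker -->
        <value>yarn</value>
    </property>
</configuration>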
7. vim yarn-site.xml
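A minimal yarn-site.xml for running MapReduce on YARN typically enables the shuffle auxiliary service; treat this as a sketch rather than a complete configuration:

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <!-- Required so NodeManagers can serve map output to reducers -->
        <value>mapreduce_shuffle</value>
    </property>
</configuration>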
8. Initialize HDFS by formatting the NameNode
k-MacBook-Pro:hadoop k$ hdfs namenode -format
16/07/03 01:35:54 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
9. Start Hadoop
k-MacBook-Pro:hadoop k$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
16/07/03 01:31:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: namenode running as process 1325. Stop it first.
localhost: starting datanode, logging to /Users/kirogi/hadoop-2.7.2/logs/hadoop-kirogi-datanode-kirogis-MacBook-Pro.local.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: secondarynamenode running as process 882. Stop it first.
16/07/03 01:31:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
starting yarn daemons
resourcemanager running as process 994. Stop it first.
localhost: nodemanager running as process 1077. Stop it first.
10. Verify Hadoop
k-MacBook-Pro:hadoop k$ jps
882 SecondaryNameNode
994 ResourceManager
1077 NodeManager
4312 Jps
1325 NameNode
Or:
Open http://localhost:50070 for the HDFS admin web UI.
Open http://localhost:8088 for the YARN ResourceManager UI, which shows cluster and application status.