1. Preparation (Scala & Spark)
1.1 Three CentOS 6 servers
master 192.168.3.140 (master.hadoop.zjportdns.gov.cn)
node1 192.168.3.141 (node1.hadoop.zjportdns.gov.cn)
node2 192.168.3.142 (node2.hadoop.zjportdns.gov.cn)
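The configuration below addresses the machines by hostname, so each server must be able to resolve all three names. If there is no DNS entry for them, an /etc/hosts mapping like the following (derived from the addresses above) on every node works:

```
192.168.3.140 master.hadoop.zjportdns.gov.cn
192.168.3.141 node1.hadoop.zjportdns.gov.cn
192.168.3.142 node2.hadoop.zjportdns.gov.cn
```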
1.2 Download the installation packages
scala-2.11.8.tgz
spark-2.1.0-bin-hadoop2.7.tgz
Upload both archives to the /usr/local/ directory on all three servers.
2. Installation (run the same steps on all three machines)
2.1 Extract Scala
cd /usr/local
tar -xvf scala-2.11.8.tgz
2.2 Set environment variables
echo 'export SCALA_HOME=/usr/local/scala-2.11.8' >> /etc/profile
echo 'export PATH=$PATH:$SCALA_HOME/bin' >> /etc/profile
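One quoting subtlety with these echo lines: inside double quotes the shell expands $PATH and $SCALA_HOME at the moment the echo runs, freezing the writing session's values into /etc/profile; single quotes write the variable references literally, so they are expanded fresh every time the profile is sourced. A minimal sketch, using a temporary file in place of /etc/profile:

```shell
# Write the exports into a temp file standing in for /etc/profile.
profile=$(mktemp)
echo 'export SCALA_HOME=/usr/local/scala-2.11.8' >> "$profile"
# Single quotes keep $PATH and $SCALA_HOME literal in the file, so they
# are resolved at login time rather than at write time.
echo 'export PATH=$PATH:$SCALA_HOME/bin' >> "$profile"
# Prints both export lines verbatim, dollar signs intact.
cat "$profile"
```

A double-quoted version would instead bake in whatever $PATH happened to be in the session that ran the echo.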
2.3 Extract Spark
tar -xvf spark-2.1.0-bin-hadoop2.7.tgz
2.4 Set environment variables
echo 'export SPARK_HOME=/usr/local/spark-2.1.0-bin-hadoop2.7' >> /etc/profile
echo 'export PATH=$PATH:$SPARK_HOME/bin' >> /etc/profile
2.5 Edit the Spark configuration
cd spark-2.1.0-bin-hadoop2.7/conf
cp slaves.template slaves
echo -e "node1.hadoop.zjportdns.gov.cn\nnode2.hadoop.zjportdns.gov.cn" > slaves
cp spark-env.sh.template spark-env.sh
echo 'export SCALA_HOME=/usr/local/scala-2.11.8' >> spark-env.sh
echo 'export JAVA_HOME=/usr/local/jdk1.7.0_79' >> spark-env.sh
source /etc/profile
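For reference, after these commands the two files should end up containing the lines below (the JAVA_HOME path assumes the JDK 1.7.0_79 install location used above; adjust it to wherever your JDK actually lives):

```
# conf/slaves
node1.hadoop.zjportdns.gov.cn
node2.hadoop.zjportdns.gov.cn

# conf/spark-env.sh (appended lines)
export SCALA_HOME=/usr/local/scala-2.11.8
export JAVA_HOME=/usr/local/jdk1.7.0_79
```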
3. Running
3.1 Start (on master only; start-all.sh launches the workers over SSH)
$SPARK_HOME/sbin/start-all.sh
3.2 Verify
http://master.hadoop.zjportdns.gov.cn:8080/
The Master web UI at this address should list node1 and node2 as ALIVE workers.

3.3 Stop
$SPARK_HOME/sbin/stop-all.sh