[Spark 1.5.1] Installation

I. Hadoop 2.x Installation

Hadoop 2.x installation guide: http://my.oschina.net/u/204498/blog/519789

II. Spark 1.5.1 Installation

1. Download Spark 1.5.1

http://spark.apache.org/downloads.html

Choose the Spark release to download (here: spark-1.5.1-bin-hadoop2.6, prebuilt for Hadoop 2.6).

[hadoop@hftclclw0001 ~]$ pwd
/home/hadoop

[hadoop@hftclclw0001 ~]$ wget 
 
[hadoop@hftclclw0001 ~]$ ll
total 480004
drwxr-xr-x 11 hadoop root      4096 Jan 17 04:54 hadoop-2.7.1
-rw-------  1 hadoop root 210606807 Jan 17 04:09 hadoop-2.7.1.tar.gz
drwxr-xr-x 13 hadoop root      4096 Jan 18 08:31 spark-1.5.1-bin-hadoop2.6
-rw-------  1 hadoop root 280901736 Jan 17 04:08 spark-1.5.1-bin-hadoop2.6.tgz

[hadoop@hftclclw0001 conf]$ pwd
/home/hadoop/spark-1.5.1-bin-hadoop2.6/conf

[hadoop@hftclclw0001 conf]$ cp -p slaves.template slaves
[hadoop@hftclclw0001 conf]$ vi slaves            => list the worker (slave) nodes, one hostname per line
hfspark0003.webex.com
hfspark0007.webex.com

[hadoop@hftclclw0001 conf]$ cp -p spark-env.sh.template spark-env.sh
[hadoop@hftclclw0001 conf]$ vi spark-env.sh            => configure Spark environment variables
...
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/home/hadoop/spark-1.5.1-bin-hadoop2.6/lib/mysql-connector-java-5.1.25-bin.jar:/home/hadoop/spark-1.5.1-bin-hadoop2.6/lib/ojdbc6.jar        => JDBC driver jars for Spark SQL
...
export HADOOP_HOME=/home/hadoop/hadoop-2.7.1
export HADOOP_CONF_DIR=/home/hadoop/hadoop-2.7.1/etc/hadoop
export SPARK_MASTER_IP=hftclclw0001.webex.com
export SPARK_WORKER_MEMORY=4g
export JAVA_HOME=/usr/java/

...
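A side note: in Spark 1.x, SPARK_CLASSPATH is deprecated, and spark-submit logs a warning when it is set. An equivalent sketch using conf/spark-defaults.conf (same jar paths as in the spark-env.sh above) would be:

```
spark.driver.extraClassPath    /home/hadoop/spark-1.5.1-bin-hadoop2.6/lib/mysql-connector-java-5.1.25-bin.jar:/home/hadoop/spark-1.5.1-bin-hadoop2.6/lib/ojdbc6.jar
spark.executor.extraClassPath  /home/hadoop/spark-1.5.1-bin-hadoop2.6/lib/mysql-connector-java-5.1.25-bin.jar:/home/hadoop/spark-1.5.1-bin-hadoop2.6/lib/ojdbc6.jar
```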

III. Copy to the other machines

[hadoop@hftclclw0001 ~]$ pwd
/home/hadoop
 
[hadoop@hftclclw0001 ~]$ scp -r spark-1.5.1-bin-hadoop2.6 hadoop@{ip}:/home/hadoop
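Copying to each worker by hand gets tedious as the cluster grows. A small sketch (gen_scp_cmds is a hypothetical helper, not part of the Spark distribution) that derives one scp command per host from conf/slaves:

```shell
# Hypothetical helper: print one scp command per worker hostname listed in
# the given slaves file, skipping comments and blank lines. Inspect the
# output first, then pipe it to sh to perform the actual copies.
gen_scp_cmds() {
  slaves_file="$1"; src_dir="$2"; user="$3"
  grep -v '^[[:space:]]*#' "$slaves_file" | grep -v '^[[:space:]]*$' |
  while read -r host; do
    echo "scp -r $src_dir ${user}@${host}:/home/${user}"
  done
}
```

Example dry run: `gen_scp_cmds conf/slaves /home/hadoop/spark-1.5.1-bin-hadoop2.6 hadoop`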

IV. Start

[hadoop@hftclclw0001 spark-1.5.1-bin-hadoop2.6]$ ./sbin/start-all.sh            => run on the master node
...
...

V. Verify

a. jps => the master node should show a Master process; each slave node should show a Worker process

b. Web UI => http://${ip}:8080 on the master node
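The Web UI check can also be scripted; ui_up below is a hypothetical helper (assumes curl is installed) that probes whether an HTTP endpoint such as the master UI on port 8080 is answering:

```shell
# Hypothetical helper: return 0 if an HTTP server answers at http://host:port/.
# Usage after start-all.sh (hostname taken from this setup's spark-env.sh):
#   ui_up hftclclw0001.webex.com 8080 && echo "master UI is up"
ui_up() {
  curl -sf --max-time 5 "http://$1:$2/" > /dev/null
}
```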
