Setting Up a Spark 2.3 Cluster on CentOS 7

Version: Spark 2.3, prebuilt for Hadoop 2.7

1. Configure environment variables

Append the following to /etc/profile (or your shell profile):

export SPARK_HOME=/opt/spark-2.3-hadoop-2.7
export PATH=$PATH:$SPARK_HOME/bin
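
The changes only take effect after the profile is reloaded. A quick sanity check, assuming the variables were added to /etc/profile as above:

source /etc/profile
echo $SPARK_HOME        # should print /opt/spark-2.3-hadoop-2.7
spark-submit --version  # should report Spark 2.3.x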

2. Edit spark-env.sh

cp /opt/spark-2.3-hadoop-2.7/conf/spark-env.sh.template /opt/spark-2.3-hadoop-2.7/conf/spark-env.sh

Append the following at the end:

export JAVA_HOME=/opt/jdk-1.8
export SCALA_HOME=/opt/scala-2.11
export HADOOP_HOME=/opt/hadoop-2.7
export HADOOP_CONF_DIR=/opt/hadoop-2.7/etc/hadoop
# SPARK_MASTER_IP is deprecated in Spark 2.x; use SPARK_MASTER_HOST
export SPARK_MASTER_HOST=hserver-1
export SPARK_WORKER_MEMORY=1g
export SPARK_WORKER_CORES=1
export SPARK_WORKER_INSTANCES=1
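
spark-env.sh is sourced when the daemons start, so a mistyped path here tends to fail quietly. A quick check that the directories referenced above actually exist (paths as configured in this guide):

for d in /opt/jdk-1.8 /opt/scala-2.11 /opt/hadoop-2.7 /opt/spark-2.3-hadoop-2.7; do
  [ -d "$d" ] || echo "missing: $d"
done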

3. Edit the slaves file (add the Worker nodes)

cp /opt/spark-2.3-hadoop-2.7/conf/slaves.template /opt/spark-2.3-hadoop-2.7/conf/slaves

Replace the default localhost entry with the worker hostnames:

hserver-2
hserver-3
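
start-all.sh launches the Workers over SSH, so the master must be able to reach every host listed in slaves without a password. A minimal sketch, assuming root is used throughout (as in the scp step below):

ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa   # run on hserver-1; skip if a key already exists
ssh-copy-id root@hserver-2
ssh-copy-id root@hserver-3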

4. Copy Spark to the other two machines

scp -r /opt/spark-2.3-hadoop-2.7 root@hserver-2:/opt/
scp -r /opt/spark-2.3-hadoop-2.7 root@hserver-3:/opt/

The spark-env.sh and slaves edits travel with the copy; repeat step 1 on hserver-2 and hserver-3 if you also want the Spark commands on the PATH there.
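
A quick way to confirm the copy landed on both workers:

for h in hserver-2 hserver-3; do
  ssh root@$h "ls /opt/spark-2.3-hadoop-2.7/conf/spark-env.sh"
done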

5. Start Spark

On the master node (hserver-1):

/opt/spark-2.3-hadoop-2.7/sbin/start-all.sh
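
If the cluster is up, jps on hserver-1 should show a Master process and jps on hserver-2/3 a Worker process, and the master Web UI is served at http://hserver-1:8080 by default. As a smoke test, a sketch that submits the bundled SparkPi example (the exact jar name depends on the 2.3.x patch version, hence the wildcard):

/opt/spark-2.3-hadoop-2.7/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://hserver-1:7077 \
  /opt/spark-2.3-hadoop-2.7/examples/jars/spark-examples_2.11-2.3.*.jar 100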
