Spark 2.4.4 Installation and Configuration

Spark 2.4.4 distributed, highly available (HA) configuration

  • conf/slaves configuration (one Worker hostname per line; see the check below the list)
node1
node2
node3
node4
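start-all.sh will SSH into each of these hosts to launch the Workers, so passwordless SSH from the master node is assumed. A minimal sketch to verify SSH access and hostname resolution for the hosts listed above:

# Check passwordless SSH and hostname resolution for every Worker host
for h in node1 node2 node3 node4; do
  ssh -o BatchMode=yes "$h" hostname || echo "SSH to $h failed"
done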
  • conf/spark-env.sh configuration
# JDK installation path
JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.222.b10-1.el7_7.x86_64
# Location of the Hadoop configuration files
HADOOP_CONF_DIR=/usr/local/lib/hadoop-3.2.1/etc/hadoop
# ZooKeeper quorum for Master HA; spark.deploy.zookeeper.dir is the znode where recovery state is stored
SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=master:2181,node1:2181,node2:2181,node3:2181,node4:2181 -Dspark.deploy.zookeeper.dir=/usr/local/lib/spark-2.4.4"
# Put the Hadoop jars on Spark's classpath
SPARK_DIST_CLASSPATH=$(/usr/local/lib/hadoop-3.2.1/bin/hadoop classpath)
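Every node needs the same conf/slaves and conf/spark-env.sh. A minimal sketch for pushing them out, assuming Spark is installed under /usr/local/lib/spark-2.4.4 on all nodes (adjust the path to your layout):

# Copy the Spark configuration files to every node (install path is an assumption)
SPARK_HOME=/usr/local/lib/spark-2.4.4
for h in node1 node2 node3 node4; do
  scp "$SPARK_HOME"/conf/slaves "$SPARK_HOME"/conf/spark-env.sh "$h":"$SPARK_HOME"/conf/
done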

Spark 2.4.4 commands

# On the master node: start the Master and all Workers listed in conf/slaves
sbin/start-all.sh
# On node1 and node2: start a standby Master
sbin/start-master.sh
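After startup, jps should show a Master process on master, node1 and node2 and a Worker on each host in conf/slaves; the Master web UI (port 8080 by default) reports one Master as ALIVE and the others as STANDBY. Clients should list all Master addresses so they can fail over. A sketch using the SparkPi example that ships with Spark 2.4.4 (prebuilt for Scala 2.11; the jar name is an assumption based on that build):

# Submit SparkPi against the HA Masters; the client fails over automatically
bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://master:7077,node1:7077,node2:7077 \
  examples/jars/spark-examples_2.11-2.4.4.jar 100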
