Pseudo-Distributed Spark

  1. Extract the installation package
tar -zxvf spark-2.4.3-bin-hadoop2.7.tgz
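The extracted directory name (`spark-2.4.3-bin-hadoop2.7`) does not match the `SPARK_HOME` path used in the following steps, so it is moved into place first. The target path `/usr/local/src/spark` here simply follows the layout assumed by the rest of this guide:

```shell
# Move the extracted directory to the path used as SPARK_HOME below
mv spark-2.4.3-bin-hadoop2.7 /usr/local/src/spark
```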
  2. Configure global environment variables
export SPARK_HOME=/usr/local/src/spark
export PATH=$PATH:$JAVA_HOME/bin:/usr/local/src/hadoop/bin:$HBASE_HOME/bin:$SPARK_HOME/bin:$SPARK_HOME/sbin
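After adding the exports (typically to `/etc/profile` or `~/.bashrc`), reload the profile with `source /etc/profile` or open a new shell so they take effect. The following is a minimal self-contained sketch of the effect: appending `$SPARK_HOME/bin` and `$SPARK_HOME/sbin` makes the Spark commands visible on `PATH`:

```shell
# Minimal check that the new PATH entries are picked up in the current shell
export SPARK_HOME=/usr/local/src/spark
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
echo "$PATH" | grep -q "/usr/local/src/spark/bin" && echo "PATH ok"
```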
  3. Add the Java environment variables to spark-env.sh
vi spark-env.sh

Add the following at the beginning of the file:

export JAVA_HOME=/usr/local/src/jdk/jdk1.8
export HADOOP_CONF_DIR=/usr/local/src/hadoop/etc/hadoop
export HADOOP_HDFS_HOME=/usr/local/src/hadoop
export SPARK_HOME=/usr/local/src/spark
export SPARK_DIST_CLASSPATH=$(/usr/local/src/hadoop/bin/hadoop classpath)
SPARK_MASTER_WEBUI_PORT=8079
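In a fresh Spark distribution, `conf/spark-env.sh` does not exist yet; it ships only as a template and is created by copying it, which is the standard Spark layout:

```shell
# spark-env.sh ships only as a template; copy it before editing
cd /usr/local/src/spark/conf
cp spark-env.sh.template spark-env.sh
```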
  4. Start command
root@503ae25fe58d:/usr/local/src/spark/sbin# ./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/src/spark/logs/spark-root-org.apache.spark.deploy.master.Master-1-503ae25fe58d.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/src/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-503ae25fe58d.out
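After `start-all.sh`, the standalone daemons can be confirmed with `jps`; with the configuration above, the master web UI should also answer on port 8079 (set via `SPARK_MASTER_WEBUI_PORT`):

```shell
jps
# A Master and a Worker process should be listed, e.g.:
#   Master
#   Worker
# The master web UI is then reachable at http://localhost:8079
```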
  5. Verify the Spark installation
root@503ae25fe58d:/usr/local/src/spark# bin/run-example SparkPi 2>&1 | grep "Pi is"
Pi is roughly 3.1394956974784876
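`SparkPi` estimates π by Monte Carlo sampling: it throws random points into the square [-1,1]² and counts how many land inside the unit circle. A tiny single-machine sketch of the same idea in awk (the exact digits will vary from run to run):

```shell
# Single-process Monte Carlo estimate of Pi, mirroring what SparkPi
# distributes across partitions: sample points in [-1,1]^2 and count
# how many fall inside the unit circle.
awk 'BEGIN {
  srand(42); n = 100000; hits = 0
  for (i = 0; i < n; i++) {
    x = rand() * 2 - 1; y = rand() * 2 - 1
    if (x * x + y * y <= 1) hits++
  }
  printf "Pi is roughly %f\n", 4 * hits / n
}'
```

`run-example SparkPi` also accepts a partition count, e.g. `bin/run-example SparkPi 10`, which spreads the sampling across 10 tasks for a tighter estimate.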
