Installing Spark on macOS

(1) Install the JDK

https://www.oracle.com/technetwork/cn/java/javase/downloads/jdk8-downloads-2133151-zhs.html

The current release is 1.8 (8u181); the 181 denotes the 181st update.
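After installing, you can confirm the JDK from the command line. The snippet below is a small sketch: the first line prints the installed version (skipped if no JDK is on the PATH), and the rest shows how the update number relates to a version string like "1.8.0_181" mentioned above.

```shell
# Print the installed JDK version, if a JDK is on the PATH.
command -v java >/dev/null && java -version 2>&1 | head -n 1

# The update number is the part after the underscore, e.g. "1.8.0_181" -> 181.
version="1.8.0_181"            # example string from this walkthrough
update="${version#*_}"
echo "update number: $update"  # -> update number: 181
```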

(2) Install Scala

Scala download: http://www.scala-lang.org/files/archive/scala-2.13.0.tgz

On the Mac, create a "Scala" folder under the target directory: in Finder press Command+Shift+G and enter /Users/gcy/

Move the archive to the target directory and extract it: cd /Users/gcy/Scala, then tar -xzf scala-2.13.0.tgz

Open the global environment-variable script: sudo vim /etc/profile

Add the environment variables: export SCALA_HOME="/Users/gcy/Scala/scala-2.13.0"

export PATH="$PATH:/Users/gcy/Scala/scala-2.13.0/bin"

Exit vim: press Esc, then type :wq! to save and quit

Reload the script: source /etc/profile

Type scala at the command line to check whether the installation succeeded

To exit the Scala REPL: :quit
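The Scala steps above can be sketched as one script. The file-moving commands are left commented out because they assume the tarball was downloaded to ~/Downloads (an assumption, not stated above); the exports mirror the lines added to /etc/profile.

```shell
# Sketch of the Scala install, using the paths from this walkthrough.
SCALA_DIR="/Users/gcy/Scala"

# File operations (uncomment to run; assumes the tarball is in ~/Downloads):
# mkdir -p "$SCALA_DIR"
# mv ~/Downloads/scala-2.13.0.tgz "$SCALA_DIR"
# cd "$SCALA_DIR" && tar -xzf scala-2.13.0.tgz

# Environment variables, as added to /etc/profile:
export SCALA_HOME="$SCALA_DIR/scala-2.13.0"
export PATH="$PATH:$SCALA_HOME/bin"

# Quick check that the bin directory is now on the PATH:
echo "$PATH" | grep -q "scala-2.13.0/bin" && echo "PATH updated"
```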

(3) Install Spark

Spark download: https://www.apache.org/dyn/closer.lua/spark/spark-2.3.0/spark-2.3.0-bin-hadoop2.7.tgz

Move the archive to the target directory and extract it: cd /Users/gcy/Spark, then tar -xzf spark-2.3.0-bin-hadoop2.7.tgz

Open the global environment-variable script: sudo vim /etc/profile

Add the environment variables: export SPARK_HOME="/Users/gcy/Spark/spark-2.3.0-bin-hadoop2.7"

export PATH=${PATH}:${SPARK_HOME}/bin

Exit vim: press Esc, then type :wq! to save and quit

Reload the script: source /etc/profile

Type spark-shell at the command line to check whether Spark was installed successfully
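Instead of waiting for the interactive spark-shell prompt, the install can also be smoke-tested non-interactively. The sketch below guards each Spark command so it is simply skipped when Spark is not yet on the PATH; the exports mirror the /etc/profile lines from this section.

```shell
# Exports as added to /etc/profile above.
export SPARK_HOME="/Users/gcy/Spark/spark-2.3.0-bin-hadoop2.7"
export PATH="${PATH}:${SPARK_HOME}/bin"

if command -v spark-submit >/dev/null; then
  # Prints the Spark and Scala build versions without starting a REPL.
  spark-submit --version
  # Pipe one expression into spark-shell as a quick smoke test:
  echo 'println(sc.parallelize(1 to 100).sum)' | spark-shell 2>/dev/null
fi

# Check that the bin directory is now on the PATH:
echo "$PATH" | grep -q "spark-2.3.0-bin-hadoop2.7/bin" && echo "PATH updated"
```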
