Spark on YARN installation and deployment

Preparation

  • Download Spark from: http://spark.apache.org/downloads.html
  • Choose the latest Spark release pre-built "without Hadoop"; the advantage is that you are free to pair it with the latest Hadoop version (this needs one extra setting, see the sketch after this list)
  • Download Hadoop from: https://hadoop.apache.org/releases.html
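
A Spark build packaged "without Hadoop" ships no Hadoop jars, so Spark has to be pointed at the local Hadoop installation explicitly. A minimal sketch of the relevant lines in conf/spark-env.sh, assuming the /data/server/hadoop path used later in this post:

# conf/spark-env.sh -- needed when using the "without Hadoop" Spark build
export HADOOP_HOME=/data/server/hadoop
# let Spark assemble its classpath from the local Hadoop installation
export SPARK_DIST_CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath)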

1. Basic environment configuration

[ec2-user@rcf-ai-datafeed-spark-prd-01 conf]$ cat /etc/hosts

127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost6 localhost6.localdomain6
10.16.5.162 rcf-ai-datafeed-spark-prd-01.wisers.com rcf-ai-datafeed-spark-prd-01     #master
10.16.5.177 rcf-ai-datafeed-spark-prd-02.wisers.com rcf-ai-datafeed-spark-prd-02     #slave
10.16.5.22 rcf-ai-datafeed-spark-prd-03.wisers.com rcf-ai-datafeed-spark-prd-03       #slave
10.16.5.243 rcf-ai-datafeed-spark-prd-04.wisers.com rcf-ai-datafeed-spark-prd-04     #slave
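
Every node should carry the same /etc/hosts entries so the hostnames above resolve identically across the cluster. A minimal sketch for pushing the file from the master, assuming passwordless SSH for ec2-user and sudo rights on each slave (hostnames taken from the list above):

for h in rcf-ai-datafeed-spark-prd-02 rcf-ai-datafeed-spark-prd-03 rcf-ai-datafeed-spark-prd-04; do
  scp /etc/hosts ec2-user@$h:/tmp/hosts                   # stage a copy on the slave
  ssh ec2-user@$h "sudo cp /tmp/hosts /etc/hosts && getent hosts rcf-ai-datafeed-spark-prd-01"   # install it and verify the master resolves
done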

 

[ec2-user@rcf-ai-datafeed-spark-prd-01 conf]$ cat /etc/profile

#java config
export JAVA_HOME=/data/server/jdk

#hadoop config
export HADOOP_HOME=/data/server/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_HOME=$HADOOP_HOME
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
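
After editing /etc/profile, reload it in the current shell and confirm the variables resolve; the version check below assumes the Hadoop tarball has already been unpacked to /data/server/hadoop:

source /etc/profile
echo $JAVA_HOME $HADOOP_HOME $HADOOP_CONF_DIR
# should print the Hadoop release downloaded in the preparation step
$HADOOP_HOME/bin/hadoop version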
