Configuring Spark on Hadoop 2.6.0

With Hadoop already installed, proceed as follows.

I. Install Scala
1. Download a Scala release matching your Spark version from http://www.scala-lang.org/download/. Spark 1.2 works with Scala 2.10; this guide uses scala-2.10.4.tgz.
2. Unpack and install Scala:
1) Run # tar -zxvf scala-2.10.4.tgz to extract the archive to /root/spark/scala-2.10.4.
2) Add the following to ~/.bash_profile:

export SCALA_HOME=/root/spark/scala-2.10.4
export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SCALA_HOME/bin:$PATH


3) Apply the environment variables: # source ~/.bash_profile
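Each bin directory in the PATH line must be separated by a colon; without it, two adjacent entries are silently treated as one nonexistent directory. A quick sanity check with placeholder directories (the /opt paths here are illustrative, not the real installs) shows how the export composes:

```shell
# Compose PATH the same way as in ~/.bash_profile, using placeholder
# directories so the snippet runs anywhere (paths are illustrative).
JAVA_HOME=/opt/jdk
HADOOP_HOME=/opt/hadoop
HIVE_HOME=/opt/hive
SCALA_HOME=/root/spark/scala-2.10.4
DEMO_PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SCALA_HOME/bin:$PATH
echo "$DEMO_PATH"
```

Entries earlier in PATH shadow later ones, so placing $SCALA_HOME/bin before the existing $PATH ensures the freshly installed scala binary is the one found.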
3. Verify the installation: typing the scala command at the prompt should open the Scala REPL.

# scala
Welcome to Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.6.0_45).
Type in expressions to have them evaluated.
Type :help for more information.

scala> 


II. Install Spark
1. Download spark-1.2.0-bin-hadoop2.4.tgz from http://spark.apache.org/downloads.html and extract it to /root/spark/spark-1.2.0-bin-hadoop2.4.
2. Add the following to ~/.bash_profile:

export SPARK_HOME=/root/spark/spark-1.2.0-bin-hadoop2.4
export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin:$HIVE_HOME/bin:$PATH


3. Apply the environment variables: # source ~/.bash_profile

III. Configure Spark
1. Change to Spark's configuration directory: # cd $SPARK_HOME/conf
2. Create the environment file from its template: # cp spark-env.sh.template spark-env.sh
3. Add the following to spark-env.sh:

export JAVA_HOME=/usr/lib/jdk1.6.0_45
export SCALA_HOME=/root/spark/scala-2.10.4
export HADOOP_CONF_DIR=/root/hadoop/hadoop-2.6.0/etc/hadoop
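The three lines above are the minimum. spark-env.sh also accepts tuning variables for the standalone cluster; the following sketch shows commonly used Spark 1.x settings, with example values that you should adjust to your own hostname and hardware:

```shell
# Optional additions to spark-env.sh (all values are illustrative examples).
export SPARK_MASTER_IP=master          # hostname/IP the standalone master binds to
export SPARK_WORKER_MEMORY=1g          # total memory each worker may hand to executors
export SPARK_WORKER_CORES=2            # CPU cores each worker may hand to executors
```

Leaving these unset is also fine: the master binds to the local hostname and workers default to all cores and most of the machine's memory.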


IV. Start Spark
1. Change to the Spark installation directory: # cd /root/spark/spark-1.2.0-bin-hadoop2.4
2. Run # ./sbin/start-all.sh
3. Run # jps; the Master and Worker processes should be listed:

# jps
54679 NameNode
26587 Jps
54774 DataNode
9850 Worker
9664 Master
55214 NodeManager
55118 ResourceManager
54965 SecondaryNameNode
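The check in step 3 can be scripted: grep the jps output for the two Spark daemons and report any that are missing. The sample text below stands in for a live jps run (a real check would use jps_output=$(jps)):

```shell
# Verify that the Master and Worker daemons appear in jps-style output.
# Sample output mirroring the listing above; a live check would run jps.
jps_output="54679 NameNode
9850 Worker
9664 Master
55118 ResourceManager"

status=ok
for daemon in Master Worker; do
  if printf '%s\n' "$jps_output" | grep -qw "$daemon"; then
    echo "$daemon is running"
  else
    echo "$daemon is missing" >&2
    status=fail
  fi
done
```

If either daemon is missing, the logs under $SPARK_HOME/logs/ are the first place to look.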


To open the Spark web UI, browse to http://IP:8080, where IP is the master node's address.
