Setting up a Spark development environment based on Eclipse

1. Download the Spark distribution and simply unpack it.


2. Configure environment variables:

     SPARK_HOME = F:\spark-2.1.0-bin-hadoop2.6

     Append F:\spark-2.1.0-bin-hadoop2.6\bin; to Path

3. Create a Maven project and add the following to its pom.xml (the XML tags below are reconstructed around the values from the original post; the encoding property name follows the usual Maven convention):

    <properties>
        <spark.version>2.1.1</spark.version>
        <scala.version>2.11</scala.version>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <!-- Hadoop -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.8.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.8.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.8.0</version>
        </dependency>

        <!-- tools.jar from the local JDK -->
        <dependency>
            <groupId>jdk.tools</groupId>
            <artifactId>jdk.tools</artifactId>
            <version>1.8</version>
            <scope>system</scope>
            <systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
        </dependency>

        <!-- HBase -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>1.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-common</artifactId>
            <version>1.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-it</artifactId>
            <version>1.1.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-protocol</artifactId>
            <version>1.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-server</artifactId>
            <version>1.3.1</version>
        </dependency>

        <!-- Spark -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
    </dependencies>
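Once the dependencies resolve, a quick way to confirm the environment works is to run a small Spark job in local mode directly from Eclipse. The sketch below is not from the original post; the object name and sample data are illustrative, and it assumes the spark-core dependency above is on the classpath.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal local-mode word count to smoke-test the setup.
object WordCountApp {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark inside the IDE, no cluster needed
    val conf = new SparkConf().setAppName("WordCountApp").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val lines = sc.parallelize(Seq("hello spark", "hello eclipse"))
    val counts = lines
      .flatMap(_.split(" "))      // split each line into words
      .map(word => (word, 1))     // pair each word with a count of 1
      .reduceByKey(_ + _)         // sum counts per word

    counts.collect().foreach(println)
    sc.stop()
  }
}
```

If the job prints word counts without a ClassNotFoundException, the Maven dependencies and SPARK_HOME setup are consistent.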
