2019-12-04 A simple Spark code test

  1. Create a normal Maven project (no need to check Create From Archetype)

  2. Install the Scala plugin

  3. Delete the project's java directory, create a scala directory, and mark it as a source folder



  4. Add the Scala SDK


  5. Add the dependencies (pom.xml)




<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.ganymede</groupId>
    <artifactId>sparkplatformstudy</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <spark.version>1.6.0</spark.version>
        <scala.version>2.10</scala.version>
        <hadoop.version>2.6.0</hadoop.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_${scala.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.39</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
        </dependency>
    </dependencies>

    <repositories>
        <repository>
            <id>central</id>
            <name>Maven Repository Switchboard</name>
            <layout>default</layout>
            <url>http://repo2.maven.org/maven2</url>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </repository>
    </repositories>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.3</version>
                <configuration>
                    <source>1.7</source>
                    <target>1.7</target>
                    <encoding>UTF-8</encoding>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
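Once the dependencies are in place, the pom can be sanity-checked from the command line; a minimal check (assuming Maven is installed and run from the project root):

```shell
# resolve all declared dependencies and compile src/main/scala
mvn clean compile
```

If a dependency coordinate or version is wrong, this step fails before you ever open the IDE's run configuration.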

* Test

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object Hello {

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("sessiontest").setMaster("local[*]")
    // Note: SparkSession was introduced in Spark 2.0, so this requires a
    // spark.version of 2.x even though the pom above pins 1.6.0.
    val sparkSession = SparkSession.builder().config(conf).getOrCreate()
    val rdd = sparkSession.sparkContext.parallelize(Array(1, 2, 3, 5))
    // keep only the even elements and print them
    rdd.filter(ele => ele % 2 == 0).foreach(println(_))
  }
}
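Since the pom pins Spark 1.6.0 and SparkSession only exists from Spark 2.0 onward, the same test can be written against SparkContext directly, which works on 1.6. A sketch (the object name HelloOnSpark16 is just an illustrative choice):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object HelloOnSpark16 {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("sessiontest").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // same data and filter as above: keep only the even elements
    val rdd = sc.parallelize(Array(1, 2, 3, 5))
    rdd.filter(_ % 2 == 0).foreach(println)
    sc.stop()
  }
}
```

This avoids having to bump spark.version just to run the smoke test.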

https://blog.csdn.net/kwu_ganymede/article/details/51832427
