spark-shell submission

spark-shell (REPL)

(1.) Running spark-shell with no arguments starts it in local mode

Command:

[root@bigdata111 ~]#spark-shell


Spark context available as 'sc' (master = local[*], app id = local-1577740473039).

scala> sc.textFile("/opt/module/test/one").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).collect
res0: Array[(String, Int)] = Array((is,2), (you,1), (plus,1), (name,2), (hadoop,1), (hi,1), (jh,1), (do,1), (HP,1), (hello,1), (java,2), (my,2))
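Either way, you can confirm from inside the REPL which master the session is bound to; a quick optional check, not part of the original run:

scala> sc.master          // the master URL, e.g. local[*] for a local shell
scala> sc.applicationId   // matches the app id printed in the startup message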

(2.) Running spark-shell --master spark://bigdata111:7077 starts spark-shell against the standalone cluster. The input now comes from HDFS, since executors on other nodes cannot read a path that exists only on one machine's local filesystem.

Command:

[root@bigdata111 ~]#spark-shell --master spark://bigdata111:7077


Spark context available as 'sc' (master = spark://bigdata111:7077, app id = app-20191231051535-0002)

scala> sc.textFile("hdfs://bigdata111:9000/one").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).collect
res0: Array[(String, Int)] = Array((is,2), (plus,1), (jh,1), (HP,1), (hello,1), (java,2), (my,2), (you,1), (name,2), (hadoop,1), (hi,1), (do,1))
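reduceByKey gives no ordering guarantee, which is why the two result arrays above contain the same pairs in different orders. If a sorted result is wanted, a sortBy can be appended before collect; a small extension of the original one-liner, sorting by count in descending order:

scala> sc.textFile("hdfs://bigdata111:9000/one").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).sortBy(_._2, false).collect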

scala> sc.textFile("hdfs://bigdata111:9000/one").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).saveAsTextFile("hdfs://bigdata111:9000/oneOut")
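saveAsTextFile writes one part file per partition under the target directory and fails if hdfs://bigdata111:9000/oneOut already exists. To spot-check the output from the same shell, the directory can be read back as plain text; each saved line is the string form of a (word,count) tuple:

scala> sc.textFile("hdfs://bigdata111:9000/oneOut").collect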

Step-by-step wordcount; note that each intermediate value is an RDD:
scala> val rdd1 = sc.textFile("hdfs://bigdata111:9000/one")
rdd1: org.apache.spark.rdd.RDD[String] = hdfs://bigdata111:9000/one MapPartitionsRDD[18] at textFile at <console>:24

scala> rdd1.collect
res4: Array[String] = Array(my name is plus, do you, my name is jh, hello, hi, HP hadoop java java)

scala> val rdd2 = rdd1.flatMap(_.split(" "))
rdd2: org.apache.spark.rdd.RDD[String] = MapPartitionsRDD[19] at flatMap at <console>:26

scala> val rdd3 = rdd2.map((_,1))
rdd3: org.apache.spark.rdd.RDD[(String, Int)] = MapPartitionsRDD[20] at map at <console>:28

scala> val rdd4 = rdd3.reduceByKey(_+_)
rdd4: org.apache.spark.rdd.RDD[(String, Int)] = ShuffledRDD[21] at reduceByKey at <console>:30

scala> rdd4.collect
res5: Array[(String, Int)] = Array((is,2), (plus,1), (jh,1), (HP,1), (hello,1), (java,2), (my,2), (you,1), (name,2), (hadoop,1), (hi,1), (do,1))
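Each rddN above is a lazy RDD: the transformations only record a lineage, and nothing is computed until an action such as collect runs. The lineage of the final RDD can be inspected directly; an optional check, not part of the original walkthrough:

scala> rdd4.toDebugString    // shows the lineage from the ShuffledRDD back to textFile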

Stay hungry, keep learning.

Jackson_MVP
