Spark WordCount: the first Spark program

wordcount

package com.baoy.worldcount

import org.apache.spark.{SparkConf, SparkContext}

/**
  * Created by cmcc-B100036 on 2016/4/1.
  */
object WordCount {
  def main(args: Array[String]) {
    if (args.length == 0) {
      println("usage: wordcount <file>")
      System.exit(1)
    }
    val conf = new SparkConf().setAppName("wordcount")
    val sc = new SparkContext(conf)
    sc.textFile(args(0))        // read the input file (here, a path on HDFS)
      .flatMap(_.split(" "))    // split each line into words on single spaces
      .map(x => (x, 1))         // pair each word with a count of 1
      .reduceByKey(_ + _)       // sum the counts per word
      .foreach(println)         // print each (word, count) pair
    sc.stop()
  }
}
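A note on output: foreach(println) prints each (word, count) pair from whichever process runs the task, so the pairs appear in the console in local mode but end up in executor logs on a cluster. A minimal alternative sketch (not in the original program, and assuming an output directory were passed as a hypothetical second argument) would replace the foreach line with:

      .saveAsTextFile(args(1))   // hypothetical second argument: write results to an HDFS output directory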

pom

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.1</version>
  <scope>provided</scope>
</dependency>
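The provided scope keeps spark-core out of the packaged jar, since spark-submit supplies Spark's classes at runtime; the spark-core_2.10 suffix must also match the Scala version the code is compiled against (Spark 1.4.1 defaults to Scala 2.10). Assuming the pom also configures Scala compilation (for example with scala-maven-plugin) and the final artifact is named SparkDemo.jar as used below, the jar is built with:

mvn clean package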

Prepare the input data

On the local filesystem, under /home/cloudera/baoyou/data/log, create a file named wordcount.log.
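For example (the lines below are made-up sample content, not from the original post; any text works as long as words are separated by single spaces, since the program splits on " "):

echo "hello spark hello hadoop" > wordcount.log
echo "hello world hello spark" >> wordcount.log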

Create the /data directory on HDFS:

hdfs dfs -mkdir /data

Upload wordcount.log to the /data directory:

hdfs dfs -put wordcount.log /data/
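An optional check (not part of the original steps) that the file landed where the job expects it:

hdfs dfs -ls /data
hdfs dfs -cat /data/wordcount.log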

Run spark-submit in local mode

spark-submit --class com.baoy.worldcount.WordCount --master local /home/cloudera/baoyou/project/SparkDemo.jar /data/wordcount.log
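Because the cluster's default filesystem points at HDFS (the usual CDH setup), the bare path /data/wordcount.log passed as args(0) resolves to the file uploaded above. --master local runs the job in a single thread on the local machine; as a sketch of alternatives (assuming the CDH cluster has Spark on YARN configured), the same jar could be submitted with more local threads or to YARN:

spark-submit --class com.baoy.worldcount.WordCount --master local[*] /home/cloudera/baoyou/project/SparkDemo.jar /data/wordcount.log
spark-submit --class com.baoy.worldcount.WordCount --master yarn-client /home/cloudera/baoyou/project/SparkDemo.jar /data/wordcount.log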

Result:
(Screenshot in the original post showing the run result.)