A simple Spark word count

In spark-shell (sc is the SparkContext), read the text file, split each line into words, map each word to the pair (word, 1), and sum the counts per word with reduceByKey:

sc.textFile("/input/words.txt")
  .flatMap(line => line.split(" "))
  .map(word => (word, 1))
  .reduceByKey((x, y) => x + y)
  .collect

The same pipeline written with Scala's underscore placeholder syntax:

sc.textFile("/input/words.txt")
  .flatMap(_.split(" "))
  .map((_, 1))
  .reduceByKey(_ + _)
  .collect
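
Outside the shell, the same word count can be packaged as a standalone application. Below is a minimal sketch, assuming the same input path /input/words.txt; the output directory /output/wordcount and the object name WordCount are hypothetical names chosen for illustration.

import org.apache.spark.{SparkConf, SparkContext}

// Standalone version of the word count above.
// Input path matches the shell example; output path is a placeholder.
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    sc.textFile("/input/words.txt")
      .flatMap(_.split(" "))          // split each line into words
      .map((_, 1))                    // pair each word with a count of 1
      .reduceByKey(_ + _)             // sum the counts per word
      .saveAsTextFile("/output/wordcount") // write results instead of collecting to the driver

    sc.stop()
  }
}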
