《日子》 Apache Beam: A First Try at WordCount

Apache Beam can be a big help for machine-learning workloads: its programming model is simple and easy to use.

Two key features of Apache Beam:

1. It unifies the programming models for batch and stream data processing;

2. It can run on any supported execution engine.

Beam provides a unified model not only for designing but also for executing a whole range of data-oriented workflows, including data processing, ingestion, and integration. The second point, runner portability, is illustrated by the short sketch below.
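The following is a minimal sketch of how the execution engine is chosen: the runner is just a pipeline option, and the transform code stays the same when you switch engines. It assumes the beam-runners-direct-java dependency added later in this post; the class name RunnerChoice is made up for illustration.

package org.tom;

import org.apache.beam.runners.direct.DirectRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class RunnerChoice {
    public static void main(String[] args) {
        // The engine is configured on the options, not inside the transforms.
        PipelineOptions options = PipelineOptionsFactory.create();
        options.setRunner(DirectRunner.class); // swap in a Flink/Spark runner class to change engines

        // The same Pipeline object (and the same transforms) then runs on that engine.
        Pipeline pipeline = Pipeline.create(options);
        pipeline.run();
    }
}

Switching to another engine only means adding that runner's dependency and changing the single setRunner line.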

Create a new Maven project


Add the Beam dependencies to pom.xml:




<dependencies>
    <dependency>
        <groupId>org.apache.beam</groupId>
        <artifactId>beam-sdks-java-core</artifactId>
        <version>0.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.beam</groupId>
        <artifactId>beam-runners-direct-java</artifactId>
        <version>0.4.0</version>
    </dependency>
</dependencies>


The test class WordCount.java:


package org.tom;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.*;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

import java.io.Serializable;

public class WordCount implements Serializable {

    private transient Pipeline pipeline = null;

    public WordCount() {
        // Create the pipeline; with only beam-runners-direct-java on the classpath,
        // the DirectRunner is used by default.
        PipelineOptions options = PipelineOptionsFactory.create();
        options.setJobName("wordcount");
        pipeline = Pipeline.create(options);
    }

    public void transform() {
        // Read the input file line by line.
        PCollection<String> collection =
                pipeline.apply(TextIO.Read.from("file:///d:/tom/beam-test/src/main/resources/word.txt"));

        // Split every line on spaces and emit each non-empty word.
        PCollection<String> extractWords = collection.apply("ExtractWords", ParDo.of(new DoFn<String, String>() {
            @ProcessElement
            public void processElement(ProcessContext c) {
                String[] split = c.element().split(" ");
                for (String word : split) {
                    if (!word.isEmpty()) {
                        c.output(word);
                    }
                }
            }
        }));

        // Count how many times each distinct word occurs.
        PCollection<KV<String, Long>> pCollection = extractWords.apply(Count.<String>perElement());

        // Format each (word, count) pair as a line of text.
        PCollection<String> formatResults = pCollection.apply("FormatResults",
                MapElements.via(new SimpleFunction<KV<String, Long>, String>() {
                    @Override
                    public String apply(KV<String, Long> input) {
                        return input.getKey() + ": " + input.getValue();
                    }
                }));

        // Write the results. Note the forward slashes: a literal like "D:\tom\..."
        // would not compile, because backslashes must be escaped in Java strings.
        formatResults.apply(TextIO.Write.to("D:/tom/beam-test/src/main/resources/wordcounts"));
    }

    public void run() {
        pipeline.run().waitUntilFinish();
    }

    public static void main(String[] args) {
        WordCount wordCount = new WordCount();
        wordCount.transform();
        wordCount.run();
    }
}
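Before reading and writing real files, the counting step can also be checked in memory. Below is a small sketch (the class name WordCountCheck and the sample words are made up; PAssert ships in beam-sdks-java-core, though JUnit/Hamcrest may also be needed on the classpath) that feeds a few words with Create.of and verifies the output of Count.perElement with PAssert:

package org.tom;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;

public class WordCountCheck {
    public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        // Feed a tiny in-memory dataset instead of reading word.txt.
        PCollection<String> words = p.apply(Create.of("tom", "hello", "tom"));

        // Same counting transform as in the main pipeline.
        PCollection<KV<String, Long>> counts = words.apply(Count.<String>perElement());

        // The pipeline fails at run time if the counts do not match.
        PAssert.that(counts).containsInAnyOrder(KV.of("tom", 2L), KV.of("hello", 1L));

        p.run().waitUntilFinish();
    }
}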

The input text to count, \resources\word.txt:


tom
hello
tom
luo
hello
tom
tom
word
word
word
tom

Run results:


word: 3
luo: 1
tom: 5
hello: 2
The results end up in two files because TextIO shards its output (elements are hash-partitioned across shards), and each shard is written as a separate file.
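If a single output file is preferred, the number of shards can be pinned when writing. A minimal sketch, assuming TextIO.Write's withNumShards option is available in this SDK version:

// Force all results into one shard, i.e. a single output file,
// instead of letting the runner pick the shard count.
formatResults.apply(TextIO.Write.to("D:/tom/beam-test/src/main/resources/wordcounts")
        .withNumShards(1));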
