Flink Getting Started: WordCount

Here are the Maven dependencies:

<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-java</artifactId>
  <version>1.5.0</version>
</dependency>

<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-java_2.11</artifactId>
  <version>1.5.0</version>
</dependency>

<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-clients_2.11</artifactId>
  <version>1.5.0</version>
</dependency>

These are the dependencies for Flink 1.5; the _2.11 suffix indicates the Scala version the artifacts were built against.

 

Next, here is the code:

 

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

public class FlinkWordCount {

    public static void main(String[] args) throws Exception {
        // Obtain the batch execution environment
        final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Create a DataSet from a couple of in-memory lines
        DataSet<String> text = env.fromElements(
            "hadoop hive?",
            "think hadoop hive sqoop hbase spark flink?");

        // Split each line into (word, 1) tuples, group by the word (field 0),
        // and sum the counts (field 1)
        DataSet<Tuple2<String, Integer>> wordCounts = text
            .flatMap(new LineSplitter())
            .groupBy(0)
            .sum(1);

        wordCounts.print();
    }

    public static class LineSplitter implements FlatMapFunction<String, Tuple2<String, Integer>> {
        @Override
        public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
            // Split on non-word characters and emit a count of 1 for every word
            for (String word : line.split("\\W+")) {
                out.collect(new Tuple2<>(word, 1));
            }
        }
    }
}
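Running this locally (for example straight from the IDE), print() triggers execution and writes one (word, count) tuple per line to the client's stdout. For the two sample lines above, the counts should come out roughly as follows, though the order of the tuples may differ between runs:

(flink,1)
(hadoop,2)
(hbase,1)
(hive,2)
(spark,1)
(sqoop,1)
(think,1)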


To run the job and see the result, you can submit it from a cluster node on the command line,

or upload the jar and submit it through the web UI; when submitting, you need to specify the program's main class.
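For the command-line route, a minimal sketch looks like the following; the jar path and class name here are placeholders for your own build (use the fully qualified class name if FlinkWordCount lives in a package):

# -c / --class tells the flink CLI which main class to run
./bin/flink run -c FlinkWordCount /path/to/flink-wordcount.jar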
