Apache Flink DataStream API: Transformations (Chapter 3)

Author: jiangzz

DataStream Transformations

Basic Operators

// Sample input: order zhangsan TV,GAME
import java.util.Properties
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
import org.apache.kafka.clients.consumer.ConsumerConfig

val env = StreamExecutionEnvironment.createLocalEnvironment()
val props = new Properties()
props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,
                  "CentOS:9092,CentOS:9093,CentOS:9094")
props.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "g1")

env.addSource(new FlinkKafkaConsumer[String]("topic01", new SimpleStringSchema(), props))
  .filter(line => line.startsWith("order"))
  .map(line => line.replace("order", "").trim)
  .flatMap(user => {
    val tokens = user.split(" ")   // tokens(0) = user, tokens(1) = comma-separated items
    for (item <- tokens(1).split(",")) yield (tokens(0), item)
  })
  .print()

env.execute("transformation demo")

Keying Operators

Logically partitions the stream into disjoint partitions: all records with the same key are assigned to the same partition. Internally, keyBy() is implemented with hash partitioning.

val env = StreamExecutionEnvironment.createLocalEnvironment()
val props = new Properties()
props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "CentOS:9092,CentOS:9093,CentOS:9094")
props.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "g1")

// Sample input (id user product price quantity date):
// 001 zhansan apple 4.5 2 2018-10-01
// 003 lisi mechanical-keyboard 800 1 2018-01-23
// 002 zhansan orange 2.5 2 2018-11-22
env.addSource(new FlinkKafkaConsumer[String]("topic01", new SimpleStringSchema(), props))
  .map(line => {
    val tokens = line.split(" ")
    val user = tokens(1)
    val cost = tokens(3).toDouble * tokens(4).toInt   // price * quantity
    (user, cost)
  })
  .keyBy(0)
  .reduce((item1, item2) => (item1._1, item1._2 + item2._2))
  .print()
env.execute("order counts")
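The guarantee keyBy provides, that records with the same key always reach the same partition, can be modeled in a few lines of plain Scala. This is a simplified sketch for illustration only: Flink actually hashes keys into key groups before assigning them to parallel instances, and `partitionFor` is a made-up helper, not Flink API.

```scala
// Simplified model of hash partitioning: same key => same partition.
def partitionFor(key: String, parallelism: Int): Int =
  math.abs(key.hashCode % parallelism)

val records = Seq(("zhansan", 9.0), ("lisi", 800.0), ("zhansan", 5.0))
// Group records the way keyBy would distribute them across 4 partitions.
val byPartition = records.groupBy { case (user, _) => partitionFor(user, 4) }
// Both "zhansan" records necessarily land in the same partition.
```

Because partition assignment is a pure function of the key, downstream keyed state (such as the running sum in reduce) only ever sees all records for its key.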

Aggregation Operators

Reduce

A "rolling" reduce on a keyed data stream: it combines the current element with the last reduced value and emits the new value.

val env = StreamExecutionEnvironment.createLocalEnvironment()

env.socketTextStream("localhost",9999)
    .flatMap(_.split("\\W+"))
    .map((_,1))
    .keyBy(0)
    .reduce((v1,v2)=>(v1._1,v1._2+v2._2))
    .print()

env.execute("reduce test")
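The "rolling" behavior, where every incoming element produces a new output, can be mimicked on a plain Scala collection with scanLeft. This is a collection analogy to show the emitted sequence, not Flink code:

```scala
// All records for one key, e.g. the word "a" seen three times.
val perKey = List(("a", 1), ("a", 1), ("a", 1))
// scanLeft exposes every intermediate value the keyed reduce would emit.
val emitted = perKey.tail.scanLeft(perKey.head) {
  case ((k, c1), (_, c2)) => (k, c1 + c2)
}
// emitted: ("a",1), ("a",2), ("a",3) -- one output per input element
```

Unlike a batch reduce, which yields only the final ("a",3), the streaming reduce emits each intermediate result as it arrives.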

Fold

A "rolling" fold on a keyed data stream, with an initial value: it combines the current element with the last folded value and emits the new value. Note that fold is deprecated and will be removed in a future release.

val env = StreamExecutionEnvironment.createLocalEnvironment()

env.socketTextStream("localhost",9999)
    .flatMap(_.split("\\W+"))
    .map((_,1))
    .keyBy(0)
    .fold(("",0))((v1,v2)=>(v2._1,v1._2+v2._2))
    .print()

env.execute("fold test")
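What distinguishes fold from reduce is the explicit initial value, here ("", 0), which seeds the accumulator before the first element arrives. The emitted sequence can again be sketched with plain Scala (an analogy, not Flink API):

```scala
val init = ("", 0)                       // the initial value passed to fold
val perKey = List(("a", 1), ("a", 1))
// scanLeft seeds with init, then emits one folded value per element;
// drop the seed itself since fold never emits the initial value.
val emitted = perKey.scanLeft(init) {
  case ((_, acc), (k, c)) => (k, acc + c)
}.tail
// emitted: ("a",1), ("a",2)
```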

Aggregations

Rolling aggregations on a keyed data stream. The difference between min and minBy is that min returns the minimum value, whereas minBy returns the element that has the minimum value in the given field (the same applies to max and maxBy).

val env = StreamExecutionEnvironment.createLocalEnvironment()

env.socketTextStream("localhost",9999)
        .flatMap(_.split("\\W+"))
        .map((_,1))
        .keyBy(0)
        .sum(1)
        .print()

env.execute("aggregate test")
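The min/minBy distinction maps directly onto plain Scala collections, which makes it easy to see. This is a collection analogy, not the Flink operators themselves, and the `Order` class is made up for the example:

```scala
case class Order(user: String, cost: Double)
val orders = List(Order("zhansan", 9.0), Order("lisi", 800.0))

// minBy returns the whole element carrying the smallest field:
val cheapestOrder = orders.minBy(_.cost)   // Order("zhansan", 9.0)
// min returns only the smallest value itself:
val cheapestCost  = orders.map(_.cost).min // 9.0
```

Flink's keyed min behaves analogously: it keeps the minimum of the chosen field but may carry along fields from an earlier record, whereas minBy always emits the complete record that held the minimum.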

Merging and Splitting Operators

Union

Union merges two or more streams into one; all of the merged streams must have the same element type.

val env = StreamExecutionEnvironment.createLocalEnvironment()
val stream1: DataStream[String] = env.fromElements("a","b","c")
val stream2: DataStream[String] = env.fromElements("b","c","d")
stream1.union(stream2)
.print()
env.execute("union test")

Connect

Connect joins two streams, similar to union, but it does not require the two streams to have the same element type. However, when implementing CoMapFunction[IN1, IN2, OUT], both map functions must produce the same OUT type: after passing through this function, the two streams are converted into one common form.

val env = StreamExecutionEnvironment.createLocalEnvironment()
val s1: DataStream[String] = env.socketTextStream("CentOS", 9999)
val s2: DataStream[String] = env.socketTextStream("CentOS", 8888)

s1.connect(s2)
  .map(new CoMapFunction[String, String, String] {
    // s1 records are space-delimited
    override def map1(value: String): String = {
      val tokens = value.split(" ")
      tokens(0) + "," + tokens(1)
    }
    // s2 records are already comma-delimited
    override def map2(value: String): String = {
      val tokens = value.split(",")
      tokens(0) + "," + tokens(1)
    }
  })
  .map(line => (line.split(",")(0), line.split(",")(1).toDouble))
  .keyBy(_._1)
  .sum(1)
  .print()
env.execute("connect demo")
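Conceptually, connect carries two differently typed inputs side by side, and CoMapFunction folds both into one output type. That is the same shape as mapping over an Either, which makes for a compact model (an analogy in plain Scala, not Flink code; the tag strings are invented for the example):

```scala
// Two inputs of different types, one common output type (String),
// mirroring CoMapFunction[Int, String, String].
val connected: List[Either[Int, String]] = List(Left(42), Right("hello"))
val out: List[String] = connected.map {
  case Left(n)  => s"from-s1:$n"   // plays the role of map1
  case Right(s) => s"from-s2:$s"   // plays the role of map2
}
```

Once both branches have been mapped to the shared output type, downstream operators (keyBy, sum, and so on) see a single homogeneous stream.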

Split/Select

Split applies a selector to each element of a stream and routes the element to one or more named streams; downstream, the Select operator picks out each named stream for its own processing.

import java.{lang, util}

val env = StreamExecutionEnvironment.createLocalEnvironment()
val split = env.socketTextStream("CentOS", 9999)
  .split(new OutputSelector[String] {
    override def select(value: String): lang.Iterable[String] = {
      val list = new util.ArrayList[String]()
      if (value.contains("error")) {
        list.add("error")
      } else {
        list.add("info")
      }
      list
    }
  })
// Select each named side stream
split.select("error").map(t => "ERROR " + t).print()
split.select("info").map(t => "INFO " + t).print()

env.execute("split demo")
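The routing Split/Select performs is essentially a partition by label, which groupBy expresses directly on a plain collection (an analogy, not Flink API; note that in newer Flink releases split/select is deprecated in favor of side outputs):

```scala
val lines = List("error: disk full", "info: started", "error: oom")
// Route each line to a named bucket, like OutputSelector does.
val routed: Map[String, List[String]] =
  lines.groupBy(line => if (line.contains("error")) "error" else "info")
// "Select" one named stream and process it separately.
val errors = routed.getOrElse("error", Nil).map("ERROR " + _)
```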
