Flink 1.9 Table API - KafkaSource

Using Kafka as a data source for Flink's Table API. This walkthrough uses a single-node Kafka and a single-node Flink. It covers creating a Kafka topic, publishing and subscribing to messages, and finishes with a small end-to-end demo.

Creating a topic in Kafka
[root@CentOSA kafka_2.11-2.1.0]# bin/kafka-topics.sh --create --topic sales --partitions 1 --replication-factor 1 --zookeeper CentOSA:2181
Created topic "sales"
Listing the created topics
[root@CentOSA kafka_2.11-2.1.0]# bin/kafka-topics.sh --list  --zookeeper CentOSA:2181
__consumer_offsets
sales
Publishing messages
[root@CentOSA kafka_2.11-2.1.0]# bin/kafka-console-producer.sh --topic sales --broker-list CentOSA:9092
Subscribing to messages
[root@CentOSA kafka_2.11-2.1.0]# bin/kafka-console-consumer.sh --topic sales --bootstrap-server CentOSA:9092 --from-beginning
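The demo below expects each message to consist of four whitespace-separated fields: product_id, category, product_name, and sales. For example, you could type lines like these into the console producer (the sample values are made up for illustration):

p001 electronics iPhone 4999.0
p002 books FlinkGuide 59.9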
Demo
// A case class modeling one sales record
case class ShopSales(product_id: String, category: String, product_name: String, sales: Double)


import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
import org.apache.flink.table.api.EnvironmentSettings
import org.apache.flink.table.api.scala._

object KafkaSource {
  def main(args: Array[String]): Unit = {
    // Create a TableEnvironment that uses the Blink planner in streaming mode
    val bsEnv = StreamExecutionEnvironment.getExecutionEnvironment
    val bsSettings = EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build()
    val bsTEnv = StreamTableEnvironment.create(bsEnv, bsSettings)
    // Kafka connection properties
    val props = new Properties()
    props.setProperty("bootstrap.servers", "CentOSA:9092")
    props.setProperty("group.id", "g1")
    // Consume the "sales" topic; each record is expected to be four
    // whitespace-separated fields: product_id category product_name sales
    val salesData = bsEnv.addSource(new FlinkKafkaConsumer[String]("sales", new SimpleStringSchema(), props))
      .map(_.split("\\s+"))
      .map(dt => ShopSales(dt(0), dt(1), dt(2), dt(3).toDouble))
    // register the DataStream under the name "ShopSales"
    bsTEnv.registerDataStream("ShopSales", salesData, 'product_id, 'category, 'product_name, 'sales)
    val resultTable = bsTEnv.sqlQuery(
      """
        |SELECT product_id, category, product_name, sales FROM ShopSales
      """.stripMargin)
    // Convert the result Table back to a retract stream and print it
    bsTEnv.toRetractStream[(String, String, String, Double)](resultTable).print()
    // Submit and run the job
    bsEnv.execute()
  }
}
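Each element printed by toRetractStream is a (Boolean, row) tuple, where the flag marks an insertion (true) or a retraction (false); since this query is append-only, the flag is always true here. With the sample input above, the console output would look roughly like this (the subtask prefix depends on your parallelism):

1> (true,(p001,electronics,iPhone,4999.0))
2> (true,(p002,books,FlinkGuide,59.9))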

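For completeness, here is a minimal sketch of the sbt dependencies this demo assumes (Flink 1.9.0; %% appends your Scala version, e.g. 2.11; verify the coordinates against your environment):

// build.sbt, assumed coordinates for Flink 1.9.0
val flinkVersion = "1.9.0"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-streaming-scala"        % flinkVersion, // DataStream API in Scala
  "org.apache.flink" %% "flink-table-api-scala-bridge" % flinkVersion, // StreamTableEnvironment for Scala
  "org.apache.flink" %% "flink-table-planner-blink"    % flinkVersion, // Blink planner used above
  "org.apache.flink" %% "flink-connector-kafka"        % flinkVersion  // FlinkKafkaConsumer
)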