Flink: Reading Data from Kafka and Writing to MySQL

Contents

1. pom file

2. Kafka source

3. MySQL sink

4. Results


1. pom file

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>flinkemo</groupId>
  <artifactId>flinkemo</artifactId>
  <version>1.0-SNAPSHOT</version>

  <dependencies>
    <!-- MySQL JDBC driver -->
    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>8.0.30</version>
    </dependency>
    <!-- Kafka 0.11 connector -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-kafka-0.11_2.12</artifactId>
      <version>1.10.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-scala_2.12</artifactId>
      <version>1.10.1</version>
    </dependency>
    <!-- Streaming Scala API -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-scala_2.12</artifactId>
      <version>1.10.1</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- Compiles the Scala sources -->
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.4.6</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <!-- Builds a fat jar containing all dependencies -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>3.0.0</version>
        <configuration>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
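
Since the assembly plugin's single goal is bound to the package phase, a runnable fat jar with all dependencies can be built with standard Maven:

mvn clean package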

2. Kafka source

Code:

import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011

object StreamKafkaToMysql {
  case class student(id: String, name: String)

  def main(args: Array[String]): Unit = {

    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Kafka consumer configuration
    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "node01:9092")
    properties.setProperty("group.id", "test-consumer-group")
    properties.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    properties.setProperty("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    properties.setProperty("auto.offset.reset", "latest")

    // Consume topic-test as plain strings
    val stream3 = env.addSource(new FlinkKafkaConsumer011[String]("topic-test", new SimpleStringSchema(), properties))

    // Each record is expected as "id name", the two fields separated by a single space
    val value: DataStream[student] = stream3.map(x => {
      val strs = x.split(" ")
      student(strs(0), strs(1))
    })

    value.print()
    // Attach the MySQL sink defined in section 3
    value.addSink(new MysqlSink())

    env.execute("KafkaSouceReview")
  }
}
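
If topic-test does not exist yet, it can be created first with the Kafka 0.11-era CLI; the ZooKeeper address node01:2181 here is an assumption and should match your cluster:

./kafka-topics.sh --create --zookeeper node01:2181 --replication-factor 1 --partitions 1 --topic topic-test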

3. MySQL sink

import java.sql.{Connection, DriverManager, PreparedStatement}

import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.functions.sink.{RichSinkFunction, SinkFunction}

import StreamKafkaToMysql.student

class MysqlSink() extends RichSinkFunction[student] {
  var conn: Connection = _
  var insertStmt: PreparedStatement = _
  var updateStmt: PreparedStatement = _

  // open(): create the JDBC connection and the prepared statements once per task
  override def open(parameters: Configuration): Unit = {
    super.open(parameters)

    Class.forName("com.mysql.cj.jdbc.Driver")
    conn = DriverManager.getConnection("jdbc:mysql://localhost:3306/test?characterEncoding=utf8", "root", "root")
    insertStmt = conn.prepareStatement("INSERT INTO student_01 (id, name) VALUES (?, ?)")
    updateStmt = conn.prepareStatement("UPDATE student_01 SET name = ? WHERE id = ?")
  }

  // invoke(): called once per record; try an UPDATE first, fall back to INSERT if no row matched
  override def invoke(value: student, context: SinkFunction.Context[_]): Unit = {
    println(value.id)
    updateStmt.setString(1, value.name) // SET name = ?
    updateStmt.setString(2, value.id)   // WHERE id = ?
    updateStmt.execute()
    if (updateStmt.getUpdateCount == 0) {
      insertStmt.setString(1, value.id)
      insertStmt.setString(2, value.name)
      insertStmt.execute()
    }
  }

  // close(): release the statements and the connection
  override def close(): Unit = {
    insertStmt.close()
    updateStmt.close()
    conn.close()
  }
}
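
The sink writes to a student_01 table in the test database, which must exist beforehand. A minimal sketch of a matching table definition; both fields are bound with setString, but the exact column types and lengths are assumptions:

CREATE TABLE student_01 (
  id   VARCHAR(20) NOT NULL PRIMARY KEY, -- assumed length; the key makes update-then-insert behave as an upsert
  name VARCHAR(50)                       -- assumed length
);

With a primary key on id, the update-then-insert pattern in invoke() emulates an upsert; MySQL's INSERT ... ON DUPLICATE KEY UPDATE would be a single-statement alternative.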

4. Results

Start a console producer from the Kafka client:

./kafka-console-producer.sh --broker-list 192.168.81.129:9092 --topic topic-test

Enter records at the producer as "id name" pairs, the two fields separated by a single space, which is what the map function's split expects.
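
For example (sample values, not from the original post):

1 zhangsan
2 lisi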

Console output:

[Screenshot 1: console output of the parsed student records]

Check the data in the MySQL database:

[Screenshot 2: rows in the student_01 table]


Conclusion:
This is the most wonderful day of my life, because I'm here with you now.
