Flink 1.8 Consuming Data from Kafka (Demo) - 01

This is a demo of Flink 1.8 consuming data from Kafka. Without further ado, straight to the code.

1. pom.xml


<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.demo.bigdata.flinkScala</groupId>
    <artifactId>flinkScala</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <scala.version>2.11.8</scala.version>
    </properties>

    <repositories>
        <repository>
            <id>scala-tools.org</id>
            <name>Scala-Tools Maven2 Repository</name>
            <url>http://scala-tools.org/repo-releases</url>
        </repository>
    </repositories>

    <pluginRepositories>
        <pluginRepository>
            <id>scala-tools.org</id>
            <name>Scala-Tools Maven2 Repository</name>
            <url>http://scala-tools.org/repo-releases</url>
        </pluginRepository>
    </pluginRepositories>

    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.4</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.specs</groupId>
            <artifactId>specs</artifactId>
            <version>1.2.5</version>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.8.0</version>
            <scope>compile</scope>
        </dependency>

        <!-- Flink Kafka connector -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
            <version>1.8.0</version>
            <scope>compile</scope>
        </dependency>

        <!-- Kafka client -->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_2.11</artifactId>
            <version>0.10.2.0</version>
            <scope>compile</scope>
        </dependency>

        <!-- Flink streaming API (Java) -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_2.11</artifactId>
            <version>1.8.0</version>
            <scope>compile</scope>
        </dependency>

    </dependencies>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-eclipse-plugin</artifactId>
                <configuration>
                    <downloadSources>true</downloadSources>
                    <buildcommands>
                        <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
                    </buildcommands>
                    <additionalProjectnatures>
                        <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
                    </additionalProjectnatures>
                    <classpathContainers>
                        <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
                        <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
                    </classpathContainers>
                </configuration>
            </plugin>
        </plugins>
    </build>

</project>

2. FlinkToKafka demo

import java.util.Properties
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.datastream.DataStreamSource
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010


object FlinkToKafka {

  def main(args: Array[String]): Unit = {
    /** ---------------------------------------------------------------------------------------------------------------
      * kafka info
      */
    val zkCluster = "bigdata1:2181,bigdata2:2181,bigdata3:2181"
    val kafkaCluster = "bigdata1:9092,bigdata2:9092,bigdata3:9092"
    val topic = "Ec-demo"

    /** ---------------------------------------------------------------------------------------------------------------
      * flink env
      */
    val env: StreamExecutionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment


    /** ---------------------------------------------------------------------------------------------------------------
      * create kafka stream
      */
    val props = new Properties()
    props.setProperty("bootstrap.servers", kafkaCluster)
    // zookeeper.connect is not used by the 0.10 consumer (it uses the new Kafka consumer API), but is harmless to set
    props.setProperty("zookeeper.connect", zkCluster)
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    props.setProperty("group.id", "flink-test")

    /* ---------------------------------------------------------------------------------------------------------------
     * Build the stream graph: read records from Kafka and print them line by line
     */
    val messageStream: DataStreamSource[String] = env.addSource(new FlinkKafkaConsumer010[String](
      topic, new SimpleStringSchema, props))

    messageStream.print()
    // Call execute() to trigger job execution
    env.execute()

  }


}
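
The consumer-property setup above can be factored into a small helper so the same configuration is reusable and unit-testable. A minimal sketch (the object and method names are illustrative, not part of the demo above):

```scala
import java.util.Properties

object KafkaProps {
  // Build consumer properties matching the demo: string key/value
  // deserializers, to pair with SimpleStringSchema on the Flink side.
  def consumerProps(bootstrapServers: String, groupId: String): Properties = {
    val props = new Properties()
    props.setProperty("bootstrap.servers", bootstrapServers)
    props.setProperty("group.id", groupId)
    props.setProperty("key.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    props.setProperty("value.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    props
  }
}
```

The job could then call `env.addSource(new FlinkKafkaConsumer010[String](topic, new SimpleStringSchema, KafkaProps.consumerProps(kafkaCluster, "flink-test")))`.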

3. Testing

With the Flink job running, start a console producer against the topic and type a few messages; each one should appear in the Flink job's stdout.

[Screenshot 1: Flink job output]

kafka-console-producer.sh --broker-list ip1:9092,ip3:9092,ip2:9092 --topic Ec-demo
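
Instead of typing by hand, a few sample lines can be generated and pasted into the console producer. A trivial sketch (the payload format is arbitrary, since the demo treats every record as a plain string):

```scala
object SampleMessages {
  // Generate n sample lines to feed the console producer.
  def lines(n: Int): Seq[String] =
    (1 to n).map(i => s"msg-$i")

  def main(args: Array[String]): Unit =
    lines(5).foreach(println)
}
```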

[Screenshot 2: console producer and consumed output]
