Connecting Flink to Kafka

1. Start ZooKeeper
Open cmd and run zkserver. A successful startup looks like this:
(Screenshot: ZooKeeper startup output)
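
Optionally, you can confirm that ZooKeeper is accepting connections by opening a client shell against it. This is an extra check, not part of the original steps, and it assumes zkCli.cmd is reachable on the PATH in the same way zkserver is:

zkcli -server localhost:2181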
2. Start Kafka
2.1 Start the broker
Open cmd, change to D:\profession\kafka\kafka_2.11-2.4.0, and run:

.\bin\windows\kafka-server-start.bat .\config\server.properties

If the output looks like the following, the broker is running normally:
(Screenshot: Kafka broker startup output)
2.2 Create a topic
In the D:\profession\kafka\kafka_2.11-2.4.0\bin\windows folder, Shift + right-click on an empty area to open a command prompt window, then run:

kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic wc10
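
To confirm the topic exists, you can optionally list all topics in the same window (a quick sanity check, not part of the original steps):

kafka-topics.bat --list --zookeeper localhost:2181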

2.3 Start a console producer
Again in the D:\profession\kafka\kafka_2.11-2.4.0\bin\windows folder, Shift + right-click on an empty area to open a command prompt window, then run:

kafka-console-producer.bat --broker-list localhost:9092 --topic wc10

The producer then waits for input.
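
If you want to verify that messages actually reach the topic before involving Flink, you can open one more command prompt in the same folder and start a console consumer (an optional check, not required for the steps below):

kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic wc10 --from-beginning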

3. Start the Flink job
The main class KafkaSource is as follows:

package cn.doit.flink.test02;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

import java.util.Properties;

/**
 * A Source that reads data from Kafka. It can run with parallel instances and can take part in exactly-once processing.
 */
public class KafkaSource {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties properties = new Properties();

        //Kafka broker address
        properties.setProperty("bootstrap.servers","localhost:9092");
        //consumer group ID
        properties.setProperty("group.id","gwc10");
        //if no offset has been committed yet, start consuming from the earliest record
        properties.setProperty("auto.offset.reset","earliest");
        //do not let the Kafka consumer auto-commit offsets
        //properties.setProperty("enable.auto.commit","false");


        //kafkaSource
        FlinkKafkaConsumer<String> kafkaSource = new FlinkKafkaConsumer<>("wc10", new SimpleStringSchema(), properties);

        DataStreamSource<String> lines = env.addSource(kafkaSource);

        //sink
        lines.print();

        env.execute("KafkaSource");


    }
}
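
The class comment mentions exactly-once, but as written the job only reads and prints with the default configuration, so no checkpointing is enabled. Below is a minimal sketch of what turning it on could look like, using the same env and kafkaSource variables as above and placed inside main() before env.execute(); the 5000 ms interval is an arbitrary choice, and it needs an extra import of CheckpointingMode:

        //requires: import org.apache.flink.streaming.api.CheckpointingMode;

        //sketch only: enable checkpointing so the Kafka offsets are stored in
        //Flink state and restored consistently after a failure
        env.enableCheckpointing(5000, CheckpointingMode.EXACTLY_ONCE);

        //also commit offsets back to Kafka on each completed checkpoint, so that
        //external tools can see the consumer group's progress
        kafkaSource.setCommitOffsetsOnCheckpoints(true);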

The pom.xml is as follows:



<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
		 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
		 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>

	<groupId>cn.doit.flink</groupId>
	<artifactId>flink-java</artifactId>
	<version>1.0-SNAPSHOT</version>
	<packaging>jar</packaging>

	<name>Flink Quickstart Job</name>
	<url>http://www.myorganization.org</url>

	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
		<flink.version>1.9.1</flink.version>
		<java.version>1.8</java.version>
		<scala.binary.version>2.11</scala.binary.version>
		<maven.compiler.source>${java.version}</maven.compiler.source>
		<maven.compiler.target>${java.version}</maven.compiler.target>
	</properties>

	<repositories>
		<repository>
			<id>apache.snapshots</id>
			<name>Apache Development Snapshot Repository</name>
			<url>https://repository.apache.org/content/repositories/snapshots/</url>
			<releases>
				<enabled>false</enabled>
			</releases>
			<snapshots>
				<enabled>true</enabled>
			</snapshots>
		</repository>
	</repositories>

	<dependencies>
		<!-- Apache Flink dependencies, provided so they are not packaged into the job JAR -->
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-java</artifactId>
			<version>${flink.version}</version>
			<scope>provided</scope>
		</dependency>
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<scope>provided</scope>
		</dependency>

		<!-- Kafka connector -->
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-connector-kafka_2.11</artifactId>
			<version>1.10.0</version>
		</dependency>

		<!-- Logging, so the job produces console output when run inside the IDE -->
		<dependency>
			<groupId>org.slf4j</groupId>
			<artifactId>slf4j-log4j12</artifactId>
			<version>1.7.7</version>
			<scope>runtime</scope>
		</dependency>
		<dependency>
			<groupId>log4j</groupId>
			<artifactId>log4j</artifactId>
			<version>1.2.17</version>
			<scope>runtime</scope>
		</dependency>
	</dependencies>

	<build>
		<plugins>

			<!-- Java compiler -->
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-compiler-plugin</artifactId>
				<version>3.1</version>
				<configuration>
					<source>${java.version}</source>
					<target>${java.version}</target>
				</configuration>
			</plugin>

			<!-- Shade plugin: builds a fat JAR containing the non-provided dependencies -->
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-shade-plugin</artifactId>
				<version>3.0.0</version>
				<executions>
					<execution>
						<phase>package</phase>
						<goals>
							<goal>shade</goal>
						</goals>
						<configuration>
							<artifactSet>
								<excludes>
									<exclude>org.apache.flink:force-shading</exclude>
									<exclude>com.google.code.findbugs:jsr305</exclude>
									<exclude>org.slf4j:*</exclude>
									<exclude>log4j:*</exclude>
								</excludes>
							</artifactSet>
							<filters>
								<filter>
									<!-- Do not copy META-INF signatures; they can cause SecurityExceptions -->
									<artifact>*:*</artifact>
									<excludes>
										<exclude>META-INF/*.SF</exclude>
										<exclude>META-INF/*.DSA</exclude>
										<exclude>META-INF/*.RSA</exclude>
									</excludes>
								</filter>
							</filters>
							<transformers>
								<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
									<mainClass>cn.doit.flink.StreamingJob</mainClass>
								</transformer>
							</transformers>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>

		<pluginManagement>
			<plugins>

				<!-- Keeps Eclipse m2e from flagging the shade and compiler executions -->
				<plugin>
					<groupId>org.eclipse.m2e</groupId>
					<artifactId>lifecycle-mapping</artifactId>
					<version>1.0.0</version>
					<configuration>
						<lifecycleMappingMetadata>
							<pluginExecutions>
								<pluginExecution>
									<pluginExecutionFilter>
										<groupId>org.apache.maven.plugins</groupId>
										<artifactId>maven-shade-plugin</artifactId>
										<versionRange>[3.0.0,)</versionRange>
										<goals>
											<goal>shade</goal>
										</goals>
									</pluginExecutionFilter>
									<action>
										<ignore/>
									</action>
								</pluginExecution>
								<pluginExecution>
									<pluginExecutionFilter>
										<groupId>org.apache.maven.plugins</groupId>
										<artifactId>maven-compiler-plugin</artifactId>
										<versionRange>[3.1,)</versionRange>
										<goals>
											<goal>testCompile</goal>
											<goal>compile</goal>
										</goals>
									</pluginExecutionFilter>
									<action>
										<ignore/>
									</action>
								</pluginExecution>
							</pluginExecutions>
						</lifecycleMappingMetadata>
					</configuration>
				</plugin>
			</plugins>
		</pluginManagement>
	</build>

	<!-- Profile that adds the provided Flink dependencies back with compile scope
	     so the job can be run directly from IntelliJ IDEA -->
	<profiles>
		<profile>
			<id>add-dependencies-for-IDEA</id>

			<activation>
				<property>
					<name>idea.version</name>
				</property>
			</activation>

			<dependencies>
				<dependency>
					<groupId>org.apache.flink</groupId>
					<artifactId>flink-java</artifactId>
					<version>${flink.version}</version>
					<scope>compile</scope>
				</dependency>
				<dependency>
					<groupId>org.apache.flink</groupId>
					<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
					<version>${flink.version}</version>
					<scope>compile</scope>
				</dependency>
			</dependencies>
		</profile>
	</profiles>
</project>

When I started the job, it reported Cannot resolve org.apache.kafka:kafka-clients:2.2.0, so I imported kafka-clients-0.10.1.1.jar manually.
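
An alternative that might avoid importing the jar by hand (an assumption on my part, not something I tried in this setup) is to declare the client version that the error message asks for directly in the pom.xml:

		<!-- Hypothetical alternative to importing the jar manually: let Maven pull
		     the kafka-clients version the connector complains about -->
		<dependency>
			<groupId>org.apache.kafka</groupId>
			<artifactId>kafka-clients</artifactId>
			<version>2.2.0</version>
		</dependency>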
After the job started normally, messages typed into the Kafka console producer show up in the IDEA console:
(Screenshot: Flink output in the IDEA console)
