Flume-Kafka Configuration Explained

Goal: use Flume to capture log messages and deliver them to Kafka.

See the earlier articles for installing and configuring Flume and Kafka themselves; this article focuses on how to make Flume deliver messages to Kafka.

1. Configure the Flume conf file. In the flume/conf directory, create a file named flume_kafka.conf:

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

#define a memory channel called c1 on a1
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100
#Describe/configure the source
a1.sources.r1.channels = c1
a1.sources.r1.type = exec
a1.sources.r1.command = tail -f /flume_test/log/logserver.log

# Describe the sink: a Kafka sink that publishes events taken from channel c1
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.bootstrap.servers = master:9092,slave1:9092,slave2:9092
a1.sinks.k1.kafka.topic = test3

# Bind the sink to the channel
a1.sinks.k1.channel = c1
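Alternatively, Flume provides a KafkaChannel that writes events to a Kafka topic directly, so no separate sink is needed. A sketch of that variant, reusing the same broker list and topic as above; note it replaces the memory-channel declaration for c1 rather than being added alongside it:

```
# KafkaChannel variant: the channel itself persists events to Kafka,
# so the memory channel (and a Kafka sink) are not needed.
a1.channels = c1
a1.channels.c1.type = org.apache.flume.channel.kafka.KafkaChannel
a1.channels.c1.kafka.bootstrap.servers = master:9092,slave1:9092,slave2:9092
a1.channels.c1.kafka.topic = test3
a1.channels.c1.kafka.consumer.group.id = flume-consumer
```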
2. Start Flume:

./bin/flume-ng agent --conf conf --conf-file ./conf/flume_kafka.conf -n a1 -Dflume.root.logger=INFO,console

3. Start Kafka, then open a console consumer (the --zookeeper flag below applies to older Kafka versions; Kafka 2.0+ uses --bootstrap-server localhost:9092 instead):

./bin/kafka-server-start.sh config/server.properties

./bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test3 --from-beginning

4. Write data to /flume_test/log/logserver.log and watch the messages arrive on the Kafka console consumer.
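To verify the pipeline end to end, append a few lines to the log file that the exec source is tailing; each line should show up in the console consumer within a few seconds. A minimal sketch (the path is taken from the a1.sources.r1.command setting above; the message text is arbitrary):

```shell
# Append test lines to the log file tailed by the Flume exec source.
LOGFILE=/flume_test/log/logserver.log
mkdir -p "$(dirname "$LOGFILE")"
for i in 1 2 3; do
  echo "flume-kafka test message $i" >> "$LOGFILE"
done
```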


Note: for full details on Flume's Kafka configuration, see https://github.com/apache/flume/blob/trunk/flume-ng-doc/sphinx/FlumeUserGuide.rst
