Collecting data into Kafka with Flume

a1.sources = s1
a1.channels = c1
a1.sinks = k1

a1.sources.s1.type = spooldir
a1.sources.s1.channels = c1
a1.sources.s1.spoolDir = /home/wang/a/flume/logs
a1.sources.s1.fileHeader = true

a1.channels.c1.type = SPILLABLEMEMORY
a1.channels.c1.memoryCapacity = 10000
a1.channels.c1.overflowCapacity = 1000000
a1.channels.c1.byteCapacity = 800000
a1.channels.c1.checkpointDir = /home/wangfutai/a/flume/checkpoint
a1.channels.c1.dataDirs = /home/wangfutai/a/flume/data

a1.sinks.k1.channel = c1
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.topic = stremaingTest2
a1.sinks.k1.kafka.bootstrap.servers = wang:9092
a1.sinks.k1.kafka.flumeBatchSize = 20
a1.sinks.k1.kafka.producer.acks = 1
a1.sinks.k1.kafka.producer.linger.ms = 1
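A common failure mode with agent files like the one above is a source or sink bound to a channel name that was never declared, which makes the agent fail at startup. As a minimal sketch (the helper names and the embedded sample config are my own, not part of Flume), a few lines of Python can lint that wiring before deploying:

```python
# Sanity check for a Flume agent properties file: confirm that every
# source/sink "channel(s)" binding refers to a channel that is declared
# in the agent's channel list. parse_props/check_wiring are illustrative
# helpers, not Flume APIs.

def parse_props(text):
    """Parse simple 'key = value' lines into a dict, skipping blanks and comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

def check_wiring(props, agent="a1"):
    """Return a list of wiring problems; an empty list means the wiring is consistent."""
    declared = set(props.get(f"{agent}.channels", "").split())
    problems = []
    for key, value in props.items():
        # a1.sources.<s>.channels and a1.sinks.<k>.channel reference channel names.
        if key.endswith(".channels") or key.endswith(".channel"):
            if key == f"{agent}.channels":
                continue  # this is the declaration itself
            for ch in value.split():
                if ch not in declared:
                    problems.append(f"{key} references undeclared channel {ch!r}")
    return problems

# Reduced copy of the wiring from the config above.
conf = """
a1.sources = s1
a1.channels = c1
a1.sinks = k1
a1.sources.s1.channels = c1
a1.sinks.k1.channel = c1
"""
assert check_wiring(parse_props(conf)) == []
```

Running the same check after renaming `c1` to `c2` in only one place would surface the mismatch as a problem string instead of an empty list.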

Create the topic: kafka-topics.sh --create --topic 'stremaingTest2' --zookeeper 'wangfutai:2181' --partitions 1 --replication-factor 1

Console consumer: kafka-console-consumer.sh --topic 'stremaingTest2' --zookeeper 'wangfutai:2181' --from-beginning --skip-message-on-error
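One operational caveat with the spooling directory source: a file dropped into `spoolDir` must be complete and must never be modified afterwards, otherwise the source raises an error and stops consuming. A safe pattern is to finish writing the file in a staging directory on the same filesystem and then rename it into place, since the rename is atomic. A sketch of that pattern (all directory, file, and function names below are illustrative placeholders, not from the configuration above):

```python
# Atomically hand a finished log file to Flume's spooling directory source:
# write it fully in a staging directory, then os.rename() it into spoolDir.
import os
import tempfile

def drop_into_spooldir(staging_dir, spool_dir, name, payload):
    """Write a finished log file in staging_dir, then move it into spool_dir."""
    tmp_path = os.path.join(staging_dir, name)
    with open(tmp_path, "w") as f:
        f.write(payload)                      # file is fully written first
    final_path = os.path.join(spool_dir, name)
    os.rename(tmp_path, final_path)           # atomic within one filesystem
    return final_path

# Demo with throwaway directories standing in for the real staging dir
# and /home/wang/a/flume/logs.
with tempfile.TemporaryDirectory() as stage, tempfile.TemporaryDirectory() as spool:
    path = drop_into_spooldir(stage, spool, "app-2019.log", "hello kafka\n")
    print(open(path).read(), end="")          # prints "hello kafka"
```

Writing directly into `spoolDir` risks Flume picking up a half-written file; the rename makes the file appear only once it is complete.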
