An example of using Flume

A new project of mine needs Hadoop and Vertica, with Flume used to load data into Hadoop, so I put together a small example.

It monitors a shared folder: whenever files appear there, they are loaded into HDFS.

Start the Flume agent:

./flume-ng agent -n agent-1 -c conf -f /home/yaxiaohu/flumeconf/evantest.conf
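
While testing it is handy to see events and errors directly in the terminal. A small variation of the same command (the extra -D flag is the usual way to route Flume's log4j output to the console; everything else is unchanged):

./flume-ng agent -n agent-1 -c conf -f /home/yaxiaohu/flumeconf/evantest.conf -Dflume.root.logger=INFO,console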

The contents of evantest.conf are as follows:

# Name the source, channel and sink that make up agent-1
agent-1.channels = ch-1
agent-1.sources = src-1
agent-1.sinks = hdfs-sink

# Memory channel that buffers events between the source and the sink
agent-1.channels.ch-1.type = memory
agent-1.channels.ch-1.capacity = 1000
agent-1.channels.ch-1.transactionCapacity = 100

# Spooling-directory source: watches /home/yaxiaohu/flumetest for new files
agent-1.sources.src-1.type = spooldir
agent-1.sources.src-1.channels = ch-1
agent-1.sources.src-1.spoolDir = /home/yaxiaohu/flumetest
agent-1.sources.src-1.fileHeader = true
agent-1.sources.src-1.deletePolicy = immediate

# HDFS sink: writes events under a time-bucketed path, rounded down to 10-minute buckets
agent-1.sinks.hdfs-sink.type = hdfs
agent-1.sinks.hdfs-sink.channel = ch-1
agent-1.sinks.hdfs-sink.hdfs.path = /FAA/flume/%y-%m-%d/%H%M
agent-1.sinks.hdfs-sink.hdfs.filePrefix = %{Evan}
agent-1.sinks.hdfs-sink.hdfs.round = true
agent-1.sinks.hdfs-sink.hdfs.roundValue = 10
agent-1.sinks.hdfs-sink.hdfs.roundUnit = minute
agent-1.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true
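
To try the pipeline end to end, drop a file into the spool directory and then look under the HDFS path from the config. A minimal sketch (sample.csv is just a made-up file name; the exact 10-minute subdirectory depends on the local time at which the events are written):

cp sample.csv /home/yaxiaohu/flumetest/
hdfs dfs -ls /FAA/flume/

Because deletePolicy is set to immediate, the source removes each file from /home/yaxiaohu/flumetest as soon as it has been ingested, so an empty spool directory after a few seconds is a good sign.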
