Storing network traffic in log files with Flume

1. Create the agent configuration file
Save the following content as agent1.conf in the Flume working directory /opt/flume/bin:

# Name the components of agent1
agent1.sources = netsource
agent1.sinks = logsink
agent1.channels = memorychannel

# Netcat source: listen for lines of text on localhost:3000
agent1.sources.netsource.type = netcat
agent1.sources.netsource.bind = localhost
agent1.sources.netsource.port = 3000

# Logger sink: write each event to Flume's log output
agent1.sinks.logsink.type = logger

# In-memory channel holding up to 1000 events, 100 per transaction
agent1.channels.memorychannel.type = memory
agent1.channels.memorychannel.capacity = 1000
agent1.channels.memorychannel.transactionCapacity = 100

# Wire the source and sink to the channel
agent1.sources.netsource.channels = memorychannel
agent1.sinks.logsink.channel = memorychannel
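
This configuration wires a netcat source listening on localhost:3000 to a logger sink through an in-memory channel, so every line received on the socket becomes an event written to Flume's own log output. If you would rather have the events rolled into plain files on disk, the logger sink can be swapped for Flume's file_roll sink. The snippet below is only a sketch of that variant, not part of the original setup; the output directory /tmp/flume-logs and the 60-second roll interval are assumptions, and the directory must exist before the agent starts:

# Hypothetical alternative sink: roll events into files on local disk
agent1.sinks.logsink.type = file_roll
agent1.sinks.logsink.sink.directory = /tmp/flume-logs
agent1.sinks.logsink.sink.rollInterval = 60
agent1.sinks.logsink.channel = memorychannel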

2. Start a Flume agent

caiyong@caiyong:/opt/flume/bin$ flume-ng agent --conf conf --conf-file agent1.conf --name agent1
Info: Including Hadoop libraries found via (/opt/hadoop/bin/hadoop) for HDFS access

Info: Excluding /opt/hadoop/libexec/../lib/slf4j-api-1.4.3.jar from classpath
Info: Excluding /opt/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar from classpath
+ exec /usr/java/jdk/bin/java -Xmx20m -cp 'conf:/opt/flume/lib/*:/opt/hadoop/conf:/usr/java/jdk/lib/tools.jar:/opt/hadoop/libexec/..:/opt/hadoop/libexec/../hadoop-core-1.2.1.jar:/opt/hadoop/libexec/../lib/asm-3.2.jar:/opt/hadoop/libexec/../lib/aspectjrt-1.6.11.jar:/opt/hadoop/libexec/../lib/aspectjtools-1.6.11.jar:/opt/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/opt/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/opt/hadoop/libexec/../lib/commons-cli-1.2.jar:/opt/hadoop/libexec/../lib/commons-codec-1.4.jar:/opt/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/opt/hadoop/libexec/../lib/commons-configuration-1.6.jar:/opt/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/opt/hadoop/libexec/../lib/commons-digester-1.8.jar:/opt/hadoop/libexec/../lib/commons-el-1.0.jar:/opt/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/opt/hadoop/libexec/../lib/commons-io-2.1.jar:/opt/hadoop/libexec/../lib/commons-lang-2.4.jar:/opt/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/opt/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/opt/hadoop/libexec/../lib/commons-math-2.1.jar:/opt/hadoop/libexec/../lib/commons-net-3.1.jar:/opt/hadoop/libexec/../lib/core-3.1.1.jar:/opt/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.2.1.jar:/opt/hadoop/libexec/../lib/hadoop-fairscheduler-1.2.1.jar:/opt/hadoop/libexec/../lib/hadoop-thriftfs-1.2.1.jar:/opt/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/opt/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/opt/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/opt/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/opt/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/opt/hadoop/libexec/../lib/jdeb-0.8.jar:/opt/hadoop/libexec/../lib/jersey-core-1.8.jar:/opt/hadoop/libexec/../lib/jersey-json-1.8.jar:/opt/hadoop/libexec/../lib/jersey-server-1.8.jar:/opt/hadoop/libexec/../lib/jets3t-0.6.1.jar:/opt/hadoop/libexec/../lib/jetty-6.1.26.jar:/opt/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/opt/hadoop/libexec/../lib/jsch-0.1.42.jar:/opt/hadoop/libexec/../lib/junit-4.5.jar:/opt/hadoop/libexec/../lib/kfs-0.2.2.jar:/opt/hadoop/libexec/../lib/log4j-1.2.15.jar:/opt/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/opt/hadoop/libexec/../lib/oro-2.0.8.jar:/opt/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/opt/hadoop/libexec/../lib/xmlenc-0.52.jar:/opt/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/opt/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar' -Djava.library.path=:/opt/hadoop/libexec/../lib/native/Linux-i386-32 org.apache.flume.node.Application --conf-file agent1.conf --name agent1
15/03/14 13:11:00 INFO node.PollingPropertiesFileConfigurationProvider: Configuration provider starting
15/03/14 13:11:00 INFO node.PollingPropertiesFileConfigurationProvider: Reloading configuration file:agent1.conf
15/03/14 13:11:00 INFO conf.FlumeConfiguration: Processing:logsink
15/03/14 13:11:00 INFO conf.FlumeConfiguration: Processing:logsink
15/03/14 13:11:00 INFO conf.FlumeConfiguration: Added sinks: logsink Agent: agent1
15/03/14 13:11:00 INFO conf.FlumeConfiguration: Post-validation flume configuration contains configuration for agents: [agent1]
15/03/14 13:11:00 INFO node.AbstractConfigurationProvider: Creating channels
15/03/14 13:11:00 INFO channel.DefaultChannelFactory: Creating instance of channel memorychannel type memory
15/03/14 13:11:00 INFO node.AbstractConfigurationProvider: Created channel memorychannel
15/03/14 13:11:00 INFO source.DefaultSourceFactory: Creating instance of source netsource, type netcat
15/03/14 13:11:00 INFO sink.DefaultSinkFactory: Creating instance of sink: logsink, type: logger
15/03/14 13:11:00 INFO node.AbstractConfigurationProvider: Channel memorychannel connected to [netsource, logsink]
15/03/14 13:11:01 INFO node.Application: Starting new configuration:{ sourceRunners:{netsource=EventDrivenSourceRunner: { source:org.apache.flume.source.NetcatSource{name:netsource,state:IDLE} }} sinkRunners:{logsink=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@1b688c0 counterGroup:{ name:null counters:{} } }} channels:{memorychannel=org.apache.flume.channel.MemoryChannel{name: memorychannel}} }
15/03/14 13:11:01 INFO node.Application: Starting Channel memorychannel
15/03/14 13:11:01 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: CHANNEL, name: memorychannel: Successfully registered new MBean.
15/03/14 13:11:01 INFO instrumentation.MonitoredCounterGroup: Component type: CHANNEL, name: memorychannel started
15/03/14 13:11:01 INFO node.Application: Starting Sink logsink
15/03/14 13:11:01 INFO node.Application: Starting Source netsource
15/03/14 13:11:01 INFO source.NetcatSource: Source starting
15/03/14 13:11:01 INFO source.NetcatSource: Created serverSocket:sun.nio.ch.ServerSocketChannelImpl[/127.0.0.1:3000]
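
The logger sink emits each event through Flume's log4j logging, so whether the events appear in the terminal or in a log file under Flume's logs directory depends on conf/log4j.properties. To make sure they show up directly in the console, as in the output captured in step 3, the agent can be started with the root logger overridden on the command line; the flag itself is standard flume-ng usage, though the log4j setup of the original environment is an assumption here:

flume-ng agent --conf conf --conf-file agent1.conf --name agent1 -Dflume.root.logger=INFO,console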


3. In another window, telnet to port 3000 on the local host and type some text
caiyong@caiyong:~$ telnet localhost 3000
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
hello
OK
this is a test
OK

Back in the original window, the following new lines appear at the end of the output:
15/03/14 13:11:41 INFO sink.LoggerSink: Event: { headers:{} body: 68 65 6C 6C 6F 20 0D                            hello . }
15/03/14 13:11:47 INFO sink.LoggerSink: Event: { headers:{} body: 74 68 69 73 20 69 73 20 61 20 74 65 73 74 0D    this is a test. }
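
Each event is printed as a hex dump followed by a printable-ASCII rendering; the trailing 0D byte is the carriage return that telnet appends to every line, which the logger sink renders as a dot. The same test can also be scripted rather than typed interactively; the line below assumes the nc (netcat) utility is installed and uses an arbitrary example message:

echo "scripted test event" | nc localhost 3000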

