Flume Questions:
1. Install and start the Flume component on the master node, open a Linux shell, and run the flume-ng help command to view flume-ng's usage information (a dry-run sketch follows the help output below).
[root@master ~]# flume-ng help
Usage: /usr/hdp/2.6.1.0-129/flume/bin/flume-ng.distro <command> [options]...
commands:
help display this help text
agent run a Flume agent
avro-client run an avro Flume client
password create a password file for use in flume config
version show Flume version info
global options:
--conf,-c <conf>      use configs in <conf> directory
--classpath,-C <cp>   append to the classpath
--dryrun,-d           do not actually start Flume, just print the command
--plugins-path <dirs> colon-separated list of plugins.d directories. See the
                      plugins.d section in the user guide for more details.
                      Default: $FLUME_HOME/plugins.d
-Dproperty=value      sets a Java system property value
-Xproperty=value      sets a Java -X option
agent options:
--conf-file,-f <file> specify a config file (required)
--name,-n <name>      the name of this agent (required)
--help,-h             display help text
avro-client options:
--rpcProps,-P <file>  RPC client properties file with server connection params
--host,-H <host>      hostname to which events will be sent
--port,-p <port>      port of the avro source
--dirname <dir>       directory to stream to avro source
Either --rpcProps or both --host and --port must be specified.
password options:
--outfile             The file in which encoded password is stored
Note that if <conf> directory is specified, then it is always included first
in the classpath.
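Two of the commands listed above are useful for a quick sanity check before launching a real agent. A minimal sketch; it simply reuses the config file and agent name from question 2 below, which are illustrative here rather than required:
[root@master ~]# flume-ng version
# --dryrun prints the java command that would be launched, without actually starting Flume
[root@master ~]# flume-ng agent -c . -f /opt/log-example.conf -n a1 --dryrun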
2. Based on the provided template file log-example.conf, use the Flume NG tool to collect the system log /var/log/secure on the master node, prefix the names of the collected log files with "xiandian-sec", store them in the /1daoyun/file/flume directory of the HDFS file system, and set the timestamp rounding of the files produced in HDFS to 10 minutes. After collection, list the contents of /1daoyun/file/flume in HDFS (a note on the 10-minute rounding follows the template below).
[root@master ~]# flume-ng agent -c . -f /opt/log-example.conf -n a1
-Dflume.root.logger=INFO,console
[root@master ~]# hadoop fs -ls /1daoyun/file/flume
Found 19 items
-rw-r--r--   3 root root       1172 2019-05-23 13:03 /1daoyun/file/flume/xiandian-sec.1558616567923
[root@master ~]# cat /opt/log-example.conf
# example.conf: A single-node Flume configuration
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1
# Describe/configure the source
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/secure
a1.sources.r1.channels = c1
# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
# Describe the sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = hdfs://master:8020/1daoyun/file/flume
a1.sinks.k1.hdfs.filePrefix = xiandian-sec
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 10
a1.sinks.k1.hdfs.roundUnit = minute
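Note that hdfs.round, hdfs.roundValue and hdfs.roundUnit only act on time-based escape sequences in hdfs.path; the template path above contains none, so the rounding has nothing visible to affect. Below is a minimal variant of the sink section, assuming one HDFS directory per 10-minute bucket is wanted; the %Y%m%d/%H%M layout and useLocalTimeStamp are illustrative additions, not part of the exam template:
# write into a directory per day and per 10-minute bucket
a1.sinks.k1.hdfs.path = hdfs://master:8020/1daoyun/file/flume/%Y%m%d/%H%M
a1.sinks.k1.hdfs.filePrefix = xiandian-sec
# the exec source adds no timestamp header, so take the timestamp from the sink host clock
a1.sinks.k1.hdfs.useLocalTimeStamp = true
# round %H%M down to the nearest 10 minutes
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 10
a1.sinks.k1.hdfs.roundUnit = minute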
3. Based on the provided template file hdfs-example.conf, use the Flume NG tool to set the system path /opt/xiandian/ on the master node as the directory whose files are uploaded to the HDFS file system in real time, set the HDFS storage path to /data/flume/, keep the uploaded file names unchanged, and set the file type to DataStream; then start the flume-ng agent (a usage sketch follows the template below).
[root@master ~]# flume-ng agent -c . -f /opt/hdfs-example.conf -n master
-Dflume.root.logger=INFO,console
[root@master ~]# hadoop fs -ls -R /data/flume
drwxr-xr-x - root root 0 2019-05-23 13:33 /data/flume/opt
drwxr-xr-x - root root 0 2019-05-23 13:33 /data/flume/opt/xiandian
-rw-r--r--   3 root root        665 2019-05-23 13:33 /data/flume/opt/xiandian/log-example.conf.1558618396952
[root@master ~]# cat /opt/hdfs-example.conf
# example.conf: A single-node Flume configuration
# Name the components on this agent
master.sources = webmagic
master.sinks = k1
master.channels = c1
# Describe/configure the source
master.sources.webmagic.type = spooldir
master.sources.webmagic.fileHeader = true
master.sources.webmagic.fileHeaderKey = fileName
master.sources.webmagic.fileSuffix = .COMPLETED
master.sources.webmagic.deletePolicy = never
master.sources.webmagic.spoolDir = /opt/xiandian/
master.sources.webmagic.ignorePattern = ^$
master.sources.webmagic.consumeOrder = oldest
master.sources.webmagic.deserializer = org.apache.flume.sink.solr.morphline.BlobDeserializer$Builder
master.sources.webmagic.batchSize = 5
master.sources.webmagic.channels = c1
# Use a channel which buffers events in memory
master.channels.c1.type = memory
# Describe the sink
master.sinks.k1.type = hdfs
master.sinks.k1.channel = c1
master.sinks.k1.hdfs.path = hdfs://master:8020/data/flume/%{dicName}
master.sinks.k1.hdfs.filePrefix = %{fileName}
master.sinks.k1.hdfs.fileType = DataStream
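To exercise the spooling-directory agent, drop a file into /opt/xiandian/ and check that it reaches HDFS. A minimal sketch, assuming the agent started above is still running; the test file is arbitrary, here simply the other template from this document:
[root@master ~]# cp /opt/log-example.conf /opt/xiandian/
# once Flume has consumed the file, the local copy is renamed with the .COMPLETED suffix
[root@master ~]# ls /opt/xiandian/
# the uploaded copy keeps its original name, plus the timestamp suffix added by the HDFS sink
[root@master ~]# hadoop fs -ls -R /data/flume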