ELK is arguably one of the best stacks available today for aggregating, analyzing, and searching the logs of a distributed server cluster. Spring Boot, as a framework built for microservices, cannot avoid the problem of distributed logging either, so shipping its logs through an ELK stack makes a lot of sense. In this stack, E is Elasticsearch, which stores the logs; L is Logstash, which collects the logs and writes them into Elasticsearch; and K is Kibana, which visualizes, analyzes, and searches the log data held in Elasticsearch. Integrating Spring Boot with ELK therefore largely amounts to integrating Spring Boot with Logstash. So how do we wire the two together?
Spring Boot uses logback for logging by default. Like other logging frameworks such as log4j, logback can ship log data to a remote destination over a socket, given an IP address and a port. In Spring Boot, logback is normally configured through a logback-spring.xml file.
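As a quick illustration of that socket support, here is a minimal sketch of logback's stock SocketAppender (the host and port are the ones used later in this article; substitute your own). Note that this appender ships serialized Java logging events rather than JSON, so it is not what we will use with Logstash; that gap is exactly what the logstash-logback-encoder dependency below fills:

    <!-- Sketch only: logback's built-in SocketAppender sends serialized
         Java objects, not JSON, so Logstash cannot parse its output. -->
    <appender name="SOCKET" class="ch.qos.logback.classic.net.SocketAppender">
        <remoteHost>192.168.1.111</remoteHost>
        <port>8081</port>
        <reconnectionDelay>10000</reconnectionDelay>
    </appender>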
To integrate logback with Logstash, you must add the logstash-logback-encoder dependency:
    <dependency>
        <groupId>net.logstash.logback</groupId>
        <artifactId>logstash-logback-encoder</artifactId>
        <version>5.3</version>
    </dependency>
With the dependency in place, we can configure logback. logback-spring.xml looks like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
        <include resource="org/springframework/boot/logging/logback/defaults.xml"/>

        <!-- Console appender: keeps Spring Boot's original console logging -->
        <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
            <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
                <level>INFO</level>
            </filter>
            <encoder>
                <pattern>${CONSOLE_LOG_PATTERN}</pattern>
                <charset>utf8</charset>
            </encoder>
        </appender>

        <!-- Logstash appender: ships JSON-encoded log events over TCP -->
        <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
            <destination>192.168.1.111:8081</destination>
            <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
        </appender>

        <root level="INFO">
            <appender-ref ref="LOGSTASH"/>
            <appender-ref ref="CONSOLE"/>
        </root>
    </configuration>
I won't dwell on basic logback configuration, which is covered at length elsewhere; here we only discuss the Logstash-related part, namely the LOGSTASH appender in the XML above. destination is the address the logs are sent to; configure Logstash to listen on this same address and it will collect them. I deployed Logstash on a local virtual machine (at 192.168.1.111), listening on port 8081; you will of course need to change the address to your own. The encoder element is mandatory. The CONSOLE appender is kept so that Spring Boot's original console logging is not overridden. That completes the logback configuration.
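As an optional refinement, the destination need not be hard-coded. logback-spring.xml supports the <springProperty> element, which reads a value from the Spring environment. A sketch, assuming a property named logstash.destination in application.properties (the property name and default value below are my own choices, not part of the original setup):

    <springProperty scope="context" name="logstashDestination"
                    source="logstash.destination" defaultValue="192.168.1.111:8081"/>

    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>${logstashDestination}</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>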
On the Logstash side, create a folder named conf under the installation directory, and in it create a file logstash.conf with the following content:
# Logstash configuration
# TCP -> Logstash -> Elasticsearch pipeline.
input {
  tcp {
    mode => "server"
    host => "192.168.1.111"   # prefer an explicit IP
    port => 8081              # take logs from port 8081
    codec => json_lines       # requires the logstash-codec-json_lines plugin
  }
}
output {
  elasticsearch {
    hosts => ["http://192.168.1.111:9200"]   # write events to Elasticsearch
    index => "logstash-%{+YYYY.MM.dd}"
  }
  # delete this stdout block if you don't need console output
  stdout {
    codec => rubydebug
  }
}
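For reference, the json_lines codec expects one JSON object per line, which is exactly what LogstashEncoder writes to the TCP connection. A single incoming event looks roughly like this (values are hypothetical; the field set matches the rubydebug output shown at the end of this article):

    {"@timestamp":"2019-03-06T07:27:26.348Z","@version":"1","message":"system parameters loaded","logger_name":"com.amt.hibei.sysframework.config.SysConfig","thread_name":"main","level":"INFO","level_value":20000}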
If your Logstash does not have the logstash-codec-json_lines plugin installed, install it with the following commands:
[root@ecs-55e5 ~]# cd /usr/share/logstash/
[root@ecs-55e5 logstash]# ls
bin CONTRIBUTORS data Gemfile Gemfile.lock lib LICENSE.txt logstash-core logstash-core-plugin-api modules NOTICE.TXT tools vendor x-pack
[root@ecs-55e5 logstash]# cd bin
[root@ecs-55e5 bin]# ./logstash-plugin install logstash-codec-json_lines
Validating logstash-codec-json_lines
Installing logstash-codec-json_lines
Installation successful
[root@ecs-55e5 bin]#
Start Logstash so that it exposes port 8081 and accepts logs (point -f at the logstash.conf created above, adjusting the path if needed):
[root@ecs-55e5 logstash]# logstash -f logstash.conf
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-03-06 14:38:50.990 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-03-06 14:38:51.007 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.5.4"}
[INFO ] 2019-03-06 14:38:54.639 [Converge PipelineAction::Create] pipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[INFO ] 2019-03-06 14:38:55.095 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.1.111:9200/]}}
[WARN ] 2019-03-06 14:38:55.284 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://192.168.1.111:9200/"}
[INFO ] 2019-03-06 14:38:55.549 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>6}
[WARN ] 2019-03-06 14:38:55.553 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[INFO ] 2019-03-06 14:38:55.577 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://192.168.1.111:9200"]}
[INFO ] 2019-03-06 14:38:55.591 [Ruby-0-Thread-5: :1] elasticsearch - Using mapping template from {:path=>nil}
[INFO ] 2019-03-06 14:38:55.608 [Ruby-0-Thread-5: :1] elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[INFO ] 2019-03-06 14:38:55.644 [[main]>worker7] tcp - Starting tcp input listener {:address=>"localhost:8081", :ssl_enable=>"false"}
[INFO ] 2019-03-06 14:38:55.917 [Converge PipelineAction::Create] pipeline - Pipeline started successfully {:pipeline_id=>"main", :thread=>"#"}
[INFO ] 2019-03-06 14:38:55.952 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2019-03-06 14:38:56.152 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
Done! The end result looks like this:
{
"logger_name" => "com.amt.hibei.sysframework.config.SysConfig",
"thread_name" => "main",
"@timestamp" => 2019-03-06T07:27:26.348Z,
"level_value" => 20000,
"host" => "182.148.112.187",
"port" => 58138,
"level" => "INFO",
"@version" => "1",
"message" => "============系统参数加载完成!=============="
}
{
"logger_name" => "com.amt.hibei.client.HibeiGameClientHiApplication",
"thread_name" => "main",
"@timestamp" => 2019-03-06T07:27:26.259Z,
"level_value" => 20000,
"host" => "182.148.112.187",
"port" => 58138,
"level" => "INFO",
"@version" => "1",
"message" => "Starting HibeiGameClientHiApplication on Amt-PC with PID 4256 (E:\\IdeWorkspace\\hibeigame\\hibeigame-client-HI\\target\\classes started by amt in E:\\IdeWorkspace\\hibeigame)"
}
As an appendix, here is an alternative configuration, log-bak.xml, which uses a composite JSON encoder to emit custom structured fields:
    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
        <include resource="org/springframework/boot/logging/logback/defaults.xml"/>

        <!-- springAppName backs the ${springAppName:-} reference in the pattern below -->
        <springProperty scope="context" name="springAppName" source="spring.application.name"/>

        <!-- Console appender: keeps Spring Boot's original console logging -->
        <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
            <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
                <level>INFO</level>
            </filter>
            <encoder>
                <pattern>${CONSOLE_LOG_PATTERN}</pattern>
                <charset>utf8</charset>
            </encoder>
        </appender>

        <!-- In the pattern below: stack_trace carries the exception info, line is
             the log line number, and serviceName is the project name, taken from
             the configuration file. -->
        <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
            <destination>192.168.11.86:9250</destination>
            <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
                <providers>
                    <timestamp>
                        <timeZone>UTC</timeZone>
                    </timestamp>
                    <pattern>
                        <pattern>
                        {
                            "severity": "%level",
                            "service": "${springAppName:-}",
                            "trace": "%X{X-B3-TraceId:-}",
                            "span": "%X{X-B3-SpanId:-}",
                            "exportable": "%X{X-Span-Export:-}",
                            "pid": "${PID:-}",
                            "thread": "%thread",
                            "class": "%logger{40}",
                            "message": "%message",
                            "timeDate": "%d{yyyy-MM-dd HH:mm:ss.SSS}",
                            "stack_trace": "%exception{5}",
                            "line": "%line",
                            "logLevel": "%level",
                            "serviceName": "${spring.application.name}"
                        }
                        </pattern>
                    </pattern>
                </providers>
            </encoder>
        </appender>

        <root level="INFO">
            <appender-ref ref="LOGSTASH"/>
            <appender-ref ref="CONSOLE"/>
        </root>
    </configuration>
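With this encoder, every event shipped to Logstash is a JSON object keyed by the pattern above; a single event would look roughly like this (all values are hypothetical):

    {"@timestamp":"2019-03-06T07:27:26.348Z","severity":"INFO","service":"hibei-client","trace":"","span":"","exportable":"","pid":"4256","thread":"main","class":"c.a.h.s.config.SysConfig","message":"system parameters loaded","timeDate":"2019-03-06 15:27:26.348","stack_trace":"","line":"42","logLevel":"INFO","serviceName":"hibei-client"}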