java+logstash

Preface

In the previous post we covered how to set up a logstash + kafka environment on Linux (logstash+kafka). In this one we look at how to wire Java application logs into Logstash.

Logging frameworks

Two logging libraries are most commonly used in Java development:

  • log4j : an Apache open-source project
  • logback : essentially an improved log4j, written by the same author

Below we show how to hook each of them up to Logstash to collect logs.

log4j

log4j is the simpler of the two; it can be configured either with a properties file or with an XML file.

  • properties format
log4j.rootLogger=DEBUG, logstash
## SocketAppender that ships events to Logstash
log4j.appender.logstash=org.apache.log4j.net.SocketAppender
## port the Logstash log4j input listens on
log4j.appender.logstash.Port=4560
## IP/hostname of the machine running Logstash
log4j.appender.logstash.RemoteHost=logstash_hostname
log4j.appender.logstash.ReconnectionDelay=60000
log4j.appender.logstash.LocationInfo=true
## minimum log level to ship
log4j.appender.logstash.Threshold=DEBUG
  • xml format

Add a new appender:

"LOGSTASH" class="org.apache.log4j.net.SocketAppender">
    ##logstash服务器IP
    <param name="RemoteHost" value="logstash_hostname" />
    <param name="ReconnectionDelay" value="60000" />
    <param name="LocationInfo" value="true" />
    ##日志准入等级
    <param name="Threshold" value="DEBUG" />

Then attach it to the logger you want to monitor:

"com.springapp.mvc" additivity="false">
        class="org.apache.log4j.Level" value="debug"/>
        ref ref="LOGSTASH"/>
        ref ref="LOGFILE"/>

On the Logstash side, create a file named log4j-es.conf in the conf directory with the following configuration:

input {
    log4j {
        host => "168.10.10.69"
        # must match log4j.appender.logstash.Port configured above (4560)
        port => 4560
    }
}


output {
    stdout {
      codec => rubydebug
    }
    elasticsearch{
        hosts => ["168.10.10.69:9200"]
        index => "log4j-%{+YYYY.MM.dd}"
        document_type => "log4j_type"
    }
}
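With this file in place, start Logstash against it (for example: bin/logstash -f log4j-es.conf, run from the Logstash installation directory, adjusting the path to wherever you keep the file) and it will listen for SocketAppender connections on the configured port.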

On the Logstash side the log4j events come out as JSON documents like the following:

{
        "message" => "Sorry, something wrong!",
       "@version" => "1",
     "@timestamp" => "2015-07-02T13:24:45.727Z",
           "type" => "log4j-json",
           "host" => "127.0.0.1:52420",
           "path" => "HelloExample",
       "priority" => "ERROR",
    "logger_name" => "HelloExample",
         "thread" => "main",
          "class" => "HelloExample",
           "file" => "HelloExample.java:9",
         "method" => "main",
    "stack_trace" => "java.lang.ArithmeticException: / by zero\n\tat HelloExample.divide(HelloExample.java:13)\n\tat HelloExample.main(HelloExample.java:7)"
}
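For completeness, here is a minimal sketch of the kind of class that would emit an event like the one above. The names (HelloExample, divide, the "Sorry, something wrong!" message) simply mirror the sample output and are illustrative only:

import org.apache.log4j.Logger;

public class HelloExample {
    // Plain log4j 1.x logger; it picks up the configuration shown earlier,
    // so everything logged here is also shipped to Logstash by the SocketAppender.
    private static final Logger logger = Logger.getLogger(HelloExample.class);

    public static void main(String[] args) {
        try {
            divide(1, 0);
        } catch (ArithmeticException e) {
            // Becomes the "message" and "stack_trace" fields of the Logstash event.
            logger.error("Sorry, something wrong!", e);
        }
    }

    private static int divide(int a, int b) {
        return a / b;   // throws ArithmeticException when b == 0
    }
}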

All it takes is adding the matching appender to your log configuration. Easy enough.

logback

logback takes a bit more work: we use the open-source project logstash-logback-encoder (on GitHub) to produce the log output. First, add the Maven dependency:

<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>4.9</version>
</dependency>

Also make sure the following dependencies are on the classpath:

  • jackson-databind / jackson-core / jackson-annotations
  • logback-core
  • logback-classic (required for logging LoggingEvents)
  • logback-access (required for logging AccessEvents)
  • slf4j-api

Then add an appender in logback.xml:

<appender name="logStash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>172.10.10.69:8801destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
    <keepAliveDuration>15 minuteskeepAliveDuration>
appender>

And reference it from the logger you want to monitor:

<logger name="com.maidou.controller" level="ERROR">
    <appender-ref ref="logStash"/>
</logger>

logstash-logback-encoder also supports several protocols, multiple encoders, multi-destination setups and more; see its README if you need those features (one small example follows).
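As a taste of one such feature, structured arguments let you attach extra JSON fields to an event, which LogstashEncoder ships along with the message. A minimal sketch, with a made-up class and made-up field names, assuming the StructuredArguments API provided by the 4.x encoder:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import static net.logstash.logback.argument.StructuredArguments.kv;

public class OrderService {
    private static final Logger logger = LoggerFactory.getLogger(OrderService.class);

    public void placeOrder(String orderId, long amountCents) {
        // "orderId" and "amountCents" become top-level JSON fields of the event,
        // next to the usual message/level/logger_name fields.
        logger.info("order placed", kv("orderId", orderId), kv("amountCents", amountCents));
    }
}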

The Logstash configuration, logback-es.conf, can take input in several ways; here we use the tcp input:

input {
  tcp {
    host => "168.10.10.69" 
    port => 8801
    mode => "server"
    tags => ["tags"]
    codec => json_lines
  }
}


output {
    stdout {
      codec => rubydebug
    }
    elasticsearch{
        hosts => ["168.10.10.69:9200"]
        index => "logback-%{+YYYY.MM.dd}"
        document_type => "logback_type"
    }
}

With the configuration above, incoming log events look like this:

{
     "@timestamp" => "2016-09-13T07:05:06.480Z",
       "@version" => 1,
        "message" => "HelloController exception",
    "logger_name" => "com.springapp.mvc.HelloController",
    "thread_name" => "http-bio-8080-exec-1",
          "level" => "ERROR",
    "level_value" => 40000,
    "stack_trace" => "java.lang.ArithmeticException: / by zero",
       "HOSTNAME" => "gejunqingdeMacBook-Pro.local",
           "host" => "192.168.1.192",
           "port" => 57635,
           "tags" => [
        [0] "tags"
    ]
}
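For reference, a sketch of the kind of code that would produce the event above, logging through the SLF4J API that logback sits behind. The class name and message mirror the sample output, and it assumes the logStash appender is attached to this package's logger as well:

package com.springapp.mvc;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class HelloController {
    private static final Logger logger = LoggerFactory.getLogger(HelloController.class);

    public void hello() {
        try {
            int a = 1, b = 0;
            int ignored = a / b;   // triggers the ArithmeticException seen in stack_trace
        } catch (ArithmeticException e) {
            // Shipped to Logstash by the LogstashTcpSocketAppender configured above.
            logger.error("HelloController exception", e);
        }
    }
}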

The setup mirrors log4j; it is just slightly more involved because of the extra jar dependency.

Summary

With that, the log --> logstash --> kafka pipeline is in place; add an email output and our log-monitoring system with email alerts is complete.
