The main settings file is logstash.yml. For a pipeline you need to create your own conf file that defines the input, filter, and output sections. The file structure is shown below; the bundled logstash-sample.conf has the following content:
input {
}
filter {
}
output {
}
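The filter block above is left empty; it is where events get parsed and enriched before being output. A minimal sketch using the grok and date filter plugins (the pattern and field names are illustrative, assuming log lines that start with an ISO8601 timestamp and a log level):

```conf
filter {
  grok {
    # split the raw line into named fields
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
  date {
    # use the parsed timestamp as the event's @timestamp
    match => ["ts", "ISO8601"]
  }
}
```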
Startup command:
bin/logstash -f config/logstash.conf
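Before starting for real, the config file can be validated without running the pipeline; `--config.test_and_exit` is a standard Logstash flag:

```shell
# parse and validate the pipeline config, then exit
bin/logstash -f config/logstash.conf --config.test_and_exit
```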
Input plugin reference: https://www.elastic.co/guide/en/logstash/current/input-plugins.html
A minimal pipeline that reads from stdin and writes to both Elasticsearch and stdout:
input {
stdin { }
}
output {
elasticsearch {
hosts => ["localhost:9200"]
}
stdout { }
}
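With the stdin pipeline above, anything typed into the running process (or piped into it) becomes an event; for example (the conf filename here is hypothetical):

```shell
echo "hello logstash" | bin/logstash -f config/logstash-stdin.conf
```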
Reading a single log file and printing events to stdout with the rubydebug codec:
input {
# read log entries from a file
file {
path => "/xxx/demolog/logs/myapp-info.log"
type => "ghn"
start_position => "beginning"
}
}
output {
stdout { codec => rubydebug }
}
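Since the file input tails the watched file, a quick way to see events flow is to append lines while the pipeline is running. A minimal sketch (the path is illustrative; match it to the `path` in the input above):

```python
def append_log_line(path: str, line: str) -> None:
    """Append one newline-terminated line to the log file that the
    Logstash file input is watching; the new line becomes an event."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(line.rstrip("\n") + "\n")

if __name__ == "__main__":
    # illustrative path -- use the same path as the file input's `path`
    append_log_line("/tmp/myapp-info.log", "2024-01-01T00:00:00 INFO hello from python")
```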
Reading multiple files via a glob pattern and shipping them to a secured Elasticsearch:
input {
# read log entries from a file
file {
path => "/xxx/demolog/log/demolog-*.log"
type => "ghn"
start_position => "beginning"
}
}
output {
# write to Elasticsearch
elasticsearch {
hosts => ["localhost:9200"]
user => "elastic"
password => "xxxxxx"
ssl => "true"
cacert => "/xxx/elk/logstash-8.9.1/config/certs/http_ca.crt"
index => "ghn-%{+YYYY.MM.dd}"
}
stdout { }
}
Using Logstash with Spring Boot / Spring Cloud
Official GitHub: https://github.com/logfellow/logstash-logback-encoder
Add the dependency to pom.xml:
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.4</version>
</dependency>
Add an appender to logback-spring.xml that ships logs to Logstash over TCP:
<appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>127.0.0.1:4560</destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
</appender>
<root level="INFO">
    <appender-ref ref="LOGSTASH" />
</root>
The matching Logstash pipeline listens on the same port:
input {
# receive JSON lines over TCP
tcp {
host => "0.0.0.0"
mode => "server"
port => 4560
codec => json_lines
}
}
output {
# write to Elasticsearch
elasticsearch {
hosts => ["localhost:9200"]
user => "elastic"
password => "xxxxxx"
ssl => "true"
cacert => "xxx/logstash-8.9.1/config/certs/http_ca.crt"
index => "ghn-%{+YYYY.MM.dd}"
}
stdout { }
# stdout { codec => rubydebug }
}
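The tcp input with codec => json_lines accepts any client that writes newline-delimited JSON, not just logback. A minimal Python sketch (host and port match the config above; the event fields are illustrative):

```python
import json
import socket

def json_line(event: dict) -> bytes:
    """Serialize an event the way the json_lines codec expects:
    one JSON object per line, newline-terminated."""
    return (json.dumps(event) + "\n").encode("utf-8")

def send_event(event: dict, host: str = "127.0.0.1", port: int = 4560) -> None:
    """Open a TCP connection to the Logstash tcp input and write one event."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(json_line(event))

if __name__ == "__main__":
    try:
        send_event({"message": "hello from python", "level": "INFO"})
    except OSError as exc:
        # no Logstash listening locally -- nothing to send to
        print(f"could not reach Logstash: {exc}")
```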
Filebeat
Edit filebeat-8.9.1-darwin-aarch64/filebeat.yml as follows, pointing the log path at the Spring Boot log directory:
filebeat.inputs:
- type: filestream
enabled: true
paths:
- /xxx/xxx/*.log
./filebeat -e -c filebeat.yml -d "publish"
Once Filebeat connects to Logstash, startup has succeeded.
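For events to reach Logstash rather than going straight to Elasticsearch, filebeat.yml also needs its output section pointed at the Beats port (5044 here, matching the beats input used on the Logstash side); a sketch:

```yaml
output.logstash:
  hosts: ["localhost:5044"]
```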
On the Logstash side, use the beats input:
input {
beats {
port => 5044
}
}
output {
elasticsearch {
hosts => ["localhost:9200"]
user => "elastic"
password => "xxxxxx"
ssl => "true"
cacert => "/xxxx/logstash-8.9.1/config/certs/http_ca.crt"
index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
}
}
Start Logstash with the Filebeat pipeline:
bin/logstash -f config/logstash-filebeat.conf
Kafka
Official docs: https://www.elastic.co/guide/en/logstash/current/use-filebeat-modules-kafka.html
Configure Filebeat to publish events to a Kafka topic:
output.kafka:
hosts: ["localhost:9092"]
topic: "filebeat"
codec.json:
pretty: false
Restart Filebeat:
./filebeat -e -c filebeat.yml -d "publish"
Logstash then consumes from the same topic with the kafka input:
input {
kafka {
bootstrap_servers => ["localhost:9092"]
topics => ["filebeat"]
codec => json
}
}
output {
# write to Elasticsearch
elasticsearch {
hosts => ["localhost:9200"]
user => "elastic"
password => "xxxxxx"
ssl => "true"
cacert => "/xxx/elk/logstash-8.9.1/config/certs/http_ca.crt"
index => "ghn-%{+YYYY.MM.dd}"
}
stdout { }
# stdout { codec => rubydebug }
}
Kafka input plugin reference: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html