Intro: startup error "The following config files contains non-ascii characters but are not UTF-8 encoded"

I had just downloaded Logstash, extracted it, and written the config J:\elasticsearch\logstash-6.1.0\first-pipeline.conf:
input {
    beats {
        port => "5044"
        codec => plain {
            charset => "UTF-8"
        }
    }
}
filter {
    grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    geoip {
        source => "clientip"
    }
}
output {
    elasticsearch {
        hosts => "localhost:9200"      # es服务器 (the ES server)
        index => "aa-%{+YYYY.MM.dd}"   # es服务器索引格式 (index name pattern on the ES server)
        document_type => "wjb_log"
    }
}
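The Chinese comments above are the non-ASCII characters that trigger the error below. Before digging through editors, it can help to locate exactly which bytes are non-ASCII; here is a small sketch (not part of the original post, assuming Python is available) that scans a file's raw bytes:

```python
import sys

def find_non_ascii(path):
    """Report (line, column, byte) for every non-ASCII byte in a file.

    Reads raw bytes so that a wrong text encoding cannot mask anything.
    """
    hits = []
    with open(path, "rb") as f:
        for lineno, line in enumerate(f, start=1):
            for col, byte in enumerate(line, start=1):
                if byte > 0x7F:  # outside the 7-bit ASCII range
                    hits.append((lineno, col, byte))
    return hits

if __name__ == "__main__" and len(sys.argv) > 1:
    for lineno, col, byte in find_non_ascii(sys.argv[1]):
        print(f"line {lineno}, col {col}: byte 0x{byte:02X}")
```

Running it against first-pipeline.conf would point straight at the two comment lines in the output block.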
J:\elasticsearch\logstash-6.1.0\bin>logstash.bat -f ../first-pipeline.conf --config.reload.automatic
Sending Logstash's logs to J:/elasticsearch/logstash-6.1.0/logs which is now configured via log4j2.properties
This produced: 'findstr' is not recognized as an internal or external command, operable program or batch file.
[ERROR] 2018-03-07 23:01:03.355 [Ruby-0-Thread-1: J:/elasticsearch/logstash-6.1.0/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:22] sourceloader - Could not fetch
all the sources {:exception=>LogStash::ConfigLoadingError, :message=>"The following config files contains non-ascii characters
but are not UTF-8 encoded [\"J:/elasticsearch/logstash-6.1.0/first-pipeline.conf\"]",
:backtrace=>["J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/config/source/local.rb:85:in `read'",
"J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/config/source/local.rb:96:in `read'",
"J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/config/source/local.rb:192:in `local_pipeline_configs'",
"J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/config/source/local.rb:163:in `pipeline_configs'",
"J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/config/source_loader.rb:59:in `block in fetch'",
"org/jruby/RubyArray.java:2481:in `collect'",
"J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/config/source_loader.rb:58:in `fetch'",
"J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/agent.rb:148:in `converge_state_and_update'",
"J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/agent.rb:90:in `execute'",
"J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/runner.rb:343:in `block in execute'",
"J:/elasticsearch/logstash-6.1.0/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2018-03-07T23:01:03,737][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Fix for the first error, 'findstr' is not recognized as an internal or external command:
Reference: http://blog.csdn.net/qq_21383435/article/details/79490863

The second error, "The following config files contains non-ascii characters but are not UTF-8 encoded", means the config file is not saved as UTF-8.
Re-ran: J:\elasticsearch\logstash-6.1.0\bin>logstash.bat -f ../first-pipeline.conf --config.reload.automatic

Still the same big error dump as above, though the 'findstr' error was gone.
Following suggestions from other users, I opened first-pipeline.conf in Notepad++ and converted it to "UTF-8 without BOM" encoding. The result was still the same error (call this point marker A). Re-running changed nothing, so I then removed the Chinese comments from the config. Finally I opened the file in Windows Notepad and tried Save As, which revealed the file was actually ANSI, not UTF-8. I saved it as UTF-8 from Notepad.
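Rather than relying on Notepad's Save As dialog, the same diagnosis can be scripted: read the raw bytes, check for a BOM, and check whether they decode as UTF-8. A minimal sketch (again assuming Python; not from the original post):

```python
import codecs

def diagnose(path):
    """Return a short description of a file's UTF-8 status."""
    with open(path, "rb") as f:
        raw = f.read()
    if raw.startswith(codecs.BOM_UTF8):
        return "UTF-8 with BOM"  # what Notepad's "UTF-8" save produces
    try:
        raw.decode("utf-8")
    except UnicodeDecodeError:
        return "not valid UTF-8 (likely ANSI/GBK if it holds Chinese text)"
    return "UTF-8 without BOM"
```

On the file at this point in the story, it would have reported "not valid UTF-8", matching what Notepad showed.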
Re-running then produced a different error:
[2018-03-08T22:52:26,486][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 1, column 1 (byte 1) after ", :backtrace=>["J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "J:/elasticsearch/logstash-6.1.0/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'",
.....
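That "line 1, column 1 (byte 1)" complaint is the signature of a BOM: Windows Notepad's UTF-8 save prepends three bytes (EF BB BF), so the first byte the parser sees is no longer the `i` of `input`. A tiny sketch illustrating this (assuming Python):

```python
import codecs

# Simulate what Notepad's "UTF-8" save actually writes:
# a 3-byte byte-order mark, then the config text.
content = codecs.BOM_UTF8 + b"input { }\n"

# The parser's "byte 1" is 0xEF, an unexpected character,
# which is why it fails before ever reaching the `input` keyword.
print(content[:6])
```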
This says the problem is at line 1, column 1, yet inspecting the config showed nothing wrong there. Then I remembered that Notepad silently prepends a BOM when saving as UTF-8. After some hesitation I opened the file in Notepad++ once more and saved it as "UTF-8 without BOM", and this time it ran correctly. Evidently the conversion back at marker A had never actually been saved, which is why Notepad's Save As dialog had still shown ANSI.

So the fix: remove the Chinese comments and save the file as UTF-8 without BOM. It finally works:
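For the record, the "UTF-8 without BOM" step can also be done without an editor. A minimal sketch (not from the original post) that strips a leading UTF-8 BOM in place:

```python
import codecs

def strip_bom(path):
    """Rewrite a file without its UTF-8 BOM; return True if one was removed."""
    with open(path, "rb") as f:
        raw = f.read()
    if raw.startswith(codecs.BOM_UTF8):
        with open(path, "wb") as f:
            f.write(raw[len(codecs.BOM_UTF8):])
        return True
    return False
```

Calling `strip_bom("first-pipeline.conf")` once after any save from Notepad would have avoided the "line 1, column 1 (byte 1)" failure entirely.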
J:\elasticsearch\logstash-6.1.0\bin>logstash.bat -f ../first-pipeline.conf --config.reload.automatic
Sending Logstash's logs to J:/elasticsearch/logstash-6.1.0/logs which is now configured via log4j2.properties
[2018-03-08T23:01:54,940][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"J:/elasticsearch/logstash-6.1.0/modules/fb_apache/configuration"}
[2018-03-08T23:01:55,023][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"J:/elasticsearch/logstash-6.1.0/modules/netflow/configuration"}
[2018-03-08T23:01:56,676][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-03-08T23:02:00,000][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.1.0"}
[2018-03-08T23:02:02,255][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-08T23:02:18,412][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch hosts=>"localhost:9200", index=>"aa-%{+YYYY.MM.dd}", document_type=>"wjb_log", id=>"db08a7de921f4d4453f34c833e433c857c2cb0c4f874ba08658508f9be7bd69c">}
[2018-03-08T23:02:21,612][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-03-08T23:02:21,649][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-03-08T23:02:22,986][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-03-08T23:02:23,298][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2018-03-08T23:02:23,316][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-03-08T23:02:23,395][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-03-08T23:02:23,484][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-03-08T23:02:23,704][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2018-03-08T23:02:24,114][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-03-08T23:02:25,479][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"J:/elasticsearch/logstash-6.1.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-geoip-5.0.2-java/vendor/GeoLite2-City.mmdb"}
[2018-03-08T23:02:25,698][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x7506e98f run>"}
[2018-03-08T23:02:27,515][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2018-03-08T23:02:27,749][INFO ][logstash.pipeline ] Pipeline started {"pipeline.id"=>"main"}
[2018-03-08T23:02:28,237][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2018-03-08T23:02:28,529][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}