Logstash output configuration:
Output to file
Configure the conf file:
input {
  file {
    path => "/usr/local/logstash-5.6.1/bin/spark-test-log.log"
    type => "sparkfile"
    start_position => "beginning"
  }
}

filter {
  grok {
    patterns_dir => ["/usr/local/logstash-5.6.1/patterns/selfpattern"]
    match => ["message", "%{DATE:date} %{SKYTIME:time} %{LOGLEVEL:loglevel} %{WORD:word}"]
  }
}

output {
  file {
    path => "/tmp/%{+YYYY.MM.dd}-%{host}-file.txt"
  }
}
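SKYTIME is not a built-in grok pattern, so it has to be defined in the selfpattern file referenced by patterns_dir. A minimal sketch of such a file, assuming the log's time field looks like 10:15:30,123 (the actual regex depends on the real log format):

# /usr/local/logstash-5.6.1/patterns/selfpattern -- hypothetical contents
# Each line is "NAME regex"; custom patterns can reuse built-in ones.
SKYTIME %{HOUR}:%{MINUTE}:%{SECOND}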
Run it; the output file is generated:
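The pipeline can be started with bin/logstash -f, for example (file-output.conf is just an illustrative name for the config above):

cd /usr/local/logstash-5.6.1
bin/logstash -f file-output.conf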
The data is recorded in the file:
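By default the file output writes each event as one JSON line (json_lines codec), so the grok-extracted fields (date, time, loglevel, word) appear alongside message, host, and @timestamp. The contents can be checked with something like:

cat /tmp/*-file.txt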
Output to elasticsearch
Configure the conf file:
input {
  file {
    path => "/usr/local/logstash-5.6.1/bin/spark-test-log.log"
    type => "sparkfile"
    start_position => "beginning"
  }
}

filter {
  grok {
    patterns_dir => ["/usr/local/logstash-5.6.1/patterns/selfpattern"]
    match => ["message", "%{DATE:date} %{SKYTIME:time} %{LOGLEVEL:loglevel} %{WORD:word}"]
  }
}

output {
  elasticsearch {
    hosts => ["http://192.168.1.151:9200"]
    index => "logstash_output-%{type}-%{+YYYY.MM.dd}"
    document_type => "sparkfileType"
  }
}
In the es head plugin you can see that the index has been created successfully:
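The same check can also be done against the Elasticsearch REST API directly (host and port taken from the config above; the index name pattern follows from %{type} resolving to sparkfile):

curl 'http://192.168.1.151:9200/_cat/indices?v'
curl 'http://192.168.1.151:9200/logstash_output-sparkfile-*/_search?pretty&size=1'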