So, I have a web platform that prints a JSON file per request, containing some log data about that request. I can configure several rules about when it should log, only at certain levels, etc...
Now, I've been toying with the Logstash + Elasticsearch + Kibana3 stack, and I'd love to find a way to see those logs in Kibana. My question is: is there a way to make Logstash import this kind of file, or would I have to write a custom input plugin for it? I've searched around, and from what I've seen, plugins are written in Ruby, a language I have no experience with.
3 Answers
#1
11
Logstash is just a tool for converting various kinds of syslog files into JSON and loading them into elasticsearch (or graphite, or... ).
Since your files are already in JSON, you don't need logstash. You can upload them directly into elasticsearch using curl.
See Import/Index a JSON file into Elasticsearch
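For example, assuming Elasticsearch is running locally, a single request log could be pushed straight in with something like this (the index name "weblogs", the type "request", and the filename are placeholders, not anything from the question):

curl -XPOST 'http://localhost:9200/weblogs/request' \
     -H 'Content-Type: application/json' \
     -d @request-log.json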
However, in order to work well with Kibana, your JSON files need to, at a minimum (a sketch of such an event follows this list):

- Be flat - Kibana does not grok nested JSON structs. You need a simple hash of key/value pairs.
- Have an identifiable timestamp.
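As a rough sketch, a Kibana-friendly event could look like the following; @timestamp is the conventional Logstash timestamp field, and the other field names are purely illustrative:

{"@timestamp": "2013-11-21T09:30:00.000Z", "level": "INFO", "path": "/api/users", "status": 200, "duration_ms": 42}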
What I would suggest is looking at the JSON that logstash outputs and seeing if you can massage your JSON files to match that structure. You can do this in any language you like that supports JSON. The program jq is very handy for filtering JSON from one format to another (a small example follows the links below).
Logstash format - https://gist.github.com/jordansissel/2996677
jq - http://stedolan.github.io/jq/
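As an illustration, assuming a nested per-request log shaped like {"time": "...", "request": {"path": "...", "status": ...}} (an invented shape, just for the example), jq can flatten it into the structure sketched above:

jq -c '{"@timestamp": .time, path: .request.path, status: .request.status}' request-log.json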
#2
15
Logstash is a very good tool for processing dynamic files.
Here is a way to import your JSON file into Elasticsearch using Logstash:
configuration file:
input
{
    file
    {
        path => ["/path/to/json/file"]
        start_position => "beginning"
        sincedb_path => "/dev/null"    # don't persist the read position, so the file is re-read on every run
        exclude => "*.gz"
    }
}
filter
{
    mutate
    {
        replace => [ "message", "%{message}" ]
        gsub => [ 'message', '\n', '' ]    # strip embedded newlines
    }
    if [message] =~ /^{.*}$/
    {
        json { source => message }    # parse the line into top-level event fields
    }
}
output
{
    elasticsearch {
        # protocol, host and embedded are options of the Logstash 1.x elasticsearch output
        protocol => "http"
        codec => json
        host => "localhost"
        index => "json"
        embedded => true
    }
    stdout { codec => rubydebug }    # also print each event, handy for debugging
}
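Assuming you save the configuration above as, say, json-import.conf (any filename works), you would then run Logstash against it with:

bin/logstash -f json-import.conf

Since the file input tails the file, Logstash keeps running and picks up new lines as they are written.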
example of a JSON file:
{"foo":"bar", "bar": "foo"}
{"hello":"world", "goodnight": "moon"}
Note that each JSON document needs to be on a single line. If you want to parse a multiline JSON file, replace the relevant fields in your configuration file:
input
{
    file
    {
        codec => multiline
        {
            pattern => '^\{'    # a line beginning with { starts a new event...
            negate => true
            what => previous    # ...and every other line is appended to the previous one
        }
        path => ["/opt/mount/ELK/json/*.json"]
        start_position => "beginning"
        sincedb_path => "/dev/null"
        exclude => "*.gz"
    }
}
filter
{
    mutate
    {
        replace => [ "message", "%{message}" ]
        gsub => [ 'message', '\n', '' ]    # collapse the multiline document onto one line
    }
    if [message] =~ /^{.*}$/
    {
        json { source => message }
    }
}
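With that multiline codec, a pretty-printed file such as the following is reassembled into one event per object, because every line that does not start with { is appended to the previous one:

{
  "foo": "bar",
  "bar": "foo"
}
{
  "hello": "world",
  "goodnight": "moon"
}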
#3
1
Logstash can import different formats and sources, as it provides a lot of plugins. There are also other log collector and forwarder tools that can send logs to Logstash, such as nxlog, rsyslog, syslog-ng, flume, kafka, fluentd, etc. From what I've heard, most people use nxlog on Windows (though it works equally well on Linux) in combination with the ELK stack because of its low resource footprint. (Disclaimer: I'm affiliated with the project.)