This week Logstash 1.2.0 has been released with exciting new features. Logstash is, according to its website, "[...] a tool for managing events and logs. You can use it to collect logs, parse them, and store them for later use (like, for searching). [...]"
The full Changelog can be found on [Github][1]; here is a short list with some more details:
- New JSON schema: The annoying @ prefix has been removed from all fields except @timestamp and @version (the latter stores the JSON schema version for future schema changes). This makes events look much nicer in web frontends like Kibana. The "fields." prefix has also been removed; all fields now live in the global namespace. More information on the schema change can be found in the Jira bugtracker: [LOGSTASH-675][2]
- Kibana 3 included: As a replacement for the old integrated web interface, Kibana 3 is now part of the Logstash distribution and can be started with the "web" parameter.
- Conditionals (if, then, else, else if) in the configuration: Versions prior to 1.2.0 used the "type" and "tags" parameters of filter and output plugins to select which events to process with a particular filter or output. This is now obsolete; the new way is to use conditionals inside the configuration file (more on that later!).
- Field references: Fields can now be referenced with square brackets, e.g. [field1] or [field1][subfield2].
- Elasticsearch 0.90.3: The integrated Elasticsearch client and server have been upgraded to 0.90.3. This brings more features and better performance and makes Logstash compatible with the latest and greatest Elasticsearch version again.
- New plugin type: codec: This plugin type decodes incoming data (input plugins) and encodes outgoing data (output plugins).
- Plugin status is now plugin milestone: More on that can be found in the official documentation: "[Plugin Milestones][3]"
These are the most important changes. Many more changes were made to inputs, filters and outputs (see [Changelog][1]).
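To illustrate the new flat schema: an event that in 1.1.x carried @message, @source_host and custom fields under the "fields" namespace now looks roughly like this (field names and values are made up for illustration):

```json
{
  "@timestamp": "2013-09-05T10:15:00.000Z",
  "@version": "1",
  "message": "GET /index.html HTTP/1.1",
  "type": "apache-access",
  "host": "web01",
  "clientip": "192.168.1.10"
}
```

Only @timestamp and @version keep the @ prefix; everything else, including former "fields." entries like clientip, now lives in the global namespace.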
Upgrade hints
Logstash aims to be backwards compatible: you can simply replace the jar file with the new version, restart Logstash with the same configuration file used for the older version, and everything should be running again. If not, it's a bug that should be reported. Note: configuration options that were marked as deprecated in older versions are most likely no longer available in newer versions.
To make use of all the shiny new features, the configuration needs to be updated. Here are some hints:
Using conditionals
Filters and outputs used the "type" and "tags" parameters to select which events to process. This can now be done with conditionals. See the [official documentation][4] for more details. Here are some examples:
Example 1: Filter on type (old way)
    input {
      stdin { type => "stdinput" }
    }
    output {
      stdout {
        debug => true
        type => "stdinput"
      }
    }
This will output: "Deprecated config item "type" set in LogStash::Outputs::Stdout"
Example 1: Filter on type (new way)
    input {
      stdin { type => "stdinput" }
    }
    output {
      if [type] == "stdinput" {
        stdout {
          debug => true
        }
      }
    }
Example 2: Filter on tags (old way)
    input {
      stdin {
        type => "stdinput"
        tags => [ "tag1", "tag2" ]
      }
    }
    output {
      stdout {
        debug => true
        tags => "tag1"
      }
    }
This will output: "Deprecated config item "tags" set in LogStash::Outputs::Stdout"
Example 2: Filter on tags (new way)
    input {
      stdin {
        type => "stdinput"
        tags => [ "tag1", "tag2" ]
      }
    }
    output {
      if "tag1" in [tags] {
        stdout {
          debug => true
        }
      }
    }
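Conditionals also support "else" and "else if", which makes it easy to route events to different outputs in one place. A sketch of such a setup (the plugin settings and type names here are assumptions, purely for illustration):

```
output {
  if [type] == "apache" {
    # assumed host, for illustration only
    elasticsearch { host => "localhost" }
  } else if [type] == "syslog" {
    # assumed path, for illustration only
    file { path => "/var/log/logstash/syslog.log" }
  } else {
    stdout { debug => true }
  }
}
```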
As you can see, conditionals let you do the same things as before, but in a much more flexible way.
One thing that's missing is a "not in" condition: it's not possible to write something like "if "tag1" not in [tags]". The workaround I found is to use "else":
    if "_grokparsefailure" in [tags] { } else {
      # YOUR CODE
    }
Schema update: Field names and references
All fields (except @version and @timestamp) lost their @ prefix, so all filters that reference fields must be updated to reflect this change (simply remove the leading @). Also take care of the removal of the "fields." namespace and the new field reference syntax ('[foo][bar]' instead of 'foo.bar').
This also affects the Elasticsearch index (keyword mappings) and Kibana; they need some update love, too.
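As a hedged sketch of what such an update looks like (the field name is made up): a filter that before 1.2.0 worked on "fields.clientip" would now reference the field like this:

```
filter {
  # pre-1.2.0 the field lived under the "fields." namespace;
  # with 1.2.0 it sits in the global namespace and is
  # referenced with square brackets:
  if [clientip] == "127.0.0.1" {
    mutate { add_tag => [ "local" ] }
  }
}
```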
Reindexing the old data to reflect this change can be done with the Elasticsearch input using the codec "oldlogstashjson".
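A minimal sketch of such a reindexing pipeline (the host and index name are assumptions):

```
input {
  elasticsearch {
    host  => "localhost"
    index => "logstash-2013.08.01"   # assumed index holding old-schema events
    codec => "oldlogstashjson"       # converts old events to the new schema
  }
}
output {
  elasticsearch {
    host => "localhost"
  }
}
```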
Puppet module
If you are a Puppet user, I'm sure you want to use the official Puppet module, which can be found on [Github][5]. At this time it's not yet ready for the new features of 1.2.0, so for now I simply deliver the configuration files with plain file resources.
The Logstash Book
It has been updated to reflect version 1.2.0. I can really recommend this book: it covers all the important things one must know about Logstash and its ecosystem. See the official [website][6].
Update 1.2.1
Shortly after 1.2.0 a bugfix release 1.2.1 has been released ([Changelog][7]). It fixes some bugs and adds the "not in" condition.
So it's now possible to exclude logs with the tag "_grokparsefailure" as follows:
    if "_grokparsefailure" not in [tags] {
      # YOUR CODE
    }
[1]: https://github.com/logstash/logstash/blob/master/CHANGELOG
[2]: https://logstash.jira.com/browse/LOGSTASH-675
[3]: http://logstash.net/docs/1.2.0/plugin-milestones
[4]: http://logstash.net/docs/1.2.0/configuration#conditionals
[5]: https://github.com/logstash/puppet-logstash
[6]: http://www.logstashbook.com/
[7]: https://github.com/logstash/logstash/blob/master/CHANGELOG#L1-38