Logstash JSON filter to detect events

I wanted to filter JSON-encoded data coming from an OSSEC client to Logstash and then forward the parsed JSON to clients connected over a websocket. This filter can be useful for detecting certain events and sending alerts when they occur, so I wrote a filter that parses only the JSON-encoded data and discards everything else.

If you do not have Logstash installed, use the commands below to install it along with the websocket plugin:

echo 'deb http://packages.elasticsearch.org/logstash/2.1/debian stable main' | sudo tee /etc/apt/sources.list.d/logstash.list
sudo apt-get update
sudo apt-get install logstash

Once Logstash is installed, the next step is to install the websocket output plugin. Use the following commands to install it:

cd /opt/logstash/bin
./plugin install logstash-output-websocket

Logstash can be configured to accept input on any port or protocol; in the configuration below I use tcp and udp inputs on port 2500 to receive data from the OSSEC client. Once we have the input data we can filter it, and then configure the output to send everything to the websocket. To set up the filter, create a configuration file (logstash.conf) using the following commands and paste the content below into it:

cd /etc/logstash/conf.d
sudo gedit logstash.conf

input {
  tcp {
    type => "syslog"
    port => 2500
  }
  udp {
    type => "syslog"
    port => 2500
  }
}

filter {
  mutate { gsub => [ "message", "[\\]", "" ] }
  mutate { gsub => [ "message", ",\"message\":\"{", "," ] }
  mutate { gsub => [ "message", "}\"", "" ] }
  mutate { gsub => [ "message", "ossec: {", "\n{" ] }
  mutate { gsub => [ "message", "\[", "" ] }
  mutate { gsub => [ "message", "\]", "" ] }
  mutate { gsub => [ "message", "},{", "}\n{" ] }
  split { terminator => "\n" }
  json { source => "message" }
  mutate { remove_field => [ "message" ] }
  if "_jsonparsefailure" in [tags] { drop { } }
  if "" in [component] {
    if [numReboots] {
      mutate {
        add_tag => [ "Reboot" ]
        replace => { "description" => "Reboot event." }
      }
    }
    if "authentication failed" in [description] {
      mutate {
        add_tag => [ "set-red" ]
      }
    }
  }
}

output {
  elasticsearch {
    hosts => "localhost"
  }
  websocket { }
}

The above configuration accepts input over tcp and udp on port 2500, parses only the JSON strings out of the input, and sends the result to elasticsearch and the websocket.

Here is how the filter works: as soon as data arrives, it runs a series of mutate filter plugins with the gsub setting to remove formatting errors and shape the message into valid JSON. Next, the split filter plugin separates multiple JSON messages onto new lines if they arrive on a single line.
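The same cleanup chain can be sketched in Python; the sample input line below is hypothetical, but the substitutions mirror the gsub settings above:

```python
import json

# Hypothetical OSSEC-style input: escaped JSON objects after an "ossec: "
# prefix, roughly the shape the gsub chain above has to repair.
raw = 'ossec: {\\"eventId\\":1},{\\"eventId\\":2}'

msg = raw.replace('\\', '')                  # gsub: strip escaping backslashes
msg = msg.replace('ossec: {', '\n{')         # gsub: detach the syslog prefix
msg = msg.replace('[', '').replace(']', '')  # gsub: drop surrounding brackets
msg = msg.replace('},{', '}\n{')             # gsub: one JSON object per line
lines = [l for l in msg.split('\n') if l]    # split { terminator => "\n" }
events = [json.loads(l) for l in lines]      # json { source => "message" }
print(events)  # [{'eventId': 1}, {'eventId': 2}]
```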

Next, the json filter plugin parses the JSON messages so that the string is broken down into fields and their corresponding values before being sent to the output. Since the JSON data has already been parsed, we then remove the original message by deleting the message field with the remove_field setting of the mutate filter. Finally, we check whether the JSON parsed successfully; if it failed we simply drop the message, otherwise we process it further to check for matching tags.
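In plain Python, the parse-or-drop step amounts to something like this (the sample lines are made up):

```python
import json

# One well-formed event and one garbled line, as they might come off the wire.
lines = ['{"eventId": 1}', 'not json at all']

events = []
for line in lines:
    try:
        events.append(json.loads(line))  # json { source => "message" }
    except ValueError:
        pass  # would be tagged _jsonparsefailure and hit drop { }
print(events)  # [{'eventId': 1}]
```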

Finally, using the if conditions, we check whether the JSON message contains events that require an alert. In the case above I am looking for a particular event from a particular source; when that event occurs, the filter adds extra tags that act as identifiers for the clients connected to Logstash over the websocket.
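As a rough Python sketch of that tagging logic (field names are taken from the filter above; everything else is illustrative):

```python
def tag_event(event):
    """Mimic the conditional mutate blocks in the filter."""
    tags = event.setdefault('tags', [])
    if 'numReboots' in event:                # if [numReboots]
        tags.append('Reboot')                # add_tag => [ "Reboot" ]
        event['description'] = 'Reboot event.'
    if 'authentication failed' in event.get('description', ''):
        tags.append('set-red')               # add_tag => [ "set-red" ]
    return event

e = tag_event({'numReboots': 17})
print(e['tags'], e['description'])  # ['Reboot'] Reboot event.
```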

Once this is done we can test our setup using the following command:

sudo curl -XPOST 'http://localhost:2500' -d '{"numReboots":17,"originator":3,"eventType":2,"cookie":67109120,"importance":0,"eventId":1,"srs_query":1535,"srs_response":3,"srs_sid":4}'

After executing this command, press Ctrl-C to break, then open Kibana in a web browser; the dashboard should load with the message we sent using curl in the previous step.
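The same test event can also be pushed straight into the TCP input with a short Python script, which avoids the HTTP headers curl adds (the host and port here are assumptions matching the input configuration above):

```python
import json
import socket

# The same test event as the curl command above.
event = {"numReboots": 17, "originator": 3, "eventType": 2,
         "cookie": 67109120, "importance": 0, "eventId": 1,
         "srs_query": 1535, "srs_response": 3, "srs_sid": 4}
payload = json.dumps(event)

def send_event(host='localhost', port=2500):
    """Send one newline-terminated event to the logstash tcp input."""
    with socket.create_connection((host, port)) as s:
        s.sendall(payload.encode() + b'\n')

# send_event()  # uncomment with logstash listening on port 2500
```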

Alternatively, a websocket client can be written in a simple HTML document to check the Logstash filtering.
