How to configure Logstash to receive HTTP events and visualize them in Kibana

Ujitha Iroshan
4 min read · Jun 6, 2021

The most common use case for the ELK stack is collecting and observing logs, but there are cases where we need to monitor HTTP events instead: rather than shipping raw logs, we send HTTP events directly and observe them in Kibana. This article shows how to configure ELK to receive and display such events.

First we need to download and set up each component of ELK: Elasticsearch, Logstash and Kibana.

Elasticsearch

The Elasticsearch download page provides all the details on how to download and install Elasticsearch depending on your environment.

Logstash

The Logstash download page provides all the details on how to download and install Logstash depending on your environment.

Once Logstash is downloaded, extract the archive; then we need to configure it.

To do that, create a logstash-config.conf file at

<logstash-home>/config/logstash-config.conf

Now we configure the logstash-config.conf file to receive HTTP events as below.

# Sample Logstash configuration for creating a simple
# http -> Logstash -> Elasticsearch pipeline.
input {
  http {
    port => 8084 # default: 8080
  }
}

filter {
  if [headers][environment] {
    mutate {
      add_field => { "environment" => "%{[headers][environment]}" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    user => "${LocalElastic}"
    password => "${LocalElasticPass}"
    index => "services-%{+YYYY.MM}"
    ssl => false
    ssl_certificate_verification => false
    cacert => "CA-certs\elastic-certificates-ca.pem"
  }
  stdout {}
}

With the http input plugin, Logstash starts an HTTP server on port 8084 that listens for HTTP events.
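Once the pipeline is running, we can post a sample event to that listener. A minimal Python sketch (the event fields and the "dev" environment header value here are made up for illustration):

```python
import json
import urllib.request

# A hypothetical sample event; any JSON body will do.
event = {"service": "order-service", "status": 200, "message": "order created"}

# Build a POST request against the http input started above,
# including the custom "environment" header used by the filter.
req = urllib.request.Request(
    "http://localhost:8084",
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json", "environment": "dev"},
    method="POST",
)

# Send it once Logstash is actually running:
# urllib.request.urlopen(req)
```

The same thing can of course be done with curl or any HTTP client; the only requirement is a JSON body posted to the configured port.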

Then we have the filter section, which manipulates the event: here a mutate filter adds a new field called “environment” to the event, and the value of that field is taken from the request header (environment).
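Conceptually, the filter does something like the following (a Python sketch of the logic only, not how Logstash runs internally):

```python
def apply_filter(event):
    """Mimic the mutate filter: if the request carried an
    'environment' header, copy it to a top-level field."""
    headers = event.get("headers", {})
    if "environment" in headers:
        event["environment"] = headers["environment"]
    return event

# An event whose request included the header gains the new field;
# one without the header passes through unchanged.
filtered = apply_filter({"message": "order created", "headers": {"environment": "dev"}})
```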

Finally, the output section configures how the HTTP event is sent to Elasticsearch.

Here two outputs are configured: one for Elasticsearch and one for stdout. The elasticsearch output section is configured to connect to the Elasticsearch server we started earlier; hosts, user and password should match the Elasticsearch server we are sending the events to.

index is the name of the Elasticsearch index under which the HTTP event JSON is saved.
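The %{+YYYY.MM} part expands to the event's year and month, so events land in monthly indices. A sketch of the resulting index name for the current UTC month:

```python
from datetime import datetime, timezone

# Mimic Logstash's services-%{+YYYY.MM} index pattern.
now = datetime.now(timezone.utc)
index_name = f"services-{now:%Y.%m}"
# e.g. an event received in June 2021 goes to "services-2021.06"
```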

In order to start Logstash with logstash-config.conf, run the command below.

sh bin/logstash -f config/logstash-config.conf

We could also reference the config file from pipelines.yml, but here I'm just pointing to the conf file directly.
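If we did go the pipelines.yml route, the entry would look something like this (pipeline.id is an arbitrary name of our choosing):

```yaml
- pipeline.id: http-events
  path.config: "config/logstash-config.conf"
```

With pipelines.yml in place, Logstash can simply be started with sh bin/logstash and it will pick up the pipeline definition.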

Kibana

The Kibana download page provides all the details on how to download and install Kibana depending on your environment.

Once Kibana is running, we can access it in the browser at http://localhost:5601.

To observe the events in Kibana, we first need to create an index pattern for the index we configured in the Logstash config.

Go to the left menu -> Stack Management -> Create Index Pattern, and create the index pattern services-*, since we configured the index in the Logstash elasticsearch output as services-%{+YYYY.MM}.

Now go to the left menu -> Discover, and select the created index pattern from the drop-down.

Then, once events are published, we can see them in Discover.

To summarize, in this article I have shown how to set up and configure ELK to accept HTTP events and observe them in Kibana. We can go further and build Visualizations and Dashboards from these events and fields. Thank you very much.


Ujitha Iroshan

Developer, Integration Consultant, Microservice enthusiast. Ex WSO2