
Log analyzing using ELK and Kafka


The ELK Stack is a collection of three open-source products — Elasticsearch, Logstash, and Kibana — from Elastic. Elasticsearch is a NoSQL database that is based on the Lucene search engine. Logstash is a log pipeline tool that accepts inputs from various sources, executes different transformations, and exports the data to various targets. Kibana is a visualization layer that works on top of Elasticsearch.

Together, these three open-source products are most commonly used for log analysis in IT environments (though there are many more use cases for the ELK Stack, including business intelligence, security and compliance, and web analytics). Logstash collects and parses logs, Elasticsearch indexes and stores the information, and Kibana presents the data in visualizations that provide actionable insights into one’s environment.

ELK Stack Architecture

Logstash processes the application log files based on the filter criteria we set and sends those logs to Elasticsearch. Through Kibana, we view and analyze those logs when required.

ELK stack configuration

All three tools run on the JVM, so before installing them, please verify that the JDK is properly configured. A standard JDK 1.8 installation with JAVA_HOME and PATH set up is sufficient.

Elasticsearch

  • Download the latest version of Elasticsearch from the download page and unzip it into any folder. Run bin\elasticsearch.bat from the command prompt.
  • By default, it will start at http://localhost:9200
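
To quickly verify that the node is up, query the root endpoint (assuming the default port); it returns a small JSON document with the node name, cluster name, and version:

curl http://localhost:9200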

Kibana

  • Download the latest distribution from the download page and unzip it into any folder.
  • Open config/kibana.yml in an editor and set elasticsearch.url to point at your Elasticsearch instance. In our case, since we use the local instance, just uncomment elasticsearch.url: "http://localhost:9200"
  • Run bin\kibana.bat from the command prompt.
  • Once started successfully, Kibana runs on the default port 5601 and the Kibana UI is available at http://localhost:5601
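
For reference, the relevant line in config/kibana.yml (for Kibana 6.x, where the setting is named elasticsearch.url) looks like this after uncommenting:

elasticsearch.url: "http://localhost:9200"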

Logstash

  • Download the latest distribution from the download page and unzip it into any folder.
  • Create a file named logstash.conf as per the configuration instructions. We will come back to this point during the actual demo for the exact configuration.
  • Now run bin/logstash -f logstash.conf to start Logstash.

ELK stack example

Create Microservice

Let’s create an application using Spring Boot for faster development. Follow the standard Spring Boot steps to start this service. Add one RestController class that exposes a few endpoints such as /elk, /elkdemo, and /exception. We are only going to test a few log statements, so feel free to add or modify logs as per your choice.
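
A minimal sketch of such a controller, assuming a standard Spring Boot web setup (the class name and log messages here are illustrative, not from the original source):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ELKController {

    private static final Logger LOG = LoggerFactory.getLogger(ELKController.class);

    // Writes a simple INFO-level entry to the application log
    @GetMapping("/elk")
    public String elk() {
        LOG.info("Hello from /elk endpoint");
        return "Hello from the ELK demo";
    }

    // Logs and rethrows an exception so we can see how the
    // stack trace shows up later in Kibana
    @GetMapping("/exception")
    public String exception() {
        try {
            throw new RuntimeException("Deliberate exception for the ELK demo");
        } catch (RuntimeException ex) {
            LOG.error("Error occurred in /exception endpoint", ex);
            throw ex;
        }
    }
}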

Open application.properties under the resources folder and add the below configuration entries.

logging.file=elk-example.log
spring.application.name = elk-example

Do a final Maven build using mvn clean install, start the application with the command java -jar target\elk-example-spring-boot-0.0.1-SNAPSHOT.jar, and test it by browsing to http://localhost:8080/elk.

Don’t be alarmed by the big stack trace on the screen; it is produced intentionally so we can see how ELK handles an exception message.

Go to the application root directory and verify that the log file, elk-example.log, has been created. Then visit the endpoints a couple of times and verify that logs are being added to the file.

Logstash Configuration

We need to create a Logstash configuration file so that it listens to the log file and pushes log messages to Elasticsearch. A sketch of such a configuration is shown below; change the log path as per your setup.
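
A minimal sketch of such a logstash.conf, assuming the log file name from the previous section and the default logstash-* index pattern (the grok pattern is a generic one for Spring Boot’s default log format, not necessarily the original author’s):

input {
  file {
    # Path to the application log file; change as per your setup
    path => "/path/to/elk-example-spring-boot/elk-example.log"
    start_position => "beginning"
  }
}

filter {
  # Split the standard Spring Boot log line into timestamp, level, and message
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:level}\s+%{GREEDYDATA:logmessage}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Produces indices named logstash-YYYY.MM.DD, matching the Kibana index pattern below
    index => "logstash-%{+YYYY.MM.dd}"
  }
  # Also print events to the console for debugging
  stdout { codec => rubydebug }
}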

Kibana Configuration

Before viewing the logs in Kibana, we need to configure the index patterns. We can configure logstash-* as the default pattern. We can always change this index pattern on the Logstash side and configure it in Kibana, but for simplicity we will work with the default configuration.

This is done on the index pattern management page. With this configuration, we point Kibana at the Elasticsearch index (or indices) of our choice. Logstash creates indices with the name pattern logstash-YYYY.MM.DD. All of this can be configured from the Kibana console at http://localhost:5601/app/kibana by going to the Management link in the left panel.

Verify ELK Stack

Now that all components are up and running, let’s verify the whole ecosystem.

Go to the application and hit the endpoints a couple of times so that logs are generated, then go to the Kibana console and see that the logs are properly stacked in Kibana, along with lots of extra built-in features such as filtering and different graphs.

Here is the view of generated logs in Kibana.

Using Kafka with ELK

First of all, we need to stream the Log4j logs to Kafka. Following is the shape of a Log4j configuration that pushes logs to Kafka.
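
A minimal sketch of a log4j2.xml using the Log4j 2 Kafka appender (the topic name app-logs and the broker address are assumptions, not taken from the original post):

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- Kafka appender: sends each log event to the given topic -->
    <Kafka name="kafkaAppender" topic="app-logs">
      <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{36} - %msg%n"/>
      <Property name="bootstrap.servers">localhost:9092</Property>
    </Kafka>
    <Console name="console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <!-- Keep the Kafka client quiet to avoid recursive logging -->
    <Logger name="org.apache.kafka" level="warn"/>
    <Root level="info">
      <AppenderRef ref="kafkaAppender"/>
      <AppenderRef ref="console"/>
    </Root>
  </Loggers>
</Configuration>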

This configuration is used with the Spring Boot application; once the application is started, it will begin pushing logs to Kafka.

Consuming Kafka Message in Logstash

Before starting Logstash, we need to provide it with a configuration file. This file holds the Kafka-related configuration. Following is the shape of a configuration that tells Logstash the Kafka server address and the topic from which it can consume messages.
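
A minimal sketch of such a logstash-kafka.conf, assuming the same topic name (app-logs) used in the Log4j configuration above:

input {
  kafka {
    # Address of the Kafka broker
    bootstrap_servers => "localhost:9092"
    # Topic that the Log4j Kafka appender writes to
    topics => ["app-logs"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}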

cd /Users/macuser/Documents/work/soft/analytics/logstash-6.2.2
./bin/logstash -f logstash-kafka.conf

Now Kafka, Elasticsearch, and Logstash are all up and running: our application logs are pushed directly to Kafka, and Logstash reads them from there and pushes them to Elasticsearch. The remaining flow is the same as discussed above.
