

Spring Boot Microservices + ELK(Elasticsearch, Logstash, and Kibana) Stack Hello World Example

In this tutorial we will be using the ELK stack along with a Spring Boot microservice to analyze the generated logs. In the next tutorial we will see how to use FileBeat along with the ELK stack.
You can make use of the Online Grok Pattern Generator Tool for creating, testing and debugging the grok patterns required for Logstash.
The implementation architecture will be as follows-
(Diagram: Spring Boot microservice writing logs that flow through Logstash into Elasticsearch and Kibana)

What is ELK? Why do we need it?

The ELK Stack consists of three open-source products - Elasticsearch, Logstash, and Kibana from Elastic.
  • Elasticsearch is a NoSQL database that is based on the Lucene search engine.
  • Logstash is a log pipeline tool that accepts inputs from various sources, executes different transformations, and exports the data to various targets. It is a dynamic data collection pipeline with an extensible plugin ecosystem and strong Elasticsearch synergy.
  • Kibana is a visualization UI layer that works on top of Elasticsearch.
These three projects are used together for log analysis in various environments: Logstash collects and parses the logs, Elasticsearch indexes and stores this information, while Kibana provides a UI layer that delivers actionable insights.
Use Cases-
  • Consider you have a single application running and producing logs. Now suppose you want to analyze these logs. One option is to analyze them manually, but if the logs are large, manual analysis is not feasible.
  • Suppose we have multiple applications running and all of them produce logs. If we have to analyze the logs manually we will need to go through all the log files, which may run into hundreds.
We can use ELK here to analyze the logs more efficiently, using more complex search criteria. It provides log aggregation and efficient searching.
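To make the search use case concrete, here is a minimal sketch (the log lines and class name are hypothetical, purely for illustration) of the kind of filtering across aggregated logs that ELK automates for us:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class NaiveLogSearch {

    // Filter aggregated log lines down to those matching a search term,
    // e.g. all ERROR entries across several applications' logs.
    public static List<String> search(List<String> logLines, String term) {
        return logLines.stream()
                .filter(line -> line.contains(term))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Hypothetical lines collected from two different applications
        List<String> logs = Arrays.asList(
                "2017-03-01 10:00:01 INFO  [app-1] Started successfully",
                "2017-03-01 10:00:05 ERROR [app-2] Connection refused",
                "2017-03-01 10:00:09 ERROR [app-1] Timeout while calling service");

        System.out.println(search(logs, "ERROR").size()); // prints 2
    }
}
```

This works for a toy list, but once the lines number in the millions across hundreds of files, an indexed store like Elasticsearch is what makes such queries fast.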



Let's Begin

We will first download the required stack.
  • Elasticsearch -
    • Download the latest version of elasticsearch from Elasticsearch downloads
    • Run elasticsearch.bat using the command prompt. Elasticsearch can then be accessed at localhost:9200
  • Kibana -
    • Download the latest version of kibana from Kibana downloads
    • Modify kibana.yml to point to the Elasticsearch instance. In our case Elasticsearch runs on port 9200, so uncomment the following line in kibana.yml-
      elasticsearch.url: "http://localhost:9200"
    • Run kibana.bat using the command prompt. The Kibana UI can then be accessed at localhost:5601
  • Logstash -
    • Download the latest version of logstash from Logstash downloads
    • Create a configuration file named logstash.conf. In a later section we will make changes to this file and then start Logstash.

Let's now come to the Spring Boot part. We will create a simple Spring Boot application to generate the logs.
Define the pom.xml as follows-
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>com.javainuse</groupId>
	<artifactId>springboot-elk</artifactId>
	<version>0.0.1-SNAPSHOT</version>
	<packaging>jar</packaging>
	<!-- Coordinates and Boot version are indicative; the tutorial targets Spring Boot 1.x -->
	<parent>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-parent</artifactId>
		<version>1.5.2.RELEASE</version>
		<relativePath /> <!-- lookup parent from repository -->
	</parent>
	<dependencies>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-web</artifactId>
		</dependency>
		<!-- log4j 1.x, used by the controller's Logger -->
		<dependency>
			<groupId>log4j</groupId>
			<artifactId>log4j</artifactId>
			<version>1.2.17</version>
		</dependency>
	</dependencies>
</project>
Create the Spring Boot Bootstrap class as follows-
package com.javainuse;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class HelloWorldSpringBootApplication {

	public static void main(String[] args) {
		SpringApplication.run(HelloWorldSpringBootApplication.class, args);
	}
}
Next define the controller to expose the REST API. We will make use of these calls to write content to the log file.
package com.javainuse;

import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.Date;

import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

@RestController
class ELKController {
	private static final Logger LOG = Logger.getLogger(ELKController.class.getName());

	@Autowired
	RestTemplate restTemplate;

	@Bean
	RestTemplate restTemplate() {
		return new RestTemplate();
	}

	@RequestMapping(value = "/elk")
	public String helloWorld() {
		String response = "Welcome to JavaInUse " + new Date();
		LOG.log(Level.INFO, response);
		return response;
	}

	@RequestMapping(value = "/exception")
	public String exception() {
		String response = "";
		try {
			throw new Exception("Exception has occurred....");
		} catch (Exception e) {
			// Capture the stack trace as a string so it is written to the log file
			StringWriter sw = new StringWriter();
			PrintWriter pw = new PrintWriter(sw);
			e.printStackTrace(pw);
			String stackTrace = sw.toString();
			LOG.error("Exception - " + stackTrace);
			response = stackTrace;
		}
		return response;
	}
}
Finally, specify the name and location of the log file to be created in the application.properties file.
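A minimal application.properties, assuming the Spring Boot 1.x logging key (in Boot 2.2+ this key became logging.file.name) and the same file path that logstash.conf tails later:

```properties
# Write the application log to the file that Logstash will read
logging.file=C:/elk/spring-boot-elk.log
```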
Next we will configure the Logstash pipeline. The data we ingest into Elasticsearch with the ELK stack is initially unstructured log text. We first need to break it into a structured format and then ingest it into Elasticsearch, so that it can later be used for analysis. This transformation of unstructured data into structured data is done by Logstash, which uses the grok filter to achieve it.
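As a rough sketch of what grok does, the snippet below (plain Java regex, purely illustrative; grok patterns such as %{TIMESTAMP_ISO8601} compile down to regexes like this one) breaks a single unstructured log line into named fields:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GrokStyleParser {

    // Roughly equivalent to the grok expression
    // %{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:message}
    private static final Pattern LOG_LINE = Pattern.compile(
            "^(?<timestamp>\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2})\\s+(?<level>[A-Z]+)\\s+(?<message>.*)$");

    // Returns the structured fields, or an empty map if the line does not match
    public static Map<String, String> parse(String line) {
        Map<String, String> fields = new LinkedHashMap<>();
        Matcher m = LOG_LINE.matcher(line);
        if (m.matches()) {
            fields.put("timestamp", m.group("timestamp"));
            fields.put("level", m.group("level"));
            fields.put("message", m.group("message"));
        }
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(parse("2017-03-01 10:00:05 INFO Welcome to JavaInUse"));
        // prints {timestamp=2017-03-01 10:00:05, level=INFO, message=Welcome to JavaInUse}
    }
}
```

Once the fields are named like this, Elasticsearch can index each one separately, which is what makes searches such as "all ERROR events in the last hour" cheap.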
Logstash Pipeline
This is done using the logstash.conf-
input {
  file {
    type => "java"
    path => "C:/elk/spring-boot-elk.log"
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
    }
  }
}

filter {
  # If a log line contains a tab character followed by 'at' then tag that entry as a stacktrace
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }

  # Send properly parsed log events to elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
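To see what the multiline codec is doing, here is a small sketch in plain Java (illustrative only; class and sample lines are hypothetical). A line that does not start with a timestamp is folded into the previous event, exactly as negate => "true" with what => "previous" specifies, so a stack trace stays attached to the log statement that produced it:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;

public class MultilineGrouper {

    // Mirrors the codec's pattern: a new event starts with "YYYY-MM-DD HH:mm:ss"
    private static final Pattern EVENT_START = Pattern.compile(
            "^\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}.*");

    // Group raw file lines into log events, folding continuation lines
    // (such as "\tat ..." stack frames) into the previous event.
    public static List<String> group(List<String> lines) {
        List<String> events = new ArrayList<>();
        for (String line : lines) {
            if (EVENT_START.matcher(line).matches() || events.isEmpty()) {
                events.add(line);
            } else {
                int last = events.size() - 1;
                events.set(last, events.get(last) + "\n" + line);
            }
        }
        return events;
    }

    public static void main(String[] args) {
        List<String> raw = Arrays.asList(
                "2017-03-01 10:00:05 ERROR Exception has occurred....",
                "\tat com.javainuse.ELKController.exception(ELKController.java:40)",
                "2017-03-01 10:00:09 INFO Welcome to JavaInUse");
        System.out.println(group(raw).size()); // prints 2 - the stack frame joins the first event
    }
}
```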
Start logstash using the command prompt as follows-
logstash -f logstash.conf

Start the Spring Boot application by running HelloWorldSpringBootApplication as a Java application.
Logs will be generated in the C:/elk folder.
  • Go to localhost:8080/elk
  • Go to localhost:8080/exception
  • Go to the Kibana UI console at localhost:5601 and create an index pattern logstash-* to see the indexed data-

Download Source Code

Download it -
Spring Boot Microservice + ELK stack

See Also

  • Spring Boot Hello World Application - Create simple controller and jsp view using Maven
  • Spring Boot Tutorial - Spring Data JPA
  • Spring Boot + Simple Security Configuration
  • Pagination using Spring Boot Simple Example
  • Spring Boot + ActiveMQ Hello World Example
  • Spring Boot + Swagger Example Hello World Example
  • Spring Boot + Swagger - Understanding the various Swagger Annotations
  • Spring Boot Main Menu
  • Spring Boot Interview Questions