File Beat + ELK(Elastic, Logstash and Kibana) Stack to index logs to Elasticsearch - Hello World Example

In a previous tutorial we saw how to use the ELK stack to index Spring Boot logs. There Logstash read the log files directly using its file input plugin.
You can make use of the Online Grok Pattern Generator Tool for creating, testing and debugging the grok patterns required for Logstash. Now suppose we have to read data from log files on multiple servers and index it to Elasticsearch. One option is to install Logstash on every server and have each instance index the logs to the Elasticsearch server.
multiple-logstash example
Logstash consumes a lot of resources, so installing it on every file server is not an optimal solution. Instead we can use Beats in such scenarios.
beats example
Beats are lightweight data shippers that we install as agents on servers to send specific types of operational data to Logstash. Here we will install Filebeat on the servers; it will read the log files and send them to Logstash, which will in turn index them to Elasticsearch.
beats-logstash example

Video

This tutorial is explained in the below YouTube video.

  • Elasticsearch -
    • Download the latest version of elasticsearch from Elasticsearch downloads
      elasticsearch example
    • Run the elasticsearch.bat using the command prompt. Elasticsearch can then be accessed at localhost:9200
  • Kibana -
    • Download the latest version of kibana from Kibana downloads
      kibana example
    • Modify kibana.yml to point Kibana to the Elasticsearch instance, which in our case runs at http://localhost:9200. So uncomment the following line in kibana.yml-
      elasticsearch.url: "http://localhost:9200"
      
    • Run the kibana.bat using the command prompt. kibana UI can then be accessed at localhost:5601
  • Logstash -
    • Download the latest version of logstash from Logstash downloads
      logstash example
    • When using the ELK stack to ingest data into Elasticsearch, the data is initially unstructured. We first need to break it into a structured format and only then index it to Elasticsearch, so that it can later be used for analysis. This transformation of unstructured data into structured data is done by Logstash, which makes use of the grok filter to achieve it.
      Similar to how we did in the Spring Boot + ELK tutorial, create a configuration file named logstash.conf. Here Logstash is configured to listen for incoming Beats connections on port 5044. Also on getting some input, Logstash will filter the input and index it to elasticsearch.
      # Read input from filebeat by listening to port 5044 on which filebeat will send the data
      input {
          beats {
              type => "test"
              port => "5044"
          }
      }

      filter {
          # If a log line contains a tab character followed by 'at' then tag that entry as a stacktrace
          if [message] =~ "\tat" {
              grok {
                  match => ["message", "^(\tat)"]
                  add_tag => ["stacktrace"]
              }
          }
      }

      output {
          # Print each event to stdout for debugging
          stdout {
              codec => rubydebug
          }

          # Send properly parsed log events to elasticsearch
          elasticsearch {
              hosts => ["localhost:9200"]
          }
      }

      logstash listening to filebeat
    • Start logstash as follows-
      logstash.bat -f logstash.conf

      start logstash
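The `\tat` condition in the filter section above can be sketched in plain Python (an illustrative sketch only, not Logstash code — the sample log lines are made up for the demo):

```python
import re

# Same condition as the Logstash filter: a tab followed by "at"
# marks a Java stack-trace continuation line.
STACKTRACE_RE = re.compile(r"\tat")

def tag_line(message):
    """Return the tags the filter would add for this log line."""
    tags = []
    if STACKTRACE_RE.search(message):
        tags.append("stacktrace")
    return tags

log_lines = [
    "2017-08-16 21:15:01 ERROR HelloController - request failed",
    "java.lang.NullPointerException: null",
    "\tat com.javainuse.HelloController.hello(HelloController.java:21)",
]
for line in log_lines:
    print(line.strip(), "->", tag_line(line))
```

Only the last line picks up the `stacktrace` tag; the timestamped line and the exception message pass through untagged.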
  • FileBeat-
    • Download filebeat from FileBeat Download
      filebeat download example
    • Unzip the contents. Open filebeat.yml and add the following content. We specify the location of the log files for Filebeat to read from. The hosts setting specifies the Logstash server and the port on which Logstash is configured to listen for incoming Beats connections.
      filebeat:
        prospectors:
          -
            paths:
              - C:/elk/*.log
            input_type: log
            multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
            multiline.negate: true
            multiline.match: after
            
      output:
        logstash:
          hosts: ["localhost:5044"]	
      
    • Start filebeat as follows-
      filebeat.exe -c filebeat.yml

      filebeat example
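The multiline settings in filebeat.yml decide where one log event ends and the next begins: a line starting with a `YYYY-MM-DD` timestamp begins a new event, and every non-matching line (negate: true) is appended to the event before it (match: after), so a stack trace stays with the log line that produced it. A small Python sketch of that grouping logic (illustrative only, not Filebeat code — the sample lines are made up):

```python
import re

# Filebeat's multiline.pattern from filebeat.yml: a line starting
# with a YYYY-MM-DD date begins a new log event.
EVENT_START = re.compile(r"^[0-9]{4}-[0-9]{2}-[0-9]{2}")

def group_events(lines):
    """Group raw lines into events the way multiline.negate: true /
    multiline.match: after would: non-matching lines are appended
    to the event started by the last matching line."""
    events = []
    for line in lines:
        if EVENT_START.match(line) or not events:
            events.append(line)
        else:
            events[-1] += "\n" + line
    return events

raw = [
    "2017-08-16 21:15:01 ERROR request failed",
    "java.lang.NullPointerException: null",
    "\tat com.javainuse.HelloController.hello(HelloController.java:21)",
    "2017-08-16 21:15:02 INFO recovered",
]
print(len(group_events(raw)))  # 2 events: the error with its stack trace, and the info line
```

Without these settings each stack-trace line would be shipped as a separate event, which is why the pattern matters for Java logs.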
Next copy the log file to the C:/elk folder.
filebeat folder example
Filebeat will start harvesting it and send it to Logstash, which will in turn index it to Elasticsearch.

filebeat logs indexed to elasticsearch

See Also

  • Spring Boot Hello World Application- Create simple controller and jsp view using Maven
  • Spring Boot Tutorial-Spring Data JPA
  • Spring Boot + Simple Security Configuration
  • Pagination using Spring Boot Simple Example
  • Spring Boot + ActiveMQ Hello world Example
  • Spring Boot + Swagger Example Hello World Example
  • Spring Boot + Swagger- Understanding the various Swagger Annotations
  • Spring Boot Main Menu
  • Spring Boot Interview Questions