ELK Visualisation Guide

Dependencies

  1. Elasticsearch 8.0.1
  2. Kibana 8.0.1
  3. Filebeat 8.0.1
  4. Logstash 8.0.1

Install Dependencies

Go to Download Elastic Products and download the packages.
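
As an example, on a Linux machine the tarball packages can be downloaded and unpacked from the command line. The URLs below follow the standard Elastic artifacts layout for version 8.0.1; confirm the exact filenames on the download page.

# Example for Elasticsearch; repeat for Kibana, Filebeat, and Logstash
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-8.0.1-linux-x86_64.tar.gz
tar -xzf elasticsearch-8.0.1-linux-x86_64.tar.gz
cd elasticsearch-8.0.1/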

Architecture

All packages can run on the same machine or be split across machines. A demo architecture is shown below:

On one machine:

[Figure: elk_architecture — ELK architecture diagram]

Configurations

Configure Elasticsearch

File: ./config/elasticsearch.yml
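
This guide assumes an out-of-the-box single-node setup, so the defaults may already be sufficient. A minimal sketch of the settings the pipeline relies on is shown below; the cluster and node names are assumed, illustrative values.

# ./config/elasticsearch.yml (illustrative values)
cluster.name: elk-demo        # assumed cluster name
node.name: node-1             # assumed node name
network.host: localhost       # bind to localhost for the single-machine demo
http.port: 9200               # default HTTP port used by Logstash and Kibana below
# For a quick local demo, security can be disabled (not recommended in production):
# xpack.security.enabled: false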

Configure Filebeat

File: ./filebeat.yml

  1. Setup Inputs
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - <path to log file>
  fields:
    # A type/tag to be referenced in Logstash and Kibana
    type: <a string>  
  fields_under_root: true
  2. Setup Kibana Host
# Specify Kibana IP and port

setup.kibana:
  host: "localhost:5601"
  3. Comment Out the Elasticsearch Output
# output.elasticsearch:
  # Array of hosts to connect to.
  # hosts: ["localhost:9200"]

  # Protocol - either `http` (default) or `https`.
  # protocol: "https"

  # Authentication credentials - either API key or username/password.
  # api_key: "id:api_key"
  # username: "elastic"
  # password: "changeme" 
  4. Setup Logstash Host(s)
# Specify Logstash IP and port

output.logstash:
  hosts: ["localhost:5044"]

Configure Logstash

File: ./config/<filename>.conf

  1. Setup Inputs
input {
  beats {
    # If Filebeat runs on another machine, comment out host so the beats input
    # listens on all interfaces instead of only localhost.
    host => "localhost"
    port => 5044
  }
}
  2. Setup Filters
filter {
  # Drop header/comment lines (lines beginning with '#') in the log file
  if [message] =~ /^#/ {
    drop { }
  }  
  # Match the type field set in the Filebeat configuration
  if [type] in ["type_a","type_b"] {
    csv {
      columns => ["ts","counter","speed","temperature"]
      # Delimiter: the value must contain a literal tab character
      separator => "	"
    }
    mutate {
      convert => {
        # data types for each column
        "speed" => "float"
        "temperature" => "float"
      }
    }
    date {
      # Parse the ts column as a UNIX epoch timestamp
      match => ["ts", "UNIX"]
    }
  }
}
  3. Setup Output
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Index name to reference in Kibana
    index => "<a string>"
  }
}
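
With the three parts above in place, the pipeline definition can be validated before starting Logstash, using the standard --config.test_and_exit flag from the Logstash directory:

# Check the pipeline configuration for syntax errors, then exit
./bin/logstash -f ./config/<filename>.conf --config.test_and_exit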

Configure Kibana

File: ./config/kibana.yml

  1. Set server host
server.host: "localhost"

# Or, to allow the Kibana server to be reached via its IP:
server.host: "0.0.0.0"
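
If Elasticsearch does not run on the same machine as Kibana, kibana.yml also needs to point at it. A minimal sketch, assuming the Elasticsearch node configured earlier:

# ./config/kibana.yml (illustrative)
server.port: 5601                               # default Kibana port
elasticsearch.hosts: ["http://localhost:9200"]  # the Elasticsearch node configured above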

Start Kibana Visualisation

Start each component of the ELK stack to bring up the pipeline, then open the visualisation in a browser.

  1. Start Elasticsearch
# Go to the elasticsearch directory
# in a separate terminal
./bin/elasticsearch
  2. Start Kibana
# Go to the kibana directory
# in a separate terminal
./bin/kibana
  3. Start Filebeat
# Go to the filebeat directory
# in a separate terminal
./filebeat -c filebeat.yml --path.data data/registry/filebeat/
  4. Start Logstash
# Go to the logstash directory
# in a separate terminal
sudo ./bin/logstash -f <the config file, e.g. logstash-sample.conf> --config.reload.automatic
  5. Open Kibana in a browser
Go to a browser
Navigate to localhost:5601 or <kibana server IP>:5601
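
Once all components are running, you can confirm that documents are arriving before building any visualisation. The _cat/indices API lists indices and their document counts; the index name set in the Logstash output should appear here (add credentials and https if security is enabled):

# List indices and their document counts
curl "localhost:9200/_cat/indices?v"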

Customisations

Customise Kibana Canvas

Import the <saved-canvas>.ndjson file into Kibana Canvas.

Or create a new canvas or dashboard.

Index Management

Go to Stack Management > Index Management.

In this section you can manage indices, for example by attaching a lifecycle policy to control how long data is retained.

Index Lifecycle Policy

Go to Stack Management > Index Lifecycle Policies.

In this section, you can add, edit, or delete index lifecycle policies.
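
As an alternative to the UI, a lifecycle policy can also be created through the Elasticsearch ILM API. The sketch below uses a hypothetical policy name and retention values; adjust them to your data volume (add credentials and https if security is enabled):

# Hypothetical policy: roll over after 7 days, delete data after 30 days
curl -X PUT "localhost:9200/_ilm/policy/demo-logs-policy" -H 'Content-Type: application/json' -d'
{
  "policy": {
    "phases": {
      "hot":    { "actions": { "rollover": { "max_age": "7d" } } },
      "delete": { "min_age": "30d", "actions": { "delete": {} } }
    }
  }
}'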