ELK Visualisation Guide¶
Dependencies¶
- Elasticsearch 8.0.1
- Kibana 8.0.1
- Filebeat 8.0.1
- Logstash 8.0.1
Install Dependencies¶
Go to Download Elastic Products and download the packages.
Architecture¶
All packages can run on the same machine or be split across machines. A demo architecture is shown below:
On one machine:
Configurations¶
Configure elasticsearch¶
File: ./config/elasticsearch.yml
Configure Filebeat¶
File: ./filebeat.yml
- Setup Inputs
  filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - <path to log file>
    fields:
      # A type/tag to be referenced in Logstash and Kibana
      type: <a string>
    fields_under_root: true
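As a filled-in sketch (the log path and tag below are hypothetical placeholders, not values the guide prescribes), an input section might look like:

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/sensors/telemetry.log   # hypothetical log path
  fields:
    # This tag is matched later in the Logstash filter
    type: type_a
  fields_under_root: true
```

With `fields_under_root: true`, the `type` field is placed at the top level of each event, so the Logstash filter can test `[type]` directly.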
- Setup Kibana Host
  # Specify Kibana IP and port
  setup.kibana:
    host: "localhost:5601"
- Comment Elasticsearch Output
  # output.elasticsearch:
    # Array of hosts to connect to.
    # hosts: ["localhost:9200"]

    # Protocol - either `http` (default) or `https`.
    # protocol: "https"

    # Authentication credentials - either API key or username/password.
    # api_key: "id:api_key"
    # username: "elastic"
    # password: "changeme"
- Setup Logstash Hosts
  # Specify Logstash IP and port
  output.logstash:
    hosts: ["localhost:5044"]
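If Logstash runs on more than one machine, Filebeat can spread events across them with the `loadbalance` option. A hedged sketch (the two hostnames are hypothetical):

```yaml
output.logstash:
  # Hypothetical Logstash hosts; loadbalance distributes events across them
  hosts: ["logstash-1.local:5044", "logstash-2.local:5044"]
  loadbalance: true
```

Without `loadbalance: true`, Filebeat picks one host at random and only fails over when it becomes unreachable.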
Configure Logstash¶
File: ./config/<filename>.conf
- Setup Inputs
  input {
    beats {
      # If Filebeat runs on a different machine from Logstash, the host can be commented out.
      host => "localhost"
      port => 5044
    }
  }
- Setup Filters
  filter {
    # Drop header lines (lines beginning with '#') at the top of the log file
    if [message] =~ /^#/ {
      drop { }
    }
    # Match the type tag set in the Filebeat configuration
    if [type] in ["type_a","type_b"] {
      csv {
        columns => ["ts","counter","speed","temperature"]
        # Delimiter: a literal tab character
        separator => "	"
      }
      mutate {
        convert => {
          # Data types for each column
          "speed" => "float"
          "temperature" => "float"
        }
      }
      date {
        # Parse the timestamp column as a UNIX epoch
        match => ["ts", "UNIX"]
      }
    }
  }
- Setup Output
  output {
    elasticsearch {
      hosts => ["localhost:9200"]
      # Index to reference in Kibana
      index => "<a string>"
    }
  }
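Putting the three parts together, a complete pipeline file might look like the sketch below. The index name `demo-telemetry` and the single `type_a` tag are hypothetical examples, not values the guide prescribes:

```conf
input {
  beats {
    port => 5044
  }
}

filter {
  # Drop header lines beginning with '#'
  if [message] =~ /^#/ { drop { } }
  if [type] == "type_a" {
    csv {
      columns => ["ts","counter","speed","temperature"]
      # Delimiter: a literal tab character
      separator => "	"
    }
    mutate {
      convert => { "speed" => "float" "temperature" => "float" }
    }
    date { match => ["ts", "UNIX"] }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "demo-telemetry"
  }
}
```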
Configure Kibana¶
File: ./config/kibana.yml
- Set server host
  server.host: "localhost"

  # Or, to allow the Kibana server to be reached via its IP:
  # server.host: "0.0.0.0"
Start Kibana Visualisation¶
Start the ELK stack to enable the pipeline, and start the visualisation.
- Start Elasticsearch
# Go to the Elasticsearch directory, in a separate terminal
./bin/elasticsearch
- Start Kibana
# Go to the Kibana directory, in a separate terminal
./bin/kibana
- Start Filebeat
# Go to the Filebeat directory, in a separate terminal
./filebeat -c filebeat.yml --path.data data/registry/filebeat/
- Start Logstash
# Go to the Logstash directory, in a separate terminal
sudo ./bin/logstash -f <the config file, e.g. logstash-sample.conf> --config.reload.automatic
- Open Kibana in a browser
Open a browser and go to localhost:5601 or <kibana server IP>:5601
Customisations¶
Customise Kibana Canvas¶
Import the <saved-canvas>.ndjson file into Kibana Canvas, or create a new canvas or dashboard.
Index Management¶
Go to Stack Management > Index Management.
In this section, you can manage the indices, for example by attaching a lifecycle policy to control how the stored data is retained.
Index Lifecycle Policy¶
Go to Stack Management > Index Lifecycle Policies.
In this section, you can add, edit, or delete index lifecycle policies.
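A lifecycle policy can also be created via the Elasticsearch API, e.g. from Kibana's Dev Tools console. The sketch below is a hedged example (the policy name and the 1-day/30-day thresholds are hypothetical) that rolls indices over daily and deletes data after 30 days:

```conf
PUT _ilm/policy/demo-cleanup-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_age": "1d" }
        }
      },
      "delete": {
        "min_age": "30d",
        "actions": { "delete": {} }
      }
    }
  }
}
```

Once created, the policy appears under Stack Management > Index Lifecycle Policies, where it can be attached to the indices written by Logstash.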