In this article, I'll show you how to create a dashboard in Kibana to visualize your application logs, assuming you are already using Elasticsearch to store them.
Step 1 - Set up Elasticsearch and Kibana
I use Docker to run instances of Elasticsearch and Kibana.
- Create a `docker-compose.yml` file and add the following content:
version: "3.0"
services:
  elasticsearch:
    container_name: es-container
    image: docker.elastic.co/elasticsearch/elasticsearch:7.12.0
    environment:
      - xpack.security.enabled=false
      - "discovery.type=single-node"
    networks:
      - es-net
    ports:
      - 9200:9200
  kibana:
    container_name: kb-container
    image: docker.elastic.co/kibana/kibana:7.12.0
    environment:
      - ELASTICSEARCH_HOSTS=http://es-container:9200
    networks:
      - es-net
    depends_on:
      - elasticsearch
    ports:
      - 5601:5601
networks:
  es-net:
    driver: bridge
- Run the `docker-compose up` command

Now you should have access to Elasticsearch via http://localhost:9200/ and Kibana through http://localhost:5601/
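If you'd rather verify the stack from code instead of the browser, a minimal C# sketch like the one below is enough; it only assumes the two ports mapped in the compose file above.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class StackCheck
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // Hit the two endpoints exposed by docker-compose and print the HTTP status.
        foreach (var url in new[] { "http://localhost:9200/", "http://localhost:5601/" })
        {
            var response = await client.GetAsync(url);
            Console.WriteLine($"{url} -> {(int)response.StatusCode} {response.StatusCode}");
        }
    }
}
```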
Step 2 - Set up an ASP.NET Core Web API project
If you are not a .NET developer, the details of this step are optional; just create an application that stores some logs in Elasticsearch.
- Open Visual Studio and create a Web API project
- Install the Serilog.AspNetCore and Serilog.Sinks.Elasticsearch NuGet packages
- Open the `Program.cs` file, modify the `CreateHostBuilder` method, and add the `UseSerilog` extension method:
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
        })
        .UseSerilog((hostingContext, loggerConfiguration) =>
            loggerConfiguration.ReadFrom.Configuration(hostingContext.Configuration));
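Note: if your project targets .NET 6 or later with the minimal hosting model, there is no `CreateHostBuilder` method or `Startup` class. A rough equivalent, assuming the same two Serilog packages, goes straight into `Program.cs`:

```csharp
using Serilog;

var builder = WebApplication.CreateBuilder(args);

// Read the Serilog configuration (minimum levels, enrichers, Elasticsearch sink)
// from appsettings.json, exactly as in the CreateHostBuilder version above.
builder.Host.UseSerilog((context, loggerConfiguration) =>
    loggerConfiguration.ReadFrom.Configuration(context.Configuration));

builder.Services.AddControllers();

var app = builder.Build();

// Emits one summary log entry per HTTP request
// (added in Startup.Configure in the classic hosting model).
app.UseSerilogRequestLogging();

app.MapControllers();
app.Run();
```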
- Open the `appsettings.json` file and remove the default `Logging` section:
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft": "Warning",
"Microsoft.Hosting.Lifetime": "Information"
}
}
- Add a `Serilog` configuration section to the `appsettings.json` file:
"Serilog": {
"MinimumLevel": {
"Default": "Debug",
"Override": {
"Microsoft": "Information",
"System": "Information"
}
},
"Enrich": [ "FromLogContext" ],
"WriteTo": [
{
"Name": "Elasticsearch",
"Args": {
"nodeUris": "http://localhost:9200",
"indexFormat": "demo-api-{0:yyyy.MM}",
"autoRegisterTemplate": true,
"autoRegisterTemplateVersion": "ESv7"
}
}
]
}
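If you prefer configuring the sink in code instead of `appsettings.json`, the same settings map onto `ElasticsearchSinkOptions` from Serilog.Sinks.Elasticsearch. A sketch with the same node URI and index format assumed above:

```csharp
using System;
using Serilog;
using Serilog.Events;
using Serilog.Sinks.Elasticsearch;

Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Debug()
    .MinimumLevel.Override("Microsoft", LogEventLevel.Information)
    .MinimumLevel.Override("System", LogEventLevel.Information)
    .Enrich.FromLogContext()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        // One index per month, e.g. demo-api-2021.04
        IndexFormat = "demo-api-{0:yyyy.MM}",
        AutoRegisterTemplate = true,
        AutoRegisterTemplateVersion = AutoRegisterTemplateVersion.ESv7
    })
    .CreateLogger();
```

Keeping the configuration in `appsettings.json`, as this article does, lets you change log levels and sink settings without recompiling.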
- Open the `Startup` class and add the following code in the `Configure` method:
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // ....
    app.UseSerilogRequestLogging();
}
- Run the API project and make several API calls through Swagger (if your API barely logs anything yet, see the controller sketch below)
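The default Web API template logs very little on its own. To give the `level` and `SourceContext` fields used by the visualizers later some variety, a throwaway controller such as this hypothetical `DemoController` works:

```csharp
using System;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

// Hypothetical controller used only to generate sample log entries.
[ApiController]
[Route("[controller]")]
public class DemoController : ControllerBase
{
    private readonly ILogger<DemoController> _logger;

    public DemoController(ILogger<DemoController> logger)
    {
        _logger = logger;
    }

    [HttpGet]
    public IActionResult Get()
    {
        // Each call writes entries at different levels; Serilog also records
        // the logger category (DemoController) as the SourceContext field.
        _logger.LogDebug("Handling a demo request");
        _logger.LogInformation("Demo request handled at {Time}", DateTime.UtcNow);
        _logger.LogWarning("This is a sample warning");
        _logger.LogError("This is a sample error");
        return Ok();
    }
}
```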
Step 3 - Create an index pattern in Kibana
- In the Kibana dashboard, from the menu go to `Management -> Stack Management`
- On the `Stack Management` page, in the Kibana section, click on `Index Patterns`
- On the Create index pattern page you should see your index name `demo-api-2021.04` (if your logs were saved successfully in Elasticsearch). In the `Index pattern name` input enter your index pattern `demo-api-*` and click the `Next` button
- In the next step, in the `Time field` dropdown select `@timestamp` and click the `Create index pattern` button
- From the menu go to the `Analytics -> Discover` page and select your index to view the logs
Step 4 - Create visualizers
Before creating a dashboard, you need to create several visualizers to add to it.
From the menu click on `Analytics -> Visualize Library`, then click on `Create new visualization`.

In the `New visualization` popup window click on `Aggregation based`. (The easiest way to create a visualizer is `Lens`, but once you have created one visualizer with the aggregation-based method, you can create the others very quickly.)
Total logs metric visualizer
- Click on the `Metric` visualizer
- After choosing the `Metric` visualizer, click on your index `demo-api-*`
- By now I have 100 logs, so the metric shows a count of 100
- Now just save the visualizer: click the save button and name it `Total logs`
Log level metric visualizer
- Follow the above steps and create another metric visualizer
- In the `Buckets` section click on `Add`, then `Split group`
- From the `Aggregation` dropdown select `Terms`
- After selecting `Terms`, in the `Field` dropdown select `level.raw`, then click the `Update` button at the bottom

Now you have the count for each log level. Save the visualizer and name it `Log levels count`
Log level pie visualizer
- Create a new visualizer and choose `Pie`
- In the `Buckets` section click on `Add`, then `Split slices`
- In the `Aggregation` dropdown select `Terms`
- In the `Field` dropdown select `level.raw`
- Click the `Update` button at the bottom
- Save the visualizer and name it `Log levels percentage`

You can click on `Options` to change the appearance of the pie. You can also change the color of each level by clicking on its label.
Vertical bar visualizer
- Create a new visualizer and choose `Vertical bar`
- In the `Buckets` section click on `Add`, then `X-axis`
- In the `Aggregation` dropdown select `Date Histogram`
- Click the `Update` button at the bottom
- Save the visualizer and name it `Total logs bar`
Table visualizer
- Create a new visualizer and choose `Table`
- In the `Buckets` section click on `Add`, then `Split rows`
- In the `Aggregation` dropdown select `Terms`
- In the `Field` dropdown select `fields.SourceContext.raw`
- Click the `Update` button at the bottom
- Add another `Split rows` bucket
- In the `Aggregation` dropdown select `Terms`
- In the `Field` dropdown select `level.raw`
- Save the visualizer and name it `Log level sources`
Error log level search
- From the Analytics menu click on Discover
- In the KQL input enter `level.raw:"Error"`
- In the date filter choose `Today`
- Save the search and name it `Today error level search`
It's time to create a dashboard.
- From the Analytics menu click on Dashboard
- Click on the `Create new dashboard` button
- From the right sidebar add all the visualizers we created previously
- From the `Types` dropdown click on `Saved search`, then click on `Today error level search`
- Click on the `Save` button and save the dashboard
In the end, you can drag, drop, and resize the visualizers to arrange them however you like.