DAY 2
For day 2 I was introduced to the ELK Stack.
I'll be using this stack for the duration of the challenge, but before I dive in, let's get a better understanding of the ELK Stack.
What is the ELK Stack?
E - Elasticsearch: the database that stores the logs; queried with ES|QL and through a RESTful API using JSON (see the example query after this list)
L - Logstash: a pipeline that collects telemetry from various sources, transforms it, and sends it to your stash
K - Kibana: the web console for querying the logs stored in our Elasticsearch instance; data visualization, data exploration & geo mapping
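Since Elasticsearch is queried over a RESTful JSON API, here is a minimal sketch of what a search looks like; the index name (logs-sample) and field (event.action) are made-up placeholders, not anything from my setup:
# search an index for documents where event.action matches "login"
curl -X GET "localhost:9200/logs-sample/_search" -H 'Content-Type: application/json' -d'
{
  "query": { "match": { "event.action": "login" } }
}'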
Benefits of ELK
- Centralized logging: meets compliance requirements & makes log data searchable
- Flexibility: customized ingestion
- Visualizations: observe information at a glance
- Scalability: easy to configure to handle larger environments
- Ecosystem: many integrations and a rich community
Searching data, creating visualizations, creating reports, and creating alerts are the focus of the ELK Stack.
After getting a rundown of what the ELK Stack is, I noticed I'm already familiar with two of the three, those being Elasticsearch and Kibana.
I had access to both of these tools in my previous incident management role, often querying logs for event instances and parsing through the data.
This will be the first time I set up Elasticsearch & Kibana on my own.
DAY 3
Next, I got into the terminal, downloaded Elasticsearch, and configured the firewall.
Creating a Virtual Private Cloud
First I created a VPC 2.0 using Vultr; after that I spun up a server, opting for an Ubuntu image hosted within the network group.
Setting up the server was very easy, similar to spinning up servers in AWS or Azure.
Refer back to DAY 1: Logical Diagram to see my intended setup.
Installing Elasticsearch
From elastic.co I went to the downloads section and copied the link for the Debian (.deb) package.
The fun comes in when I SSH into the VM from the terminal to plug in the URL of the download.
ssh root@<public IP address>
Once in the VM I ran the following commands:
wget <URL copied from the Elasticsearch downloads page>
dpkg -i <Elasticsearch package file name & version>
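As a concrete sketch of those two commands (the URL and version below are placeholders following Elastic's artifacts naming pattern, not the exact ones I used; grab the real link from the downloads page):
# download the Elasticsearch .deb package
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-<version>-amd64.deb
# install it with dpkg
dpkg -i elasticsearch-<version>-amd64.deb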
After installing Elasticsearch I noted the security configuration info.
This contained my superuser password and steps on how to generate a Kibana token, enroll additional Elasticsearch nodes, and update the user password.
I copied all of this to a notepad for future reference.
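For reference, on recent Elasticsearch versions those steps boil down to commands like the ones below (paths assume the default .deb install; this is just a sketch, not the exact output I saved):
# reset the password for the elastic superuser
/usr/share/elasticsearch/bin/elasticsearch-reset-password -u elastic
# generate an enrollment token for Kibana
/usr/share/elasticsearch/bin/elasticsearch-create-enrollment-token -s kibana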
Configure Elasticsearch
Before getting started with Elasticsearch, I wanted to configure a few settings.
So I opened the elasticsearch.yml file.
nano /etc/elasticsearch/elasticsearch.yml
I updated network.host and uncommented http.port.
network.host points to the private IP I created in the VPC; this will allow my SOC computer to reach my Elasticsearch instance.
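A minimal sketch of the two relevant lines in elasticsearch.yml; the private address below is a placeholder, not my actual VPC IP:
# bind Elasticsearch to the VPC private address (placeholder)
network.host: <private VPC IP>
# uncommented so the HTTP API listens on the default port
http.port: 9200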
Firewall group
Lastly, I configured my firewall to make sure it's tightened up.
I went back to Vultr and, under Network, selected Firewall.
I created a firewall group for my server; by default the SSH source was set to Anywhere.
I updated this to My IP.
Now my VM has a firewall group that is only accessible via my IP address.
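Purely as a host-level equivalent (not something I configured here; the Vultr firewall group already handles it), the same restriction expressed with ufw on the VM would look roughly like this, with <my IP address> as a placeholder:
# allow SSH only from my IP, then enable the firewall
ufw allow from <my IP address> to any port 22 proto tcp
ufw enable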
Run Elasticsearch
Now let's run Elasticsearch!
I headed back to my terminal and ran the following commands:
sudo systemctl daemon-reload
sudo systemctl enable elasticsearch.service
sudo systemctl start elasticsearch.service
I wanted to check that I had successfully gotten it up and running.
sudo systemctl status elasticsearch.service
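As an extra sanity check beyond systemctl (not part of the original steps), you can also hit the REST API directly; on recent versions the .deb install drops a CA certificate at the path below, and you'll be prompted for the elastic password:
# confirm Elasticsearch answers over HTTPS on port 9200
curl --cacert /etc/elasticsearch/certs/http_ca.crt -u elastic https://<private VPC IP>:9200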
Takeaways
Days 2 & 3 were very informative on the ELK Stack, and I enjoyed playing around in the terminal. I often forget a few commands, so I keep a cheatsheet on hand.
Early in my tech career I would be frightened and unsure about the terminal and even scared to run commands, but now I enjoy being in there and getting messy.