EFK is short for Elasticsearch, Fluentd, and Kibana: a stack for collecting, indexing, and visualizing logs. Widely used and well supported by the community, it is a solid choice for most logging needs.
In this blog, we'll learn to set up the latest version of EFK with the necessary security measures enabled. After setup, we'll create log views for our example Docker container.
Prerequisites
- Linux server with Docker pre-installed
- 6 GB RAM (at least)
Let's get started!
Setting Up the EFK Stack
Let's start off by setting up the Docker Compose file for the stack.
Create the parent directory using mkdir efk. Then save the contents of the codeblock below into a file named compose.yml.
Now create three separate directories to hold the config files for the components that make up the EFK stack.
mkdir -p {es,fluentd,kibana}
Create Elasticsearch config
In the es directory, create a file called elasticsearch.yml and save the following contents into it.
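The exact contents are not shown above; a minimal single-node configuration with security enabled could look like this (the cluster name is an assumption):

```yaml
cluster.name: efk-cluster        # name is an assumption
network.host: 0.0.0.0
discovery.type: single-node
xpack.security.enabled: true     # required for the password-setup step later
```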
Create Kibana configuration
In the kibana directory, create a file called kibana.yml and save the following contents into it.
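As a sketch, the Kibana configuration only needs to expose the UI and point at the Elasticsearch service (the hostname elasticsearch is the Compose service name):

```yaml
server.host: "0.0.0.0"
server.port: 5601
elasticsearch.hosts: ["http://elasticsearch:9200"]   # Compose service name
```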
Create the Fluentd Docker container
Create the following files: Dockerfile, entrypoint.sh, and fluentd.conf, and copy the contents below into each respectively.
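The file contents are not reproduced above. As a sketch: the Dockerfile typically installs the fluent-plugin-elasticsearch gem on top of the official fluent/fluentd image, and fluentd.conf might look like the following (the logstash_prefix of fluentd is what produces the fluentd-* indices used later in Kibana):

```
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<match *.**>
  @type elasticsearch
  host elasticsearch
  port 9200
  logstash_format true
  logstash_prefix fluentd    # produces indices named fluentd-YYYY-MM-DD
</match>
```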
Starting the Stack
Once all the config files and container images are ready, we can start the stack with the command,
docker compose up -d
Now that the stack is up, the Fluentd and Kibana containers will display some errors related to authentication with the elasticsearch container. That's because the elasticsearch container is configured to allow access only to clients that authenticate with a username and password. To fix these errors, we'll do the following:
1. Open a shell in the elasticsearch container
Get the container ID of elasticsearch using,
docker ps
and then open a shell inside it,
docker exec -it <elastic_container_id> bash
2. Generate passwords
bin/elasticsearch-setup-passwords auto
This command ships with Elasticsearch version 8 and can be used to generate authentication credentials. Running it prints a set of credentials for the different built-in users. Copy and store the credentials in a safe place.
3. Add credentials in the Fluentd and Kibana config files
In the fluentd.conf and kibana.yml files, add the username and password in the following way:
a) For fluentd.conf
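Something like the following, added to the match section of fluentd.conf (using the elastic superuser here is an assumption; any user with write access to the fluentd-* indices will do):

```
<match *.**>
  @type elasticsearch
  host elasticsearch
  port 9200
  user elastic
  password <password generated for the elastic user>
  logstash_format true
  logstash_prefix fluentd
</match>
```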
b) For kibana.yml
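In kibana.yml, Kibana should authenticate with the built-in kibana_system user whose password was generated in the previous step:

```yaml
elasticsearch.username: "kibana_system"
elasticsearch.password: "<password generated for the kibana_system user>"
```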
Now that our passwords are configured, let's restart the stack.
The configuration is stored in the Docker volume, so it survives the restart. However, if you delete the volume, the configuration will be deleted along with it.
Finally, we should be able to log in to the Elastic dashboard using the generated credentials. Also, Fluentd and Kibana should be working now.
Logging
Now let's check whether logs are being forwarded to Fluentd. We'll run an example container and push its logs to the stack using the Fluentd Docker logging driver.
Step 1: Run a container with logging enabled
Run a Docker container with the Fluentd logging driver enabled using the command below (replace fluentd-host with the address of the machine running Fluentd, e.g. localhost if it is the same host):
docker run -p 80:80 -itd --name nginx_container4 --log-driver=fluentd --log-opt fluentd-address=fluentd-host:24224 nginx:alpine
Step 2: Generate some noise
Use the curl command to send some requests to the container and generate noise in the logs.
curl localhost:80
Step 3: View the Logs
Now open the Kibana dashboard at http://localhost:5601 to set up a log view.
Click the hamburger menu on the left, and in the Analytics section, select the Discover subsection.
Here, we'll now create a log view.
Click on the leftmost button highlighted in blue in the dashboard and select the Create a data view button; this prompt appears when EFK is newly set up. Give the data view a name, and in the index pattern field, enter fluentd-*. Other fields can be left as is.
Now save the data view to Kibana. In my case, I had existing indices, which is why many appear in the list; in your case, there will only be one with the current date, i.e. fluentd-YYYY-MM-DD.
After saving the data view, we will be greeted with a view like this:
If you click on the filter tab, many options will be available. We will use the container_name filter to select the container we just ran and view its logs. Use this filter to show logs from the Nginx container only:
container_name : "/nginx_container4"
Now, we should see the logs generated by the Nginx Docker container in the Kibana dashboard.
This is just a simple use case of the EFK stack; it can be utilized further to build more informative and complex dashboards.
Conclusion
In this blog, we learned to set up an EFK stack, ship Docker container logs using the Fluentd Docker logging driver, and finally visualize those logs in our Kibana dashboard.
Subscribe, and leave a comment below if you have any queries. I try to update my articles periodically to keep them accurate and readable.