Deploy Kafka + Filebeat + ELK – Docker Edition – Part 2

Introduction

This article is the last part of a two part series where we will deploy ELK stack using docker/docker-compose.
In this article, we will be configuring Logstash, Elasticsearch and Kibana. If you haven't gone through the previous article, check out this link

Logstash

Logstash is a server-side data processing pipeline that consumes data from different sources and sends it to Elasticsearch. We touched on its importance when comparing it with Filebeat in the previous article.
To set up Logstash, we will add three components:

  • a pipeline config - logstash.conf
  • a setting config - logstash.yml
  • docker-compose file

The pipeline configuration describes your input (Kafka in our case), any filtering that needs to be done, and your output (Elasticsearch).
Create a folder named pipeline and add this configuration file to it.

Logstash pipeline - logstash.conf

input {
    kafka {
        bootstrap_servers => "KAFKA_SERVERS_IP:KAFKA_SERVERS_PORT"
        topics => ["applogs"]
    }
}
filter {
    json {
        source => "message"
    }
}
## Add your filters / logstash plugins configuration here
output {
    elasticsearch {
        hosts => ["ELASTICSEARCH_IP:9200"]
        user => "elastic"
        password => "somesecretpassword"
    }
}

As you can see, in the input section we are consuming from Kafka on the topic applogs.
In my case, I have added a json filter that parses the JSON string stored in the field named "message" into structured fields. There are several filter plugins to choose from.
The output is sent to Elasticsearch, with a username and password configured for authentication.
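
If you want to go a bit further, you could also drop the raw message field once it has been parsed, and write to a dedicated index instead of the default logstash-* one. The snippet below is only an optional sketch of those tweaks; the applogs-%{+YYYY.MM.dd} index name is my own choice, not something the rest of the setup depends on.

filter {
    json {
        source => "message"
    }
    ## optional: the parsed fields make the raw string redundant
    mutate {
        remove_field => ["message"]
    }
}
output {
    elasticsearch {
        hosts => ["ELASTICSEARCH_IP:9200"]
        user => "elastic"
        password => "somesecretpassword"
        ## optional: daily index instead of the default logstash-*
        index => "applogs-%{+YYYY.MM.dd}"
    }
}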
Create a folder named settings and add this configuration file to it.

logstash settings - logstash.yml

http.host: "0.0.0.0"
path.config: /usr/share/logstash/pipeline
path.logs: /var/log/logstash
config.reload.automatic: true
log.level: debug
xpack.monitoring.enabled: false

The docker-compose file looks like this:

logstash docker-compose

version: '2'

services:
  logstash:
    image: docker.elastic.co/logstash/logstash:7.13.2
    ports:
      - "10000:10000"
    volumes:
      - ./settings/:/usr/share/logstash/config/
      - ./pipeline/:/usr/share/logstash/pipeline/
    container_name: logstash                          

Here, the configuration files described above are mounted into the container.

To run the above file

docker-compose up -d
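
Logstash can take a little while to come up; if you want to watch what it is doing, follow the container's logs:

docker-compose logs -f logstash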

At this point Logstash will log errors, since we have not set up Elasticsearch yet. Now let's go ahead and install Elasticsearch and Kibana.

Elasticsearch and Kibana

Kibana is an open-source user interface that helps you monitor and visualise data, which in our case is provided by Elasticsearch.
Elasticsearch, as the name suggests, is a search and analytics engine for many different types of data.

Elasticsearch and Kibana docker-compose

version: "2"
services:
 elasticsearch:
  image: "docker.elastic.co/elasticsearch/elasticsearch:7.13.2"
  container_name: elasticsearch
  environment:
   - discovery.type=single-node
   - cluster.routing.allocation.disk.threshold_enabled=true
   - cluster.routing.allocation.disk.watermark.low=65%
   - cluster.routing.allocation.disk.watermark.high=70%
   - xpack.security.enabled=true
   - xpack.security.audit.enabled=true
   - ELASTIC_PASSWORD=somesecretpassword
   - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
   - bootstrap.memory_lock=true

  ulimits:
    memlock:
      soft: -1
      hard: -1
  volumes:
    - ./data:/usr/share/elasticsearch/data
  ports:
   - "9200:9200"
  networks:
   - eknetwork

 kibana:
  depends_on:
   - elasticsearch
  image: "docker.elastic.co/kibana/kibana:7.13.2"
  container_name: kibana
  ports:
   - "5601:5601"
  environment:
   - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
   - ELASTICSEARCH_USERNAME=elastic
   - ELASTICSEARCH_PASSWORD=somesecretpassword
  networks:
   - eknetwork

networks:
 eknetwork:
  driver: bridge

Make sure the Elasticsearch password here is the same as the one you provided in your Logstash pipeline configuration file.
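
Depending on your host, you may also need to raise vm.max_map_count, since the Elasticsearch Docker documentation recommends at least 262144; if the elasticsearch container keeps exiting at startup, this is worth checking:

sysctl -w vm.max_map_count=262144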

To run the above file

docker-compose up -d
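
Once both containers are up, you can quickly confirm that Elasticsearch is reachable and that authentication works; replace the password below with whatever you configured:

curl -u elastic:somesecretpassword http://localhost:9200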

Now that all our services are running, you need to configure an index pattern from the Kibana UI so that you can see your data. If you have done this setup on a remote server and want to reach the UI right away, I would suggest creating a firewall rule that allows your public IP on port 5601:

ufw allow from YOUR_PUBLIC_IP to any port 5601

Now go to your favorite browser, enter your server's IP and port 5601, and you will see something like this:

Home page of Kibana
Enter the credentials as specified in the docker-compose above and you should be able to log in to the application.

And voila! Your setup is complete. Just don't forget to add log rotation to your Docker containers.
I have added all the configuration and Docker files here
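
If you are unsure how to set that up, one common approach is to cap the json-file logging driver per service in your docker-compose files; the limits below are just illustrative values:

# add under each service in your docker-compose files
logging:
  driver: "json-file"
  options:
    max-size: "10m"   # rotate once a log file reaches 10 MB
    max-file: "5"     # keep at most five rotated files per container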

