
Three-Node ELK Cluster with X-Pack

ELK stands for Elasticsearch, Logstash, and Kibana. The trio, once three separate projects, came together to let users run log analysis on top of open-source software that anyone can run for free. X-Pack is an Elastic Stack extension that bundles security, alerting, monitoring, reporting, and graph capabilities into one easy-to-install package. While ELK itself is open source, X-Pack requires a license; its 30-day trial license gives access to all of these features.

The architecture of the three-node ELK cluster with X-Pack looks like this:

Fig. 1: Three-node ELK cluster

In the proposed architecture,

  1. Multiple machines are configured with Filebeat to ship their logs to Logstash (Node-2).
  2. Logstash receives the logs from the Filebeat-configured machines, enriches and formats the data with the defined filters, and sends it to the Elasticsearch node (Node-3) for storage and indexing.
  3. Kibana (Node-1) then reads that data from Elasticsearch and displays the logs on screen.

Cluster Working in Detail

1. Cluster Creation

To create the ELK cluster, we install Elasticsearch on all three nodes for data resilience: even if one node fails, the other two keep our data safe and highly available. Kibana is installed on Node-1 alongside Elasticsearch, and Logstash on Node-2 alongside Elasticsearch. Node-3 runs Elasticsearch alone and serves purely as the data store and indexing node. Because Elasticsearch is present on all three servers, the nodes discover each other and replicate data automatically, giving us a fault-tolerant environment. To extend this further, we can set the number of indices, shards, and replicas. A sketch of the node configuration follows.
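
As an illustration only, here is a minimal elasticsearch.yml sketch for Node-1; the cluster name, node names, and 10.0.0.x addresses are placeholders, and the discovery settings assume Elasticsearch 6.x (7.x replaces discovery.zen.* with discovery.seed_hosts and cluster.initial_master_nodes). Node-2 and Node-3 differ only in node.name and network.host.

    # /etc/elasticsearch/elasticsearch.yml on Node-1 (sketch, placeholder values)
    cluster.name: elk-cluster                  # identical on all three nodes
    node.name: node-1                          # node-2 / node-3 on the other machines
    network.host: 10.0.0.1                     # this node's own IP
    http.port: 9200

    # Elasticsearch 6.x discovery: list every node so they can form one cluster
    discovery.zen.ping.unicast.hosts: ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
    discovery.zen.minimum_master_nodes: 2      # quorum for 3 master-eligible nodes

    # X-Pack security (part of the 30-day trial license)
    xpack.security.enabled: true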

2. Communication between Filebeat and Logstash

After setting up the ELK cluster, we are ready to send some logs from the Filebeat servers to Logstash. Filebeat offers two output options for shipping logs: “Elasticsearch” and “Logstash”; we will send our logs to Logstash.

To send logs to Logstash, comment out the Elasticsearch output section in filebeat.yml, uncomment the Logstash output section, and add the IP address of the Logstash node (Node-2), as in the sketch below.
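
A minimal filebeat.yml sketch, assuming Filebeat 6.3 or later and using 10.0.0.2 as a placeholder for Node-2; adjust the paths to the logs you actually want to ship.

    # /etc/filebeat/filebeat.yml (sketch, placeholder values)
    filebeat.inputs:
      - type: log
        enabled: true
        paths:
          - /var/log/*.log               # logs to ship

    # Elasticsearch output stays commented out
    #output.elasticsearch:
    #  hosts: ["localhost:9200"]

    # Logstash output points at Node-2
    output.logstash:
      hosts: ["10.0.0.2:5044"]           # default Beats port on the Logstash node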

3. Communication between Logstash and Elasticsearch (Node-3)

After receiving logs from Filebeat, Logstash enriches the data and applies filters to make it more readable and easier to segregate, then sends those logs to Elasticsearch (Node-3) for storage and indexing.

Note: We point Logstash at the Elasticsearch instance on Node-3 because that node runs only Elasticsearch, so it answers our queries faster and replicates data rapidly across the cluster. A sketch of the Logstash pipeline follows.
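
A sketch of the Logstash pipeline configuration under these assumptions: the beats input listens on the default port 5044, the grok filter is only an example (the real filter depends on your log format), 10.0.0.3 is a placeholder for Node-3, and the credentials are placeholders for whatever X-Pack users you have set up.

    # /etc/logstash/conf.d/beats-pipeline.conf (sketch, placeholder values)
    input {
      beats {
        port => 5044                            # receives events from Filebeat
      }
    }

    filter {
      grok {
        # Example filter only; replace with patterns matching your logs
        match => { "message" => "%{COMBINEDAPACHELOG}" }
      }
    }

    output {
      elasticsearch {
        hosts    => ["http://10.0.0.3:9200"]    # Elasticsearch on Node-3
        index    => "filebeat-%{+YYYY.MM.dd}"   # daily indices
        user     => "elastic"                   # X-Pack credentials (placeholders)
        password => "changeme"
      }
    }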

4. Communication between Elasticsearch (Node-3) and Kibana

After performing the above steps, we configure Kibana to read data from Node-3 by adding its IP address to the kibana.yml file. Once that is done, and if everything is okay, we should see the Kibana dashboard on port 5601.
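
A minimal kibana.yml sketch for Node-1, again with placeholder addresses and assuming a 6.x release (in 7.x, elasticsearch.url becomes elasticsearch.hosts and takes a list):

    # /etc/kibana/kibana.yml on Node-1 (sketch, placeholder values)
    server.host: "0.0.0.0"                      # listen on all interfaces
    server.port: 5601
    elasticsearch.url: "http://10.0.0.3:9200"   # Elasticsearch on Node-3
    elasticsearch.username: "kibana"            # X-Pack credentials (placeholders)
    elasticsearch.password: "changeme"

Browsing to http://<node-1-ip>:5601 should then bring up the Kibana dashboard.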


