Logging using ELK for MuleSoft
- April 13, 2020
What is ELK?
ELK is the acronym for three open source projects: Elasticsearch, Logstash, and Kibana.
Elasticsearch is an open source, full-text search and analytics engine based on the Apache Lucene search engine. Logstash is a server-side data-processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a "stash" such as Elasticsearch. Kibana lets users visualize the data stored in Elasticsearch with charts and graphs.
Why do we need a system like ELK?
- Log aggregation and efficient searching across all servers in one place
- Generic, full-text search over the aggregated logs
There are three main reasons to choose ELK:
- It’s interoperable
- It’s open source
- It’s managed
How to download ELK?
ELK and Filebeat can be downloaded from the locations below:
Elasticsearch: https://www.elastic.co/downloads/elasticsearch
Kibana: https://www.elastic.co/downloads/kibana
Logstash: https://www.elastic.co/downloads/logstash
Filebeat: https://www.elastic.co/downloads/beats/filebeat
ELK general architecture
In general, data flows through the ELK stack as Filebeat → Logstash → Elasticsearch → Kibana:
- Filebeat polls the log file and sends the data to Logstash
- Logstash filters and processes the data, then sends it to Elasticsearch
- Elasticsearch indexes the data and stores it in a persistent store
- Kibana pulls data on demand to build graphs, charts, and reports
In an enterprise with more than one server, a typical ELK deployment puts a shipper on every machine. Because Logstash is heavy on resources, we run the lightweight Filebeat on each server and have it push the data to a central Logstash instance.
Let's integrate ELK with MuleSoft:
- Install ELK and Filebeat on your local system
- Start Elasticsearch
- Open http://localhost:9200 in a browser; if Elasticsearch is running correctly, it returns a JSON document describing the cluster
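The response looks roughly like the sketch below; the node name, cluster UUID, and build details are placeholders and will differ on your machine (the version shown assumes the 7.6.1 release used later in this post):

```json
{
  "name" : "MY-PC",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "placeholder-uuid",
  "version" : {
    "number" : "7.6.1",
    "lucene_version" : "8.4.0"
  },
  "tagline" : "You Know, for Search"
}
```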
- Start Kibana
- Open Kibana in browser (http://localhost:5601)
- Create a Logstash configuration file (e.g., logstash-beat.conf)
- The beats input section specifies the port Logstash will listen on
- The elasticsearch output section specifies the Elasticsearch server Logstash forwards the data to
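A minimal sketch of such a configuration is below; the port, host, and index pattern are the conventional Beats-to-Logstash defaults, not values confirmed by the original post, so adjust them to your setup:

```
# logstash-beat.conf -- minimal sketch
input {
  beats {
    port => 5044                      # port Logstash listens on for Filebeat
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]       # Elasticsearch server to forward data to
    # default Beats index naming, which produces indices like
    # filebeat-7.6.1-2020.03.30
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
```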
- Run Logstash with the configuration created earlier
- logstash.bat -f logstash-beat.conf
- Create a Filebeat configuration file (filebeat.yml)
- The paths entry specifies the log file to poll
- You can add more log files under paths to poll them with the same Filebeat
- The multiline.pattern setting specifies the pattern that identifies the start of each log entry
- The multiline.negate and multiline.match settings are required when a single log entry spans more than one line
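The settings above can be sketched as follows; the log path and the multiline pattern are assumptions for illustration and must match your Mule application's actual log location and log4j2 layout:

```yaml
# filebeat.yml -- minimal sketch
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\MuleProjects\my-app\logs\my-app.log   # hypothetical Mule log path
  # assume a new entry starts with a log level; depends on your log4j2 pattern
  multiline.pattern: '^(INFO|WARN|ERROR|DEBUG|TRACE)'
  multiline.negate: true
  multiline.match: after   # non-matching lines are appended to the previous entry

output.logstash:
  hosts: ["localhost:5044"]   # must match the Logstash beats input port
```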
- Run Filebeat with configuration created earlier
- filebeat.exe -c filebeat.yml
- Now go to Kibana (http://localhost:5601) -> Management -> Index Patterns
- Click on Create Index Pattern
- You can see that a new index, filebeat-7.6.1-2020.03.30, has been created. This index comes from the index setting in the elasticsearch output of the Logstash configuration. Select it and click Next Step
- Click the dropdown, select @timestamp, and click Create Index Pattern
- Start the Mule application whose log file you configured in the Filebeat configuration
- Run a few test cases so the Mule application generates log entries
- Go to Kibana (http://localhost:5601) -> Discover
- Select the index pattern created in the previous step
- In the search bar, you can write any suitable expression to find specific text in the log file
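For example, a few hypothetical Kibana (KQL) queries: the first finds entries containing ERROR, the second requires both terms, and the third restricts by date (the field name message and the search terms are illustrative, not from the original post):

```
message: "ERROR"
message: "ERROR" and message: "payment"
@timestamp >= "2020-03-30"
```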
Reference material
- Installing the Elastic Stack on Windows (https://logz.io/blog/elastic-stack-windows/)
- The Complete Guide to the ELK Stack (https://logz.io/learn/complete-guide-elk-stack/#installing-elk)
- File Beat + ELK (Elastic, Logstash and Kibana) Stack to index logs to Elasticsearch (https://www.javainuse.com/elasticsearch/filebeat-elk)
— By Mohammad Mazhar Ansari