MuleSoft + ELK

ANUPAM GOGOI
Nov 13, 2022


Introduction

In my previous post, I explained how to send logs from Mule applications to Splunk.

Now, in this article, I will explore how to send logs from a Mule application to ELK.

Article Navigation

I will divide the article into two parts:

ELK installation

Configure Mule applications with ELK

If you are familiar with ELK, you can happily skip the first part.

ELK Installation

I will go with the most basic installation of the ELK stack on my system using Docker. For the sake of simplicity, I have created a Docker Compose file that includes Elasticsearch and Kibana. We can skip Logstash, as we are not going to use it in this demo.

Below is the simplest Docker Compose file.
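Here is a minimal sketch of such a file, assuming a single-node Elasticsearch 8.x with security disabled; the image versions are illustrative, and you may prefer newer tags.

version: "3.8"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.5.0
    environment:
      # Single-node cluster with no authentication: fine for a local demo only
      - discovery.type=single-node
      - xpack.security.enabled=false
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:8.5.0
    environment:
      # Kibana reaches Elasticsearch over the Compose network
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch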

Execute the command below and your EK (Elasticsearch + Kibana) stack is ready on your machine.

docker-compose up -d

Take note of the ports:

9200: Elasticsearch container port
5601: Kibana container port
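
To confirm Elasticsearch is up, you can hit its root endpoint; it returns a JSON document with the cluster name and version:

curl http://localhost:9200/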

ELK Navigation

Let's play for a while with the beautiful Kibana interface. Browse to the endpoint http://localhost:5601/ and you will land on the page below.

ELK

You don't need a login or password, as we have not configured any security for this basic setup.

Testing the ELK

Now, let's send some dummy data to check if it's working.

curl --location --request POST 'http://localhost:9200/customer/_doc' \
--header 'Content-Type: application/json' \
--data-raw '{
  "firstname": "Anupam",
  "lastname": "Gogoi"
}'

The above HTTP request should create an index called customer automatically and insert the mock payload. Let's verify it.
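Before switching to the GUI, you can also confirm it from the terminal with a quick search against the index:

curl 'http://localhost:9200/customer/_search?pretty'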

Go to the Stack Management section in the GUI.

Home → Management → Stack Management

Stack Management

Then click the Index Management section.

Index Management

You can see the index customer created automatically for us. How cool!

Create a data view

Now, let's visualize the data sent to the index. From the Home menu, go to

Management → Stack Management → Kibana → Data Views

Data Views

Then create a view and select the index whose data you want to view.

Create View

Give it a name and choose the index pattern you want to visualize in the data view.

Data View

Save and you are done.

Visualize data

Go to Home → Analytics → Discover

Discover

Select the data view created and you will be able to view your data.

Configure Mule applications with ELK

This is the simplest part of all. You don't need to do anything except add the configuration below to your application's log4j2.xml.
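A minimal sketch of such a configuration, assuming the stock Log4j 2 Http appender and JsonLayout posting each log event straight to the Elasticsearch index endpoint; the appender name and log level are illustrative:

<!-- Inside the <Appenders> section of log4j2.xml -->
<Http name="ELK" url="http://localhost:9200/mule-app-ag-mock/_doc">
    <!-- JsonLayout serializes each log event as a JSON document -->
    <JsonLayout compact="true" properties="true"/>
</Http>

<!-- Inside the <Loggers> section, alongside your existing appender references -->
<AsyncRoot level="INFO">
    <AppenderRef ref="ELK"/>
</AsyncRoot>

Note that the Http appender issues one HTTP POST per log event, which is fine for a demo; for production volumes you would typically ship logs in batches, for example through Logstash or a file shipper.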

Note that we are now sending logs from the Mule application to the index mule-app-ag-mock.

As a suggestion, use your application's name as the index name so you can locate your application's logs easily.

Now, run the Mule application from anywhere that has access to the ELK server.
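For a quick test, any flow that logs something will do. Here is a hypothetical Mule 4 flow that emits a log line every 10 seconds:

<flow name="elk-demo-flow">
    <scheduler>
        <scheduling-strategy>
            <fixed-frequency frequency="10000"/>
        </scheduling-strategy>
    </scheduler>
    <!-- Each message reaches Elasticsearch via the Http appender above -->
    <logger level="INFO" message="Hello from Mule to ELK"/>
</flow>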

Next Steps

I will leave these next steps for you to do, following the first part of the article:

Check if the index mule-app-ag-mock was created (see the quick check after this list).

Create a Data View for the index mule-app-ag-mock.
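
For the first check, listing the index from the terminal is a quick alternative to the GUI (assuming the defaults used above):

curl 'http://localhost:9200/_cat/indices/mule-app-ag-mock?v'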

Below is the final result.

You can see that the Mule application sent its logs to the index mule-app-ag-mock, and we are viewing them in the data view named mule-app-ag-mock that I created.

You can add your own custom Elasticsearch expressions to query data. I will not delve into this as it's out of the scope of this article.

Conclusion

In this article, I have demonstrated how quickly an ELK stack can be configured and how a Mule application can send logs to it. In future articles, I will bring more insights into the logging aspect.

Until then, happy learning & sharing.
