ELK logger and alert system in Node.js
We can ensure the performance and availability of our application by analyzing the data it generates. This data could be event logs or metrics, which enable monitoring, identification, and resolution of issues; this is where centralized log management and analytics solutions such as the ELK Stack come into the picture. The ELK Stack comprises three open-source products: Elasticsearch, Logstash, and Kibana. Logstash extracts logging data from different input sources, transforms it, and sends it to a stash such as Elasticsearch. Kibana is a web interface that accesses the logging data from Elasticsearch and visualizes it.
Elasticsearch: This is essentially a NoSQL database specialized in search, so it acts as the main storage for log events, making them easy to search and retrieve later on. Elasticsearch provides near real-time search and analytics for different types of data. It can efficiently store and index structured, unstructured, numerical, or geospatial data in a way that supports fast searches. The steps for setting up and running Elasticsearch are:
- Download the latest version of Elasticsearch.
- Run elasticsearch.bat from the command prompt. Elasticsearch can then be accessed at localhost:9200, as the quick check below shows.
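As a quick sanity check that the node is up, the root endpoint can be queried from Node.js. This is only a minimal sketch using the built-in http module, assuming the default port 9200:

```javascript
// check-es.js — verify that Elasticsearch responds on localhost:9200
const http = require('http');

http
  .get('http://localhost:9200', (res) => {
    let body = '';
    res.on('data', (chunk) => (body += chunk));
    // The root endpoint returns cluster name and version info as JSON
    res.on('end', () => console.log(JSON.parse(body)));
  })
  .on('error', (err) =>
    console.error('Elasticsearch is not reachable:', err.message)
  );
```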
Logstash: Logstash can be considered a log aggregator that collects data from various input sources, executes different transformations, and ships the data to various supported output destinations. The steps for setting up and running Logstash are:
- Download and unzip Logstash
- Prepare a logstash.conf config file. Logstash configuration files use their own plugin-based syntax (not JSON) and in this setup reside in ./logstash-7.9.3/bin. A Logstash configuration involves three sections: input, filter, and output. Let's create a configuration file called logstash.conf and set up a tcp input, as sketched below.
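A minimal sketch of such a configuration might look like the following; the port number (5000), the json_lines codec, and the index name are assumptions made for illustration:

```
input {
  tcp {
    port => 5000            # assumed port the Node.js app will write to
    codec => json_lines     # one JSON event per line
  }
}

filter {
  # transformations would go here; left empty in this sketch
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # the local node started earlier
    index => "node-app-logs"      # assumed index name
  }
  stdout { codec => rubydebug }   # also print events to the console for debugging
}
```

Logstash can then be started against this file with bin/logstash -f logstash.conf.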
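With the tcp input listening, a Node.js application can ship log events to Logstash. A logging library could also be used for this; the sketch below relies only on Node's built-in net module, and the host, port, and field names are assumptions matching the configuration above:

```javascript
// ship-log.js — send a single JSON log event to the Logstash tcp input
const net = require('net');

const client = net.createConnection({ host: 'localhost', port: 5000 }, () => {
  const event = {
    '@timestamp': new Date().toISOString(),
    level: 'info',
    service: 'demo-app', // hypothetical service name
    message: 'Application started'
  };
  // The json_lines codec expects one JSON document per line
  client.write(JSON.stringify(event) + '\n');
  client.end();
});

client.on('error', (err) =>
  console.error('Could not reach Logstash:', err.message)
);
```

Once events flow through this pipeline, they are stored in Elasticsearch and can be explored and visualized in Kibana.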