The Ultimate Guide to Logging in Node.js: How to Get the Most Out of Your Logs


What you will learn in this article:

  • The importance of logging in software systems

  • Applications of logging

  • Logging server requests to a log file

  • Processing a log file using grep

  • Using hoppscotch.io for API testing

Introduction

Why would someone write so much about logs? It turns out that the humble log is an abstraction that is at the heart of a diverse set of systems, from NoSQL databases to cryptocurrencies. Yet other than perhaps occasionally tailing a log file, most engineers don’t think much about logs.

So what on earth is logging? 😪😪

As you know, backend development is an essential component of any software system, and logging is one of the most important aspects of successful backend development. Logging is the process of tracking and recording the activity of software systems, and it helps developers identify and troubleshoot issues as they arise. Logging also helps developers understand how their applications are being used, giving them valuable insights into their users' behavior. In this article, we'll look at the importance of logging in backend development, the use cases of logging, and how it can help developers build better applications.

Applications of Logging

Logs are essential in a software system, as they play an integral part in all stages of an application's lifecycle: development, testing, staging, and production. Some of the general applications of logs include the following:

1. Identifying errors within code: Logging helps identify errors within code by providing a detailed record of the program’s execution. Logging data can be used to help pinpoint where errors have occurred and identify the underlying causes.

2. Tracking requests for debugging: Logging can also be used to track changes in behavior over time and detect anomalies in the program’s execution. This can help identify scenarios where the program is not behaving as expected, which can then be investigated further to locate the source of the error.

3. Identifying user activity: Logging can help to identify user activity by providing a record of user interactions with a system. This record can help to track user behavior and identify suspicious activity such as unauthorized access, data manipulation, and malicious activity. Logging user activity can also help to audit user behavior and detect potential security threats.

4. Monitoring infrastructure: Observability systems like Datadog make use of logs under the hood. Logging can provide a comprehensive view of the status, performance, and health of the infrastructure. This view can be used to catch problems before they grow serious enough to affect infrastructure performance, and it can assist in tracking down the root cause of any issue that has already occurred. Logging can therefore be used as part of a comprehensive monitoring system to enable proactive action to prevent and resolve infrastructure problems.

5. Improving customer service: Customers are king for any business to flourish. Logging customer service interactions can be a powerful tool for improving customer service by allowing businesses to accurately track and analyze customer service data. This data can provide valuable insights into customer needs, preferences, and behaviors. Logging allows trends to be identified, such as common customer complaints, and addressed, thereby improving customer service.

Logging also enables businesses to track customer satisfaction levels, enabling them to make changes that are likely to increase customer satisfaction. Additionally, logging customer service interactions allows companies to better understand agents' performance and identify areas where additional training may be necessary.
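To make point 1 above concrete, here is a minimal, hypothetical sketch of an error log line with a timestamp and a severity level; real applications would typically reach for a logger like winston or pino rather than rolling their own:

```javascript
// Hypothetical minimal logger: prefixes each message with an ISO timestamp
// and a level, so errors can later be located and correlated in log output.
function logError(message) {
  const line = `${new Date().toISOString()} ERROR ${message}`;
  console.error(line);
  return line;
}

logError("Payment service timed out after 3000ms");
```

Even this tiny amount of structure, a timestamp plus a level, is what makes it possible to pinpoint when and where an error occurred.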

It’s Build O’clock🐱‍👤🐱‍👤

We will be building an Express application for a fictional SaaS startup, where we will use logs to measure the frequency of customer complaints and how many of their tickets get resolved.

The functional requirements of our application are as follows:

- Customers can send in complaints with a subject, some description, and a category

- Each complaint will have a ticket ID and a status (Open, Closed)

- A customer service agent can update the ticket status by getting the ID of the ticket

We need to use the logs to check the frequency of complaints and the number of closed tickets.

Technically, we need just two API endpoints: one for submitting a complaint and one for updating the status of a complaint. But we will add one more endpoint to get the list of complaints in our data store.

- GET /complaints
- POST /complaints body={subject, description, category}
- PUT /complaints/:id
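For illustration, the body of a POST /complaints request might look like this (the values here are made up):

```
{
  "subject": "Double charge on invoice",
  "description": "My card was charged twice for the March invoice",
  "category": "billing"
}
```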

To get started, we will create an empty GitHub repository and scaffold our Express application. For persisting the complaints, we will use an in-memory data structure instead of a database.

Also, we will be using morgan as logger middleware for our HTTP requests.

To avoid spending much time setting up a local environment, we can open our repository on Gitpod by prefixing the repository URL with gitpod.io/#. This creates a workspace where all the prerequisites are installed. Now, we initialize a project and install the dependencies:

npm init -y && npm i express nodemon morgan

Then we create a file, app.js, in the root folder. In the app.js file, we will create a basic Express app with the three endpoints stated above:

const express = require("express");

const app = express()
const PORT = process.env.PORT || 8000


app.use(express.json())

const complaints = []
app.post("/complaints", (req, res) => {
    const { subject, description, category } = req.body;
    const newComplaint = {
        id: complaints.length + 1,
        subject,
        description,
        category,
        status: "Open"
    }
    complaints.push(newComplaint)
    res.status(201).send({
        message: "Successful"
    })
})

app.put("/complaints/:id", (req, res) => {
    const { id } = req.params
    // find returns the matching complaint, or undefined if none exists
    const complaint = complaints.find(c => c.id === Number(id))
    if (!complaint) {
        return res.status(404).json({
            message: `Ticket with id: ${id} was not found`
        })
    }
    complaint.status = "Closed"

    res.status(200).json({
        message: `Ticket with id: ${id} has been closed`
    })
})

app.get("/complaints", (_, res) => {
    res.send({
        data: complaints
    })
})


app.listen(PORT, () => {
    console.log(`App is running on port ${PORT}`)
})

Next, we make use of the morgan middleware by doing just two things: first, we create the file where the HTTP logs will be stored; second, we tell our Express app what type of logs we want and where it should store them.

/*
other parts remain the same
*/
const logger = require("morgan")
const fs = require("fs")
const path = require("path");



// the "a" flag appends to the file instead of overwriting it on each restart
const logStream = fs.createWriteStream(path.join(__dirname, "access.log"), { flags: "a" })

app.use(logger("common", { stream: logStream }))
app.use(express.json())

/*
remaining parts remain the same
*/
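With the "common" format chosen above, each entry morgan writes to access.log will look roughly like the following line (the address, date, status, and size here are illustrative): remote address, user, timestamp, request line, status code, and response size.

```
::1 - - [01/Jan/2024:12:00:00 +0000] "POST /complaints HTTP/1.1" 200 26
```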

Now, to start our application, we add a start script to the scripts section of package.json:

"start": "nodemon app.js"

Then we run the application in the terminal with the command, npm start.

Now, we should see a file, access.log, created in the root folder.

Next, we need to get our server URL so that we can test our endpoints. Gitpod creates a URL that is used to access the workspace: at the bottom of your browser tab, click on Ports, then copy the URL.

For testing our endpoints, we will be using Hoppscotch (hoppscotch.io), an open-source API development ecosystem.

Now, in Hoppscotch, we paste the URL and use some dummy data to populate the datastore. If we send the request right away, it will not be successful, because we have not chosen an interceptor; choose the Proxy option.

Now, the request should succeed.

Now, we send the POST request three times to populate our datastore, then GET all the complaints in our datastore, then update the status of the first complaint with a PUT request.

Then, to get the list of complaints, we change the HTTP method to GET.

Then, for our update operation, we change the HTTP method to PUT.

Now we check whether the update was successful by getting all the complaints one more time. We should see that the status has indeed been updated.

Log Processing with Grep

We have run some HTTP requests, and if you check, our access.log file has been updated. Going through a log file manually is strenuous, so we will use a Unix command, grep, to count the number of:

- POST requests, which signify the frequency of complaints received

- PUT requests, which signify the number of complaint tickets closed

Open a new terminal in your workspace, and run the following command:

grep "POST /complaints" ./access.log -c

This command checks the access.log file for lines that match POST /complaints (note there is no trailing slash, since the POST endpoint is /complaints itself), then returns the count.

Now let's also get the number of PUT requests:

grep "PUT /complaints/" ./access.log -c

We see that we have 1 ticket closed out of 3 tickets.
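If you want to sanity-check these grep commands without a running server, you can run them against a small hand-written sample file; the log lines below are made up, but they follow the same common format morgan writes:

```shell
# Create a small sample log file (entries are illustrative)
cat > sample.log <<'EOF'
::1 - - [01/Jan/2024:12:00:00 +0000] "POST /complaints HTTP/1.1" 200 26
::1 - - [01/Jan/2024:12:00:05 +0000] "POST /complaints HTTP/1.1" 200 26
::1 - - [01/Jan/2024:12:00:10 +0000] "PUT /complaints/1 HTTP/1.1" 201 45
EOF

# -c makes grep print the number of matching lines instead of the lines themselves
grep -c "POST /complaints" sample.log   # prints 2
grep -c "PUT /complaints/" sample.log   # prints 1
```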

Conclusion

In this article, we have walked through the logging landscape, built a web server, and processed its logs using grep.

In conclusion, logging in SaaS products is an important part of running a reliable service. It enables companies to track usage and activity, troubleshoot issues quickly, and better understand their customers. As the demand for SaaS products continues to grow, it's essential to ensure that logging practices are secure, efficient, and respectful of user privacy, so that applications remain safe and user data remains confidential. With the right logging solutions in place, businesses can ensure that their SaaS products remain safe and usable for their customers.

For more content like this, you can reach out to me on Twitter and LinkedIn.