How to Implement Logging in a Node.js Application With Pino-logger

Sarthak Duggal

Logging is a key aspect of any application: it tells developers what their code is doing and can save hours of debugging work. This tutorial is about implementing logging in a Node.js application using Pino-logger.

With logging, you can store every bit of information about the flow of the application. With Pino as a dependency for a Node.js application, it becomes effortless to implement logging and even store these logs in a separate log file. Its 7.8K stars on GitHub are a testament to that.

In this guide:

  • You will study how to configure logging services with different logging levels.
  • You will learn how to prettify the logs in your terminal as well as whether or not to include the JSON response in your logs.
  • You will see how to save these logs in a separate log file.

When you’re done, you’ll be able to implement logging with coding best practices in your Node.js application using Pino-logger.

Prerequisites

Before following this tutorial make sure you have:

  • Familiarity with using Express for a server.
  • Familiarity with setting up a REST API without any authentication.
  • An understanding of command-line tools or integrated terminals in code editors.

Downloading and installing a tool like Postman is recommended for testing API endpoints.

Step 1: Setting up the project

In this step, you set up a basic Node.js CRUD application using Express and Mongoose. You do this because it is better to implement logging functionality in a codebase that mimics a real-world application.

Since this article is about implementing the logger, you can follow “How To Perform CRUD Operations with Mongoose and MongoDB Atlas” to create your basic CRUD application in Node.js.

After completing that tutorial, you should be ready with a Node.js application that includes create, read, update, and delete routes.

Also, at this point, you can install nodemon so that each time you save changes in your codebase, the server restarts automatically and you don’t have to start it again manually with node server.js.

So, write this command in your terminal:

npm install -g --force nodemon

The -g flag installs nodemon globally, and the --force flag tells npm to go ahead with the global install even if it runs into conflicts, such as an already installed copy.
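If you would rather not install anything globally, you can also run nodemon on demand with npx (bundled with npm 5.2 and later) instead of the command above:

npx nodemon server.js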

Step 2: Installing Pino

In this step, you install the latest versions of the dependencies required for logging: Pino, express-pino-logger, and pino-pretty. Run the following command in your command-line tool from the project’s root directory:

npm install pino express-pino-logger pino-pretty

At this point, you are ready to create a logger service with Pino.

Step 3: Creating the logger service

In this step, you create a Pino-logger service with different levels of logs, like warning, error, info, etc.

After that, you configure this logger-service in your app using Node.js middleware. Start by creating a new services directory in the root folder:

mkdir services

Inside of this new directory, create a new loggerService.js file and add the following code:

const pino = require('pino')
module.exports = pino({})

This code defines the most basic logger service that you can create using Pino-logger. The exported pino function takes two optional arguments, options and destination, and returns a logger instance.

However, you are not passing any options just yet. That creates one small problem with this logger service: the JSON log that you will see in a minute is hard to read. To change it into a readable format, add the prettyPrint option to the exported pino function; after that, your loggerService.js file should look something like this:

const pino = require('pino')
module.exports = pino(
  {
    prettyPrint: true,
  },
)

Configuring your loggerService is covered in later steps.

The next step to complete this logger service is to add the following lines of code in your server.js file in the root directory:

const expressPinoLogger = require('express-pino-logger');
const logger = require('./services/loggerService');

In this code, you are importing the logger service that you just made as well as the express-pino-logger npm package that you installed earlier.

The last step is to configure the express-pino-logger with the logger service that you made. Add this piece of code after const app = express(); in the same file:

// ...

const loggerMiddleware = expressPinoLogger({
  logger: logger,
  autoLogging: true,
});

app.use(loggerMiddleware);

// ...

This code creates a logger middleware using expressPinoLogger. The first option passed to the function is logger, which refers to the loggerService you created earlier. The second option, autoLogging, accepts either true or false and specifies whether you want the JSON response for each request included in your logs. That’s coming up.
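Because express-pino-logger attaches a child logger to every incoming request, you can also log from inside any route handler through request.log, without importing the logger service. A quick sketch (the /ping route here is purely illustrative):

// ...

app.get("/ping", (request, response) => {
  // request.log is the per-request child logger added by express-pino-logger
  request.log.info("ping route was hit");
  response.send("pong");
});

// ...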

Now, finally, to test the loggerService, revisit your foodRoutes.js file. Import the loggerService with this code at the top:

const logger = require('../services/loggerService')

Then, in the GET route controller method that you created earlier, put this line of code at the start of the callback function:

// ...

app.get("/food", async (request, response) => {
  logger.info('GET route is accessed')
  // ...
});

// ...

The info method is one of the default levels that comes with Pino-logger. The other methods are fatal, error, warn, debug, trace, and silent.

You can use any of these by passing a message string as the argument.
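As an illustration of how several levels can work together, here is a hedged sketch of the same GET controller with warn and error added (FoodModel is the model name assumed from the CRUD tutorial; adjust it to whatever yours is called):

// ...

app.get("/foods", async (request, response) => {
  logger.info('GET route is accessed')
  try {
    const foods = await FoodModel.find({}) // FoodModel: assumed name from the CRUD tutorial
    if (foods.length === 0) {
      logger.warn('GET /foods returned no documents')
    }
    response.send(foods)
  } catch (error) {
    logger.error(error.message) // error level for failed database queries
    response.status(500).send(error)
  }
});

// ...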

Now, before testing the logging service, here is the complete code for the server.js file up to this point:

const express = require("express");
const expressPinoLogger = require('express-pino-logger');
const logger = require('./services/loggerService');
const mongoose = require("mongoose");
const foodRouter = require("./routes/foodRoutes.js");
const app = express();
// ...
const loggerMiddleware = expressPinoLogger({
  logger: logger,
  autoLogging: true,
});
app.use(loggerMiddleware);
// ...
app.use(express.json());
mongoose.connect(
  "mongodb+srv://madmin:<password>@clustername.mongodb.net/<dbname>?retryWrites=true&w=majority",
  {
    useNewUrlParser: true,
    useFindAndModify: false,
    useUnifiedTopology: true
  }
);
app.use(foodRouter);

app.listen(3000, () => {
  console.log("Server is running...");
});

Also, don’t forget to restart your server:

nodemon server.js
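Once the server is running, you don’t strictly need Postman; a plain curl request from another terminal hits the route just as well (assuming the server listens on port 3000, as in server.js):

curl http://localhost:3000/foods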

Now, you can see the log in your terminal. Test this API endpoint in Postman, with curl, or with a similar tool. After testing the API, you should see something like this in your terminal:

The terminal shows nodemon watching files and starting the node server, followed by the log for when the GET route is accessed and the JSON details of the request.

This provides a lot of information:

  • The first piece of information is the log’s timestamp, which is displayed in the default format; you can change it into something more readable in later steps.
  • Next is the log level, info, which is one of the default levels that comes with Pino-logger.
  • Next is a short message saying that the request has been completed.
  • Finally, you can see the whole JSON response for that particular request on the very next line. (A sketch of the underlying raw JSON format follows this list.)
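For comparison, this is roughly what a single raw Pino entry looks like when prettyPrint is turned off: one newline-delimited JSON object per log call (the field values here are illustrative):

{"level":30,"time":1617955768092,"pid":1234,"hostname":"my-machine","msg":"GET route is accessed"}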

Step 4: Configuring the logs

In this step, you learn how to configure the logger service and how to prettify the logs in your terminal using pino-pretty, along with built-in options from the pino package you installed earlier.

Custom levels

At this point, you know that Pino-logger comes with default levels of logging that you can use as methods to display logs.

You used logger.info in the previous step.

But, pino-logger gives you the option to use custom levels. Start by revisiting the loggerService.js file in your services directory. Add the following lines of code after you have imported the pino package at the top:

// ...
const levels = {
  http: 10,
  debug: 20,
  info: 30,
  warn: 40,
  error: 50,
  fatal: 60,
};
// ...

This code is a plain JavaScript object defining custom logging levels. The keys of this object are the names of the log levels, and the values are their numerical severity. Higher numbers mean higher severity, and setting the logger’s level to a given name enables that level and every level with a higher value.

Now, to use this, you have to specify all that in the exported Pino function that you defined earlier. Remember that the first argument it takes is an object with some built-in options.

Rewrite that function like this:

module.exports = pino({
  prettyPrint: true,
  customLevels: levels, // our defined levels
  useOnlyCustomLevels: true,
  level: 'http',
})

In the above code:

  • The first option, customLevels: levels, specifies that our custom log levels should be used as additional log methods.
  • The second option, useOnlyCustomLevels: true, specifies that you only want to use your customLevels and omit Pino’s levels.

Note: To use the second option, useOnlyCustomLevels, the logger’s default level must be changed to a value that exists in customLevels. That is why you specified the third option, level: 'http'.

Now, you can again test your loggerService and try using it with one of your customLevels. Try it with something like this in your foodRoutes.js file:

// ...

app.get"/foods", async (request, response) => {
    logger.http('GET route is accessed')
});

// ...

Note: Don’t forget to set autoLogging: false in your server.js file, because there is no real need for the irrelevant JSON response that comes with it.

Here is the complete loggerService.js file at this point:

const pino = require('pino')
const levels = {
  http: 10,
  debug: 20,
  info: 30,
  warn: 40,
  error: 50,
  fatal: 60,
};
module.exports = pino(
  {
    prettyPrint: true,
    customLevels: levels, // our defined levels
    useOnlyCustomLevels: true,
    level: 'http',
  },
)

You should get something like this in your terminal:

The terminal shows the node server starting, the "Server is running..." message, and a timestamped log for when the GET route is accessed.

All the unnecessary information should be gone.

Pretty printing the logs

Now you can move ahead and prettify the logs. In other words, you are adding some style to the terminal output that makes it easier (or “prettier”) to read.

Start by passing another option in the exported pino function. Your pino function should look something like this once that option is added:

module.exports = pino({
  customLevels: levels, // our defined levels
  useOnlyCustomLevels: true,
  level: 'http',
  prettyPrint: {
    colorize: true, // colorizes the log
    levelFirst: true,
    translateTime: 'yyyy-dd-mm, h:MM:ss TT',
  },
})

You have added another option, prettyPrint, which is a JavaScript object that enables pretty-printing. Now, inside this object, there are other properties as well:

  • colorize: This adds colors to the terminal logs. Different levels of logs are assigned different colors.
  • levelFirst: This displays the log level name before the logged date and time.
  • translateTime: This translates the timestamp into a human-readable date and time format. (A local-time variation is shown in the sketch right after this list.)
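One small variation worth knowing: the timestamp produced by a plain format string is rendered in UTC. If you would rather see your local time, pino-pretty accepts a SYS: prefix in translateTime, so the same options object could look like this (an optional tweak, not required for the rest of the tutorial):

module.exports = pino({
  customLevels: levels, // our defined levels
  useOnlyCustomLevels: true,
  level: 'http',
  prettyPrint: {
    colorize: true,
    levelFirst: true,
    translateTime: 'SYS:yyyy-dd-mm, h:MM:ss TT', // SYS: switches the timestamp to system (local) time
  },
})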

Now, try the API endpoint again. Before you do, add more than one logging statement so that you can compare different types of logs in your terminal.

// ...

app.get("/foods", async (request, response) => {
  logger.info('GET route is accessed')
  logger.debug('GET route is accessed')
  logger.warn('GET route is accessed')
  logger.fatal('GET route is accessed')

// ...

You should see something like this in your terminal:

The terminal shows the same information as before, but now with colored, level-first labels, such as a red label for the fatal log.

At this point, you have configured your logger service enough to be used in a production-grade application.

Step 5: Storing logs in a file

In this last step, you learn how to store these logs in a separate log file. Storing logs in a separate file is pretty easy: all you have to do is make use of the optional second argument, destination, of the exported pino function.

You can start by editing the pino function and passing a destination to it like this:

module.exports = pino(
  {
    customLevels: levels, // the defined levels
    useOnlyCustomLevels: true,
    level: 'http',
    prettyPrint: {
      colorize: true, // colorizes the log
      levelFirst: true,
      translateTime: 'yyyy-dd-mm, h:MM:ss TT',
    },
  },
  pino.destination(`${__dirname}/logger.log`)
)

pino.destination takes the path for the log file as the argument. The __dirname variable points to the current directory, which is the services directory for this file.

Note: You added a logger.log file to the path even though it doesn’t exist yet. That’s because Pino creates the file automatically as soon as the logger service runs. If, for some reason, it does not create the file, you can create one manually and add it to the services folder.
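If write throughput ever becomes a concern, pino.destination can also take an options object instead of a plain path so that log writes are buffered; a hedged sketch (the minLength value is illustrative):

pino.destination({
  dest: `${__dirname}/logger.log`, // same log file as before
  minLength: 4096, // buffer writes until roughly 4KB have accumulated
  sync: false, // write asynchronously
})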

Here is the complete loggerService.js file:

const pino = require('pino')
const levels = {
  http: 10,
  debug: 20,
  info: 30,
  warn: 40,
  error: 50,
  fatal: 60,
};
module.exports = pino(
  {
    customLevels: levels, // our defined levels
    useOnlyCustomLevels: true,
    level: 'http',
    prettyPrint: {
      colorize: true, // colorizes the log
      levelFirst: true,
      translateTime: 'yyyy-dd-mm, h:MM:ss TT',
    },
  },
  pino.destination(`${__dirname}/logger.log`)
)

Test your API again, and you should see your logs in your log file instead of your terminal.
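To watch new entries arrive while you test, you can tail the file from the project root (on a Unix-like shell):

tail -f services/logger.log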

Conclusion

In this article, you learned how to create a logging service that you can use in production-grade applications. You learned how to configure logs and how you can store those logs in a separate file for your future reference.

You can still experiment with various configuring options by reading the official Pino-logger documentation.

Here are a few best practices you can keep in mind when creating a new logging service:

  • Context: A log should always have some context about the data, the application, the time, etc. (The child-logger sketch after this list shows one way to attach it.)
  • Purpose: Each log should have a specific purpose. For example, if the given log is used for debugging, then you can make sure to delete it before making a commit.
  • Format: The format for all the logs should always be easy to read.
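As one way to bake context into every log line, Pino’s child loggers let you bind fields, such as a module or route name, that are automatically included in each entry. A small sketch using the loggerService from this tutorial:

const logger = require('../services/loggerService')

// Every entry written through routeLogger automatically includes { module: 'foodRoutes' }
const routeLogger = logger.child({ module: 'foodRoutes' })

routeLogger.info('GET route is accessed')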