Speeding Up Your API Performance with Redis Caching

In this article, we will learn:

  • Why API performance is important

  • What Redis is, and some of its use cases

  • What caching strategies are, with some examples

  • How to use a database caching strategy in a Node.js application

Introduction

API performance is an important factor in the success of any application. It affects the overall user experience and can have a direct impact on revenue. To illustrate this, let's look at an example of a popular online marketplace. This marketplace has millions of users and relies heavily on its APIs for a variety of tasks. Its APIs are used to create listings, search for products, process payments, and more.

A poor-performing API could mean customers have difficulty using the website or are unable to use certain features. This can lead to frustrated customers, longer wait times, and ultimately a loss of revenue.

On the other hand, a well-performing API ensures a smooth experience for customers. This could mean faster search results, quicker page loading times, and a more enjoyable experience overall. This could result in more customers using the marketplace, more conversions, and an increase in revenue. It's clear that API performance is a critical factor in the success of any application.

Poor performance can lead to unhappy customers and lost revenue, while good performance can lead to a better user experience and more revenue. As such, it's important for any application to ensure its APIs are optimized for the best possible performance.

What is Redis?

Redis is a powerful and popular open-source data structure store that can be used for a variety of use cases. It is a great choice for storing data in memory for quick retrieval, caching, and message queues.

To give an example of how Redis can be used, let's imagine an e-commerce company that needs to show its customers a list of the most popular products on its website. To do this, it needs to store the most popular products in a way that is easy to query and quick to retrieve. Redis is a great fit for this use case: by storing the most popular products in a Redis database, the company can quickly look up the most popular items and display them to its customers.
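
As a minimal sketch of how this might look with the node-redis client (the key name and product IDs below are made up for illustration, and the REV option of zRange assumes Redis 6.2 or newer), a sorted set keeps a running popularity score per product and lets us read back the top entries in one call:

const redis = require("redis");

const client = redis.createClient();

const showPopularProducts = async () => {
    await client.connect();

    // Bump a product's score each time it is viewed or purchased
    // ("popular:products" and the product IDs are hypothetical names)
    await client.zIncrBy("popular:products", 1, "product:42");
    await client.zIncrBy("popular:products", 3, "product:7");

    // Read back the top 10 products, highest score first
    const topProducts = await client.zRange("popular:products", 0, 9, { REV: true });
    console.log(topProducts);

    await client.quit();
};

showPopularProducts();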

Redis is also a great choice for caching data. Caching allows companies to store frequently accessed data in memory, making queries faster and more efficient. Redis can store any type of data - from images to text - and can be used to create a cache that can store a wide variety of information.

Redis is a great choice for many use cases, and it can be used to make websites and applications more efficient. It is a powerful and popular data structure store that can be used for many different tasks.

Caching Strategies

Caching is one of the most powerful and efficient tools for improving the performance of a given application. It's a process of temporarily storing frequently-accessed data, known as "cached" data, in a more accessible location, where it can be quickly retrieved when needed. This allows us to have faster access to the data in a more efficient manner, reducing the burden on our systems and improving user experience.

Going back to our e-commerce example, every time a user visits the site, the web server needs to retrieve and deliver the content they requested. This can quickly become expensive if the website receives a large number of visitors. To reduce the burden on the server, the developer can implement a caching strategy, which will store the data requested by visitors in a more accessible location. This means that, instead of loading the data from the server each time the page is requested, the web server can simply retrieve the cached data and deliver it to the user. This results in faster loading times, as the data is already stored in a more accessible location.

Caching strategies can also be used to reduce the load on your system when dealing with large databases. For example, a web application that requires access to large amounts of data may find its performance degrading over time as the size of the database grows. By caching the frequently used data, the application can reduce the number of queries it needs to make to the database, resulting in faster response times and improved user experience.

Overall, caching strategies are a powerful tool for improving application performance and user experience in a number of different ways. Whether you’re looking to speed up a website, reduce the burden on a database, or have some other use case, caching can be an invaluable tool.

Some of the popular caching strategies include:

  1. Browser Caching: Browser caching involves storing website resources on the user’s computer so that they don’t need to be downloaded again when they are requested. This helps reduce the amount of bandwidth needed to deliver content, as well as reduce server load.

  2. Edge Caching: Edge caching is a type of caching in which frequently requested webpages or objects are stored on a distributed cache at the edge of the network. This helps reduce latency and improve performance since the data is closer to the user.

  3. Application Caching: Application caching is a caching technique used to store frequently requested data in memory, so it can be quickly retrieved without having to go to the database or make a network call. This helps increase performance by reducing database queries and network requests.

  4. Database Caching: Database caching is a technique used to store database query results in a cache for faster retrieval. The database cache stores the query results locally so that the database does not have to be queried each time the same query is requested. This reduces the load on the database and speeds up performance.

  5. Content Delivery Network (CDN) Caching: Content Delivery Networks (CDN) are a distributed network of servers located around the world that store copies of webpages and content. They are used to reduce latency and improve performance by serving content from the server that is closest to the user.

  6. Object Caching: Object caching is a type of caching that stores the results of expensive computations in memory, so they don’t have to be recalculated each time they are requested. This helps reduce the load on the server and improves performance (see the sketch just after this list).
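
To make the object caching idea concrete, here is a minimal, dependency-free sketch in Node.js; the computation and the values used are purely illustrative. A Map holds the result of an expensive computation keyed by its input, so repeat calls skip the work entirely:

const resultCache = new Map();

const expensiveComputation = (n) => {
    // Stand-in for real work, e.g. a heavy report or aggregation
    let total = 0;
    for (let i = 0; i < n * 1000000; i++) {
        total += i;
    }
    return total;
};

const cachedComputation = (n) => {
    if (resultCache.has(n)) {
        return resultCache.get(n); // cache hit: no recomputation
    }
    const result = expensiveComputation(n); // cache miss: compute once
    resultCache.set(n, result);
    return result;
};

console.log(cachedComputation(50)); // computed
console.log(cachedComputation(50)); // served from the in-memory cache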

Hands-on caching example with Redis

We are going to build a simple server that returns species data from swapi.dev, a Star Wars API built with the Django REST framework.

Prerequisites:

  • Node.js

  • A code editor (preferably Visual Studio Code)

First, let’s create a folder to store our project and initialize the npm project.

mkdir cache-with-redis
cd cache-with-redis
yarn init -y

Then, we need to install our project dependencies: axios, express, and redis.

yarn add axios express redis

Now, we create a file called index.js in our root directory. In the index.js file, we will create an Express app:

#!/usr/bin/env node

const express = require("express");

const app = express();
const port = 8000;

app.listen(port, () => {
    console.log(`App is running on port: ${port}`)
})

Note: the #!/usr/bin/env node line is known as a shebang; it lets the file be run as a script without prefixing it with the node command. To make this work, we need to make the file executable by running chmod +x ./index.js in the terminal.

Now we run the application with

./index.js

We should get the response App is running on port: 8000

Now we need to make an API call. In index.js, we will make an HTTP request to fetch the species data.

We update the index.js file with the following:

const axios = require("axios");

const fetchSpecieData = async () => {
    const response = await axios.get("https://swapi.dev/api/species");
    return response.data;
};

app.get("/api/species", async (req, res) => {
    let isCached = false;
    const data = await fetchSpecieData();
    res.json({
        fromCache: isCached,
        data
    });
});

Now, let's make a request to our server,

curl http://localhost:8000/api/species

We will get back a JSON response containing the species data. Note the fromCache field in the response: it is false, because so far the data always comes straight from the API.

Now it's time to feature Redis in the show by caching the API call. This works in just two steps: first, we set a key and initialize it with the fetched data; then, for subsequent calls, we get the value stored under that key.

Now let’s code it up;

const redis = require("redis");

// Create a Redis client (by default it connects to localhost:6379)
const redisClient = redis.createClient();

const connectRedis = async () => {
    // Log any connection errors instead of failing silently
    redisClient.on("error", (err) => {
        console.log(err)
    })
    await redisClient.connect();
}

connectRedis();
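
Note that createClient() with no arguments assumes a Redis server is running locally on the default port 6379. If your Redis instance lives somewhere else, you can pass a connection URL instead; the URL below is only an example:

// Point the client at a specific Redis instance (example URL)
const redisClient = redis.createClient({
    url: "redis://localhost:6379"
});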

Now, let's go back to our GET endpoint and do some get-set voodoo:

let isCached = false;

// Check the cache first
let data = await redisClient.get("species")

if (data) {
    // Cache hit: parse the stored JSON and return early
    isCached = true
    return res.json({
        fromCache: isCached,
        data: JSON.parse(data)
    })
}

// Cache miss: fetch from the API and store the result for next time
data = await fetchSpecieData()
await redisClient.set("species", JSON.stringify(data));

res.json({
    fromCache: isCached,
    data
})

Our final index.js should look like this:

#!/usr/bin/env node

const express = require("express");
const axios = require("axios");
const redis = require("redis");

// Create a Redis client (by default it connects to localhost:6379)
const redisClient = redis.createClient();

const connectRedis = async () => {
    redisClient.on("error", (err) => {
        console.log(err)
    })
    await redisClient.connect();
}

connectRedis();

const app = express();
const port = 8000;

const fetchSpecieData = async () => {
    const response = await axios.get("https://swapi.dev/api/species")
    return response.data;
}

app.get("/api/species", async (req, res) => {
    let isCached = false;

    // Check the cache first
    let data = await redisClient.get("species")

    if (data) {
        // Cache hit: parse the stored JSON and return early
        isCached = true
        return res.json({
            fromCache: isCached,
            data: JSON.parse(data)
        })
    }

    // Cache miss: fetch from the API and store the result for next time
    data = await fetchSpecieData()
    await redisClient.set("species", JSON.stringify(data));

    res.json({
        fromCache: isCached,
        data
    })
})

app.listen(port, () => {
    console.log(`App is running on port: ${port}`)
})
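
One refinement worth considering, though it is not part of the walkthrough above: as written, the species key never expires, so the cache will keep serving the same data forever. The node-redis set command accepts an options object, so we could give the key a time-to-live; the one-hour value here is just an illustration:

// Cache the species data for one hour; after that Redis evicts the key
// and the next request falls through to the API again
await redisClient.set("species", JSON.stringify(data), { EX: 3600 });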

Now, let's restart our server, make 3 requests to it from 3 different terminals, and compare the results.

First Request

Second Request

Third Request

Now we can clearly see that on the first request, the fromCache field was false, but for the remaining requests, the field was set to true.

Conclusion

In this article, we went through the Redis landscape and built a simple application that uses Redis for caching.

We have also seen that caching is an invaluable tool for ensuring that webpages and APIs perform optimally, no matter how large or small the website may be. With an understanding of the various caching strategies, businesses can ensure that their pages load quickly and efficiently for all users, allowing them to provide the best possible experience.

For more content like this, you can reach out to me on Twitter and LinkedIn.