Implementing an LRU cache with Node.js and TypeScript


Introduction

Caching plays an essential role in heavily used services and applications. It improves response times and reduces the cost of repeatedly querying data sources for information that has already been retrieved and has not changed since.

Numerous caching libraries and techniques exist; this blog post demonstrates how to build a simple yet efficient LRU cache using Node.js and TypeScript.

What is an LRU cache?

An LRU (Least Recently Used) cache is a fixed-size cache that, when full, evicts the least recently used item to make room for a new one. This ensures that frequently accessed items stay in the cache while items that have not been accessed for a while are removed.
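
For example, a cache with a capacity of 2 behaves like this (a hypothetical trace using the set/get API we will build below):

set('a', '1')  → cache order: [a]
set('b', '2')  → cache order: [a, b]
get('a')       → cache order: [b, a]  ('a' becomes the most recently used)
set('c', '3')  → cache order: [a, c]  ('b', the least recently used, is evicted)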

Implementing an LRU cache with Node.js and TypeScript

In this part, I will show you, step by step, how to prepare the project, so follow along 🙂

Step 1: Setting up the project

Create a new directory and navigate into it:

mkdir lru-cache-node-ts
cd lru-cache-node-ts

We use npm to initialize the project and install the dependencies:

npm init -y
npm install typescript ts-node @types/node
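
Note: since TypeScript, ts-node, and the Node.js type definitions are only needed at development time, you may prefer to install them as dev dependencies instead:

npm install --save-dev typescript ts-node @types/node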

Next, we configure the tsconfig.json file:

{
  "compilerOptions": {
    "target": "ES2019",
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true,
    "outDir": "dist"
  },
  "include": ["src"]
}
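
Optionally, you can add npm scripts to package.json to build and run the project. The script names and the src/index.ts entry point are just a suggested convention; the rest of the post does not depend on them:

"scripts": {
  "build": "tsc",
  "start": "ts-node src/index.ts"
}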

Step 2: Implementing the LRU cache

The approach is straightforward: we use a JavaScript Map, which preserves insertion order, as a fixed-size queue. When the cache reaches its limit, we evict the first entry in the Map. Every time an item is accessed, we move it to the end of the queue, so frequently used items drift toward the end and the first key is always the least recently used one. Since Map's get, set, and delete all run in constant time, both cache operations are O(1). Save the following class as src/lruCache.ts so that the import in Step 3 resolves.

// Define an interface for the cache items with key-value pairs
interface CacheItem<T> {
  key: string;
  value: T;
}

// Create a generic LRU cache class
class LRU<T> {
  // The maximum cache size and the underlying data structure.
  // A Map preserves insertion order, so the first key is always
  // the least recently used one.
  private readonly maxSize: number;
  private cache: Map<string, CacheItem<T>>;

  // Initialize the LRU cache with a specified maximum size
  constructor(maxSize: number) {
    this.maxSize = maxSize;
    this.cache = new Map<string, CacheItem<T>>();
  }

  // Add an item to the cache, evicting the least recently used item
  // if the cache is full
  set(key: string, value: T): void {
    if (this.cache.has(key)) {
      // Re-inserting an existing key must move it to the end of the Map;
      // without this, it would keep its old (stale) position.
      this.cache.delete(key);
    } else if (this.cache.size >= this.maxSize) {
      // The first key in the Map is the least recently used item
      const lruKey = this.cache.keys().next().value;
      if (lruKey !== undefined) {
        // remove the least recently used item from the cache
        this.cache.delete(lruKey);
      }
    }
    this.cache.set(key, { key, value });
  }

  // Retrieve an item from the cache and mark it as the most recently used
  get(key: string): T | undefined {
    const item = this.cache.get(key);
    if (item) {
      // Delete and re-insert to move the key to the end of the Map
      this.cache.delete(key);
      this.cache.set(key, item);
      return item.value;
    }
    return undefined;
  }

  // Remove an item from the cache by its key
  delete(key: string): void {
    this.cache.delete(key);
  }

  // Clear the cache
  clear(): void {
    this.cache.clear();
  }
}

export default LRU;

Step 3: Using the LRU cache

Create a small script that exercises the cache (assumed here to live at src/index.ts, next to lruCache.ts):

import LRU from './lruCache';

const cache = new LRU<string>(3);

cache.set('one', '1');
cache.set('two', '2');
cache.set('three', '3');

console.log(cache.get('one')); // '1'
console.log(cache.get('two')); // '2'
console.log(cache.get('three')); // '3'

// This will evict the 'one' key, as it's the least recently used item.
cache.set('four', '4');

console.log(cache.get('one')); // undefined
console.log(cache.get('four')); // '4'
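
The example above only demonstrates insertion-order eviction. To see the "recently used" part in action, read a key before inserting past capacity; the access protects it from eviction. Continuing from the state above (the cache holds 'two', 'three', and 'four'):

console.log(cache.get('two')); // '2' ('two' is now the most recently used)
cache.set('five', '5'); // evicts 'three', now the least recently used
console.log(cache.get('three')); // undefined
console.log(cache.get('two')); // '2' (still cached)

You can run the script with ts-node (assuming the two files are saved as src/lruCache.ts and src/index.ts):

npx ts-node src/index.ts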

Conclusion

The LRU cache is one of the simplest yet most effective caching algorithms you can implement in your service, and when used correctly it can significantly boost performance. Whatever caching mechanism you choose, it is crucial that your service does not run out of memory, which is why bounding the cache size is an essential part of any caching strategy.


Implementing an LRU cache with Node.js and TypeScript was originally published in Level Up Coding on Medium and was authored by Mehran.

