Least Recently Used (LRU) Cache Problem Explained With Solutions in Ruby, JavaScript, and Python: Mastering Algorithms

Cracking the Code: Solutions in Ruby, JavaScript, and Python

Patrick Karsh
5 min read · Oct 4, 2023

The Least Recently Used (LRU) Cache problem is a classic computer science and software engineering exercise: design a data structure that efficiently maintains a limited number of items under a single constraint. When the cache is full and must evict an item to make room for a new one, it removes the least recently used item.

In other words, an LRU Cache is a cache that keeps track of the order in which items are accessed or used, and when it reaches its capacity, it removes the item that hasn’t been used for the longest time, making room for a new item to be added.

Here are the key operations associated with an LRU Cache:

  1. Get(key): Retrieve the value associated with a given key from the cache. If the key exists in the cache, it is considered a “use” of that item, and it becomes the most recently used item.
  2. Put(key, value): Add a new key-value pair to the cache. If the cache is already at its capacity, it should remove the least recently used item before adding the new one.

The goal is to implement these operations efficiently in terms of time complexity. Typically, this problem is used to test a developer’s understanding of data structures like hash maps and doubly-linked lists.

One common approach to solving the LRU Cache problem involves using a combination of a hash map and a doubly-linked list:

  • A hash map allows quick lookups of items by their keys.
  • A doubly-linked list helps maintain the order of items based on their usage.

By using these data structures in combination, you can achieve the desired LRU Cache behavior efficiently. When an item is accessed, you can move it to the front of the list, indicating that it’s the most recently used item. When you need to evict an item, you can remove the tail of the list, which represents the least recently used item, and update the hash map accordingly.
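The hash-map-plus-doubly-linked-list design described above can be sketched in Python. This is an illustrative sketch, not code from the article's solutions below: the class and helper names are invented for the example, and sentinel head/tail nodes are used to avoid edge-case checks.

```python
class Node:
    """Doubly-linked-list node holding one cache entry."""
    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None

class DLLLRUCache:
    """LRU cache backed by a hash map plus an explicit doubly linked list.
    The head side holds the most recently used entry, the tail side the least."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}                   # key -> Node, for O(1) lookups
        self.head = Node()              # sentinel before the MRU end
        self.tail = Node()              # sentinel after the LRU end
        self.head.next = self.tail
        self.tail.prev = self.head

    def _unlink(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _push_front(self, node):
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._unlink(node)              # move to the front on every access
        self._push_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            self._unlink(self.map[key])
        elif len(self.map) >= self.capacity:
            lru = self.tail.prev        # real node nearest the tail sentinel
            self._unlink(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._push_front(node)
```

Unlinking a node and pushing it to the front are each a handful of pointer updates, which is what makes both get and put O(1) in this design.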

The LRU Cache problem is a common interview question and is also encountered in real-world scenarios where you need to optimize data access, such as in databases and caching systems.

Example Solutions

Here’s how you can implement an LRU Cache in ES6 (JavaScript), Ruby, and Python:

ES6 (JavaScript)

class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    // Use a Map to store key-value pairs for efficient lookups
    this.cache = new Map();
  }

  get(key) {
    if (this.cache.has(key)) {
      const value = this.cache.get(key);
      // Re-insert the accessed item to mark it as most recently used
      this.cache.delete(key);
      this.cache.set(key, value);
      return value;
    }
    return -1; // Key not found
  }

  put(key, value) {
    if (this.cache.has(key)) {
      // If the key already exists, remove it to update its position
      this.cache.delete(key);
    } else if (this.cache.size >= this.capacity) {
      // Remove the least recently used item (the first item in the Map)
      const firstKey = this.cache.keys().next().value;
      this.cache.delete(firstKey);
    }
    // Add the new key-value pair and mark it as most recently used
    this.cache.set(key, value);
  }
}

Ruby

class LRUCache
  def initialize(capacity)
    @capacity = capacity
    # Ruby Hashes preserve insertion order, so the first key
    # is always the least recently used entry
    @cache = {}
  end

  def get(key)
    return -1 unless @cache.key?(key) # Key not found

    # Re-insert the accessed pair to mark it as most recently used
    value = @cache.delete(key)
    @cache[key] = value
    value
  end

  def put(key, value)
    if @cache.key?(key)
      # If the key already exists, remove it to update its position
      @cache.delete(key)
    elsif @cache.length >= @capacity
      # Remove the least recently used item (the first key in the Hash)
      @cache.delete(@cache.first[0])
    end
    # Add the new key-value pair and mark it as most recently used
    @cache[key] = value
  end
end

Python

from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        # Use an OrderedDict to store key-value pairs and maintain order
        self.cache = OrderedDict()

    def get(self, key):
        if key in self.cache:
            # Move the accessed item to the end to mark it as most recently used
            self.cache.move_to_end(key)
            return self.cache[key]
        return -1  # Key not found

    def put(self, key, value):
        if key in self.cache:
            # If the key already exists, remove it to update its position
            del self.cache[key]
        elif len(self.cache) >= self.capacity:
            # Remove the least recently used item (the first item in the OrderedDict)
            self.cache.popitem(last=False)
        # Add the new key-value pair and mark it as most recently used
        self.cache[key] = value
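A short trace makes the eviction order concrete. The Python class is repeated verbatim here so the snippet runs standalone; the keys and capacity are arbitrary example values.

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)
            return self.cache[key]
        return -1

    def put(self, key, value):
        if key in self.cache:
            del self.cache[key]
        elif len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)
        self.cache[key] = value

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
print(cache.get("a"))  # 1; "a" is now the most recently used key
cache.put("c", 3)      # cache is full, so "b" (least recently used) is evicted
print(cache.get("b"))  # -1; "b" is gone
print(cache.get("c"))  # 3
```

Note that the get("a") call is what saves "a" from eviction: without it, "a" would have been the first item in the OrderedDict and the put("c", 3) call would have removed it instead of "b".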

These implementations use different data structures to achieve the LRU Cache behavior but have similar interfaces with get and put methods. They efficiently manage the cache's capacity and maintain the order of items based on their usage.

Space and Time Complexity

Below is a succinct summary of the space and time complexities for the LRU Cache implementations in ES6 (JavaScript), Ruby, and Python:

Space Complexity:

  • All three implementations have a space complexity of O(capacity) because they store a maximum of ‘capacity’ key-value pairs.

Time Complexity:

  • The get(key) and put(key, value) operations have an average time complexity of O(1) for all three implementations because they rely on efficient data structures for key-based lookups and updates (e.g., Map, Hash, OrderedDict).

In summary, these LRU Cache implementations efficiently manage space and time complexities, making them suitable for cache management.

Conclusion

Studying the Least Recently Used (LRU) Cache problem is crucial in computer science and software engineering because it addresses fundamental challenges in optimizing data access and resource management. Caching mechanisms are ubiquitous in modern computing systems, from databases to web servers, enhancing performance and reducing response times. A proficient understanding of LRU Cache algorithms enables developers to design efficient, memory-conscious systems, ensuring that frequently accessed data is readily available while maintaining cache limits. By tackling this problem, one not only strengthens problem-solving skills but also gains valuable insights into critical aspects of real-world applications, contributing to the development of faster and more responsive software systems.

Written by Patrick Karsh

NYC-based Ruby on Rails and Javascript Engineer leveraging AI to explore Engineering. https://linktr.ee/patrickkarsh