Sun-Li Beatteay

Originally published at Medium

How to Build an LRU Cache in Less Than 100 Lines of Code

Note: All the code in this article can be found on GitHub.

When I was interviewing for my first software engineering job, I was asked about least recently used (LRU) caches a number of times. I was asked to both code them and to describe how they’re used in larger systems. And if you’ve done your fair share of coding interviews, it’s likely you have, too.

From an interviewer's perspective, caching makes for a versatile topic. It can be used to gauge someone’s low-level understanding of data structures and algorithms. It can also be turned around to challenge one’s high-level comprehension of distributed systems.

For job candidates, however, it can be a jarring experience — especially if you’ve never used them in a professional setting. That was the case for me. The first time I was asked about LRU caches, my mind went blank. I’d heard of them, sure. But in all my studying of binary trees and heaps, I hadn’t bothered to learn what goes into them. And live in front of an interviewer isn’t the ideal setting to try to work it out.

While I didn’t do too well in that interview, that doesn’t have to be your fate. In this article, I’m going to teach you the what and how of LRU caches. By the end, you will know how to implement your own cache in less than 100 lines of code without any third-party libraries — all within ten minutes.

So let’s get going — time is ticking.

What Is a Least Recently Used (LRU) Cache?

Caches are a type of data store that typically keeps data in memory for fast retrieval. They are generally implemented as key-value stores, meaning you store and access data via an identifying key of some sort.
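To make the key-value idea concrete, here's a minimal sketch using a plain Python dictionary (the key and value are just for illustration):

# A plain dictionary already behaves like a tiny key-value cache:
# store a value under a key, then fetch it back by that key.
cache = {}
cache["user:42"] = {"name": "Ada", "plan": "pro"}

print(cache.get("user:42"))  # {'name': 'Ada', 'plan': 'pro'}
print(cache.get("user:99"))  # None -- missing keys return a null value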

The RAM on your computer is an example of a cache. The operating system stores data in RAM for faster access than the hard drive, and it uses the address of the memory cell as the key.

Operating system caches (Image Credit: [https://www.sciencedirect.com](https://www.sciencedirect.com/topics/computer-science/disks-and-data))

LRU caches are a specific type of cache with a unique eviction policy. When an LRU cache runs out of space and needs to remove data, it evicts the key-value pair that was least recently accessed.
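You can see this policy in action with Python's collections.OrderedDict, a common building block for quick LRU sketches (the lru_set helper below is illustrative, not part of what we'll build):

from collections import OrderedDict

# A capacity-2 LRU demonstration backed by an OrderedDict.
cache = OrderedDict()
MAX_SIZE = 2

def lru_set(key, value):
    if key in cache:
        cache.move_to_end(key)     # existing key becomes most recently used
    cache[key] = value
    if len(cache) > MAX_SIZE:
        cache.popitem(last=False)  # evict the least recently used key

lru_set("a", 1)
lru_set("b", 2)
cache.move_to_end("a")             # a "get" of key a: b is now least recent
lru_set("c", 3)                    # cache is full, so b gets evicted
print(list(cache))                 # ['a', 'c']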

Popular open-source examples are Redis and Memcached, both of which support LRU eviction.

Why Are LRU Caches Useful?

For tech businesses that provide an API or user interface, performance and availability are crucial. Think about the last time you visited a slow website. Did you stay and wait for it to load, or did you leave and visit another site? Most likely the latter. Slow or under-performing websites can result in millions in lost revenue.

Unfortunately, many of the same systems that rely on high uptime also have to store mountains of data. This is often the case for social media and e-commerce sites. These sites store their data in a database of some sort, be it SQL or NoSQL. While this is standard practice, the problem comes when you need to fetch that data. Querying databases, especially when they contain a lot of data, can be quite slow.

Enter the cache.

Since caches keep data in memory, they are much more performant than traditional databases. And for social media sites, where the most popular 20% of content drives 80% of the traffic, caching can dramatically decrease the load on the databases.
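In practice this usually takes the form of the cache-aside pattern: check the cache first, and only fall back to the database on a miss. Here's a sketch, where cache, db, and query_user are hypothetical stand-ins:

# Cache-aside (read-through) pattern: the cache absorbs repeat reads,
# so only cache misses ever reach the database.
def get_user(user_id, cache, db):
    user = cache.get(user_id)
    if user is not None:
        return user                   # cache hit: no database query needed

    user = db.query_user(user_id)     # cache miss: query the database...
    cache.set(user_id, user)          # ...and store the result for next time
    return user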

SQL vs. cache speed comparison (Image Credit: [https://dzone.com/articles/redis-vs-mysql-benchmarks](https://dzone.com/articles/redis-vs-mysql-benchmarks))

The next time an interviewer asks you how to optimize an API endpoint or workflow that requires fetching the same data over and over, caching is a good place to start.

However, knowing how caches work and how to use them is easy. Knowing how to build one yourself is the hard part. And that’s exactly what we’ll be focusing on.

How to Build an LRU Cache

For the purposes of this article, I will be using Python to implement the LRU cache. It’s succinct, easy to read, and a lot of people know it. However, if Python isn’t your thing or you’re curious about how to implement it in other languages, you can check out my examples repository.

Requirements

Before we start building our cache, we have to understand what the requirements are. The first will be the API. Which methods do we need to implement?

While production-quality caches are feature-rich, we'll keep it simple. You'll only need to create two methods:

  • get(key)

  • set(key, value)

But that isn’t all. The LRU cache itself has a number of requirements:

  1. When the max size of the cache is reached, remove the least recently used key.

  2. Whenever a key is fetched or updated, it becomes the most recently used.

  3. Both get and set operations must complete in O(1) time complexity (meaning that no matter how large the cache is, it takes the same amount of time to complete).

  4. When fetching a key that doesn’t exist, return a null value.
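To pin these requirements down, here's the behavior we're aiming for, written as a short usage sketch against the LRUCache class we're about to build:

cache = LRUCache(max_size=2)

cache.set("a", 1)
cache.set("b", 2)

cache.get("a")     # returns 1, and "a" becomes the most recently used
cache.set("c", 3)  # cache is full, so the LRU key ("b") is evicted

cache.get("b")     # returns None -- "b" no longer exists
cache.get("c")     # returns 3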

With these requirements in mind, we can get to work.

Photo by [ConvertKit](https://unsplash.com/@convertkit?utm_source=medium&utm_medium=referral) on [Unsplash](https://unsplash.com?utm_source=medium&utm_medium=referral)

Data structure

The first question that we need to answer is, which data structure should back our LRU cache? After all, a cache is not a data structure itself.

Since the cache uses get and set and needs to operate in O(1) time, you might've thought of a hash map or dictionary. That would indeed fulfill some requirements. But what about removing the LRU key? A dictionary alone can't tell us which key is the least recently used.

We could store a timestamp as part of each dictionary value and update it whenever the key is fetched. That would tell us which key-value pair is the oldest. The problem is that finding the oldest entry would mean looping through every dictionary entry, breaking our O(1) requirement.
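Here's a sketch of that naive timestamp approach to show where it falls apart; the eviction step has to scan the entire dictionary:

import time

# Naive approach: store (value, last_used) pairs in a dictionary.
cache = {}

def naive_get(key):
    if key in cache:
        value, _ = cache[key]
        cache[key] = (value, time.monotonic())  # refresh the timestamp
        return value
    return None

def naive_evict():
    # Finding the least recently used key means checking every entry.
    # This scan is O(N), which violates our O(1) requirement.
    oldest_key = min(cache, key=lambda k: cache[k][1])
    del cache[oldest_key]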

So what’s the answer?

This is where I should let you in on a secret. We're actually going to need two data structures: one for fetching the values (a dictionary/hash map) and one for keeping the items ordered by recency of use.

The second data structure

So what should the second data structure be? If you thought of an array, you’re getting close.

We can use an array to keep the items sorted. And each key in the dictionary can reference the index of a value in the array. Whenever a key is fetched, we can move that value to the front of the array, pushing the rest of the items back, and update the index in the dictionary.

However, there's still a small issue. Under the hood, arrays are a contiguous block of memory cells. If we need to move values to the front of the array and push the rest down, that operation gets slower as the array grows. In other words, inserting items at the beginning of an array takes O(N) time.
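You can verify this with a rough timing sketch; front insertion on a Python list slows down as the list grows:

import timeit

# Each front insertion shifts every existing element one slot to the
# right, so the cost grows with the list's size. (pop(0) keeps the list
# from growing between iterations; it is O(N) for the same reason.)
for n in (1_000, 10_000, 100_000):
    t = timeit.timeit(
        "items.insert(0, -1); items.pop(0)",
        setup=f"items = list(range({n}))",
        number=1_000,
    )
    print(f"n={n:>7}: {t:.4f}s for 1,000 front insertions")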

Image Credit: Author

But we are close. We just need a data structure with the same ordering benefits as an array that can also get, set, and delete data in O(1) time. And the data structure that fits all of those requirements is a doubly linked list (DLL).

A doubly linked list is the same as a linked list, except each node in the list has an additional reference to the previous node as well as the next node.

Image Credit: [https://medium.com/flawless-app-stories/doubly-linked-lists-swift-4-ae3cf8a5b975](https://medium.com/flawless-app-stories/doubly-linked-lists-swift-4-ae3cf8a5b975)

Each key in the dictionary will reference a node in our linked list. This way, we’ll be able to easily fetch data, as well as update and delete it.

Image Credit: [https://corvostore.github.io/#LRU](https://corvostore.github.io/#LRU)

Building the LRU Cache

Now that we know the data structure makeup of our LRU cache, we can start building it!

class LRUCache(object):
    def __init__(self, max_size=10):
        if max_size <= 0:
            raise Exception('Max size must be larger than zero')
        self.max_size = max_size
        self.list = DoublyLinkedList()
        self.nodes = {}

    def set(self, key, value):
        node = self.nodes.get(key, None)

        # if the key already exists, update the value
        if node is not None:
            node.data.value = value
            self.list.move_front(node)
            return

        # if the cache has reached its max size, remove the least recently used key
        if self.list.capacity == self.max_size:
            expired = self.list.remove_tail()
            del self.nodes[expired.data.key]

        self.nodes[key] = self.list.unshift(KVPair(key, value))

    def get(self, key):
        node = self.nodes.get(key, None)

        # if the key doesn't exist, return None
        if node is None:
            return None

        # move the key to the front of the linked list (most recently used)
        self.list.move_front(node)
        return node.data.value

Looking at this code, you may assume that we're done. But there's a catch. Did you notice the DoublyLinkedList in the constructor? Python doesn't come with a DoublyLinkedList in the standard library.

In fact, the majority of programming languages don’t come equipped with a DLL. And since live coding challenges rarely let you use third-party libraries, we will have to implement it ourselves.

Building the Doubly Linked List

When it comes to crafting a DLL, the most important aspect is the design. There are many different types of linked lists with different tradeoffs, but for our use case, we'll be making a circular DLL with a root (sentinel) node. This gives us quick access to both the head and the tail, which is important for an LRU cache.

Circular doubly linked list w/ root node (Image Credit: Author)

class Node(object):
    def __init__(self, data):
        self.data = data
        self.next = None
        self.prev = None

class DoublyLinkedList(object):
    def __init__(self):
        self.root = Node(None)
        self.capacity = 0

        # the root node should point to itself when the list is empty
        self.root.next = self.root
        self.root.prev = self.root

While linked lists usually come with many convenience methods, we only need to implement a small subset of them:

  1. move_front(node)

  2. unshift(value)

  3. remove_tail()

MoveFront

Whenever a key is fetched from our cache, we will need to make that key the most recently used. This involves moving that key to the front of the DLL.

Implementing move_front can be a bit confusing, as it involves both removing a node from the list and inserting it in a different location. We do this by rewiring the next and prev pointers on the node and its surrounding neighbors.

Move Front animation (Image Credit: Author)

def move_front(self, node):
    # guard against empty nodes
    if node is None:
        return None

    # Step 1: remove the node from its current position
    node.next.prev = node.prev
    node.prev.next = node.next

    # Step 2: change the node so it points to the root and the old head node
    node.prev = self.root
    node.next = self.root.next # old head node

    # Step 3: Update the root node so it points to the new head
    self.root.next.prev = node # update old head prev pointer
    self.root.next = node
    return node

Unshift

The unshift method for a DLL works much like an array's: it inserts a value at the front, or head, of the list. Our LRU cache will use this method whenever set(key, value) is called.

Luckily for us, unshift shares a lot of logic with move_front. If we update move_front slightly, we can use it to implement unshift.

Unshift animation (Image Credit: [https://visualgo.net](https://visualgo.net/en/list))

def unshift(self, data):
    node = Node(data)
    self.move_front(node)
    self.capacity += 1
    return node

def move_front(self, node):
    if node is None:
        return None
    elif node.prev is not None and node.next is not None:
        # if node is already in the list, remove it from its current position
        node.next.prev = node.prev
        node.prev.next = node.next

    node.prev = self.root
    node.next = self.root.next

    self.root.next.prev = node
    self.root.next = node
    return node

RemoveTail

If our cache reaches its max size, we will need to remove the least recently used key. This involves removing the tail of our DLL. Similar to unshift, remove_tail shares some common logic with move_front. While we can’t reuse move_front for remove_tail, we can abstract some of the common logic.

Remove Tail animation (Image Credit: [https://visualgo.net](https://visualgo.net/en/list))

def remove_tail(self):
    # if the list is already empty, bail out
    if self.capacity == 0:
        return None

    removed = self.isolate(self.root.prev)
    self.capacity -= 1
    return removed

# isolate removes a node from its current position
# it also sets its next and prev pointers to None to prevent memory leaks
@staticmethod
def isolate(node):
    node.next.prev = node.prev
    node.prev.next = node.next
    node.next = None
    node.prev = None
    return node

Putting It All Together

When we combine our LRU cache and DLL, this is what we get:

class KVPair(object):
    def __init__(self, k, v):
        self.key = k
        self.value = v

class Node(object):
    def __init__(self, data):
        self.data = data
        self.next = None
        self.prev = None

class DoublyLinkedList(object):
    def __init__(self):
        self.root = Node(None)
        self.capacity = 0

        self.root.next = self.root
        self.root.prev = self.root

    def unshift(self, data):
        node = Node(data)
        self.move_front(node)
        self.capacity += 1
        return node

    def move_front(self, node):
        if node is None:
            return None
        elif node.prev is not None and node.next is not None:
            self.isolate(node)

        node.prev = self.root
        node.next = self.root.next
        self.root.next.prev = node
        self.root.next = node
        return node

    def remove_tail(self):
        if self.capacity == 0:
            return None

        removed = self.isolate(self.root.prev)
        self.capacity -= 1
        return removed

    @staticmethod
    def isolate(node):
        node.next.prev = node.prev
        node.prev.next = node.next
        node.next = None
        node.prev = None
        return node

class LRUCache(object):
    def __init__(self, max_size=10):
        if max_size <= 0:
            raise Exception('Max size must be larger than zero')
        self.max_size = max_size
        self.list = DoublyLinkedList()
        self.nodes = {}

    def set(self, key, value):
        node = self.nodes.get(key, None)
        if node is not None:
            node.data.value = value
            self.list.move_front(node)
            return

        if self.list.capacity == self.max_size:
            expired = self.list.remove_tail()
            del self.nodes[expired.data.key]

        self.nodes[key] = self.list.unshift(KVPair(key, value))

    def get(self, key):
        node = self.nodes.get(key, None)
        if node is None:
            return None

        self.list.move_front(node)
        return node.data.value

And just like that, 10 minutes later, we’ve created a basic LRU cache in less than 100 lines of code. For extra practice, you can try adding the ability to delete a key, creating simple unit tests to prove it works, or creating it in a different language.
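As a starting point for that first exercise, here's one possible delete(key) method (a sketch, not the only way to do it) that reuses the isolate helper from the DLL:

def delete(self, key):
    node = self.nodes.get(key, None)
    if node is None:
        return None

    # unlink the node from the list and drop it from the dictionary
    # (for brevity this reaches into the list's internals; a cleaner
    # design would add a remove(node) method to DoublyLinkedList)
    self.list.isolate(node)
    self.list.capacity -= 1
    del self.nodes[key]
    return node.data.value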

As I mentioned before, if Python isn’t your thing, don’t worry. I have more examples for JavaScript, Ruby, and Go. I’m also open to pull requests for other languages.

I hope you found this deep dive into LRU caches useful and informative. It can be a confusing topic to grasp, but with some practice, you’ll be able to build one in your sleep. So the next time an interviewer tries to challenge you on caches, show them what you’ve learned.
