
Caching Algorithms (LRU Cache).

Implementation of LRU Cache Using Python

What is a Cache?

A cache is a hardware or software component that stores data so that future requests for that data can be served faster. The data stored in a cache might be the result of an earlier computation or a copy of data stored elsewhere.
 

What Is An LRU Cache?

An LRU (Least Recently Used) Cache is a fixed-capacity store of items that lets future accesses to those items be served quickly, and it uses an LRU policy for eviction: when the cache is full, the item that was used least recently is removed to make room for the new one. For example, with a capacity of 2 holding keys A and B, reading A and then inserting C evicts B, because B is the least recently used item.

The Constraints/Operations


    Lookup of cache items must be O(1)

    Addition to the cache must be O(1)


Our Structures

Doubly Linked List: This will hold the items in our cache; we will store at most n items, where n is the capacity.

This structure satisfies the constraint of fast addition, since any node in a doubly linked list can be added or removed in O(1) time as long as we hold references to it and its neighbours.

Hashtable: The hashtable maps each key to its node in the doubly linked list, giving us O(1) access to any item and avoiding an O(n) search. Finding the LRU entry needs no search at all: because every accessed item is moved to the tail, the least recently used item always sits at the front of the list, right after the head.

This is a very common pattern: we use a structure that gives us the insertion or removal properties we need (a BST, a linked list, etc.) and back it up with a hashtable so we do not have to re-traverse the structure every time we need to find an element.
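
As a rough sketch of the pattern (the names here are made up for illustration and are not the final implementation), the hashtable simply holds references straight into the linked structure:

# Illustrative sketch only: a dictionary pointing directly at linked-list nodes.
class ListNode:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

index = {}                 # key -> ListNode
node = ListNode('a', 1)
index['a'] = node          # O(1) insert into the hashtable

found = index['a']         # O(1) lookup, no traversal of the list needed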


Complexities & Time
    Both get and put methods are O( 1 ) in time complexity.
Space
    We use O( n ) space since we will store n items where n is the capacity of the cache.


Approach

We will be using the following approach for implementing the LRU Cache.


1) Doubly Linked List:
A doubly linked list is easy to manipulate: since each node holds references to both its previous and next nodes, we can remove any node from the middle of the list in O(1).

2) Hashing Table:
The hashing table maintains the data and lets us check whether a given key is present in the linked list or not.
For the hashing table I'll be using a dictionary in Python (a short illustration follows below).
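
As a quick, hypothetical illustration, a Python dictionary gives us O(1) average-time membership checks and lookups, which is exactly what the cache needs:

# Hypothetical illustration: dictionary membership checks and lookups are O(1) on average.
lookup = {}
lookup['a'] = 'node for key a'

print('a' in lookup)    # True  -- O(1) membership check
print('b' in lookup)    # False -- missing keys are just as cheap to detect
print(lookup['a'])      # O(1) lookup by key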


Code 

The code for the following sections can be found at the link.
Node 

For the node, let's create a class that will hold the following values:
  1. Node Key -- the key through which the cached value will be fetched
  2. Value -- the value stored for the key
  3. Next Node -- a reference to the next node
  4. Previous Node -- a reference to the previous node
class Node:
    # A doubly-linked-list node holding one cached key/value pair.
    def __init__(self, node_key, value):
        self.node_key = node_key
        self.value = value
        self.next_node = None   # neighbour towards the tail (more recently used)
        self.prev_node = None   # neighbour towards the head (less recently used)
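
A tiny sketch of how two nodes link together (purely illustrative, not part of the cache class itself):

first = Node('a', 1)
second = Node('b', 2)
first.next_node = second    # 'a' points forward to 'b'
second.prev_node = first    # 'b' points back to 'a'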

LRU Cache Code.

Check the following implementation of the cache:
from collections import OrderedDict

class LRUCache:

    def __init__(self, capacity):
        self.capacity = capacity
        # Sentinel head and tail nodes; real entries live between them.
        self.head = Node('head', 'head')
        self.tail = Node('tail', 'tail')
        self.head.next_node = self.tail
        self.tail.prev_node = self.head
        self.current_size = 0
        self.hash_table = OrderedDict()  # key -> Node (a plain dict would also work)

    def remove_node(self, node):
        # Unlink the node from the list in O(1) by rewiring its neighbours.
        node_after = node.next_node
        node_before = node.prev_node
        node_before.next_node = node_after
        node_after.prev_node = node_before

    def add_node(self, node):
        # Insert the node just before the tail sentinel (the most recently used spot).
        prev_node_tail = self.tail.prev_node
        prev_node_tail.next_node = node
        node.prev_node = prev_node_tail
        node.next_node = self.tail
        self.tail.prev_node = node

    def get(self, node_key: int) -> int:
        if node_key not in self.hash_table:
            return -1
        node = self.hash_table[node_key]
        node_val = node.value
        # Move the node to the tail so it becomes the most recently used.
        self.remove_node(node)
        self.add_node(node)
        return node_val

    def put(self, node_key: int, new_val: int) -> None:
        if node_key in self.hash_table:
            # Key already exists: update its value and mark it most recently used.
            node = self.hash_table[node_key]
            node.value = new_val
            self.remove_node(node)
            self.add_node(node)
        else:
            if self.current_size >= self.capacity:
                # Cache is full: evict the least recently used node (right after head).
                lru_node = self.head.next_node
                del self.hash_table[lru_node.node_key]
                self.remove_node(lru_node)
                self.current_size -= 1

            new_node = Node(node_key, new_val)
            self.hash_table[node_key] = new_node
            self.add_node(new_node)
            self.current_size += 1
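
Here is a quick usage sketch (the keys and values are made up for illustration):

cache = LRUCache(2)
cache.put(1, 100)
cache.put(2, 200)
print(cache.get(1))    # 100 -- key 1 becomes the most recently used
cache.put(3, 300)      # cache is full, so key 2 (the least recently used) is evicted
print(cache.get(2))    # -1  -- key 2 is no longer in the cache
print(cache.get(3))    # 300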


So that's our simple LRU Cache. 😀
