Implementation of LRU Cache Using Python
What is a Cache?
A cache is a hardware or software component that stores data so that future requests for that data can be served faster. The data stored in a cache might be the result of an earlier computation or a copy of data stored elsewhere.
What Is An LRU Cache?
So an LRU cache is a store of items that lets future accesses to those items be served quickly, with a Least Recently Used (LRU) policy deciding which item to evict when the cache is full.
The Constraints/Operations
Lookup of cache items must be O(1)
Addition to the cache must be O(1)
Our Structures
Doubly Linked List: This will hold the items in our cache; we will store up to n items, where n is the capacity.
This structure satisfies the constraint for fast addition, since any node in a doubly linked list can be added or removed in O(1) time once we hold a reference to it.
Hashtable: The hashtable gives us O(1) access to any node in the doubly linked list, avoiding an O(n) search. The LRU entry itself needs no search at all: in our implementation, recently used nodes are moved toward the tail, so the least recently used node always sits right after the head.
This is a very common pattern: we use a structure with the special insertion or removal properties we need (a BST, a linked list, etc.) and back it up with a hashtable so we never have to re-traverse the structure to find elements. A sketch of how the two pieces fit together follows.
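Here is a minimal sketch of that pattern; _DemoNode and index are made-up names for this illustration only, not part of the cache code further down.
_DemoNode and the dict together give O(1) access and O(1) unlinking:
class _DemoNode:
    def __init__(self, key):
        self.key = key
        self.prev = None    # reference to the previous node
        self.next = None    # reference to the next node

a, b = _DemoNode('a'), _DemoNode('b')
a.next, b.prev = b, a       # a <-> b, a two-node doubly linked list
index = {'a': a, 'b': b}    # the hashtable: key -> node

node = index['b']           # O(1) jump straight to the node
print(node.prev.key)        # 'a' -- neighbors are one reference away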
Complexities & Time
Both the get and put methods are O(1) in time complexity.
Space
We use O(n) space, since we store up to n items, where n is the capacity of the cache.
Approach
We will use the following approach for implementing the LRU cache.
1) Doubly Linked List:
A doubly linked list is easy to manipulate: since each node keeps references to both its previous and its next neighbor, you can remove any node from the middle of the list in O(1).
2) Hashing Table:
A hash table is used to hold the data and to check whether a key is present in the linked list or not. For the hash table I'll be using a dictionary in Python; a quick check of the dictionary operations we rely on is shown below.
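As a small sanity check (illustration only, not part of the cache), Python's dict gives us the O(1) average-time operations the approach depends on:
table = {}
table['a'] = 1            # O(1) average insertion
print('a' in table)       # O(1) average membership test -> True
print(table.get('b'))     # missing keys return None with .get() -> None
del table['a']            # O(1) average deletion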
Code
Each node carries four fields:
- Node Key -- the key by which the cached value is fetched
- Value -- the value stored for the key
- Next Node -- reference to the next node
- Previous Node -- reference to the previous node
class Node:
    def __init__(self, node_key, value):
        self.node_key = node_key
        self.value = value
        self.next_node = None
        self.prev_node = None
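To see how these nodes get wired up, here is the empty-list shape that the cache constructor below builds, shown on its own (illustration only):
head = Node('head', 'head')
tail = Node('tail', 'tail')
head.next_node = tail    # head <-> tail: an empty list between two sentinels
tail.prev_node = head
print(head.next_node is tail, tail.prev_node is head)   # True True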
LRU Cache Code.
class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        # Sentinel head and tail nodes keep add/remove logic uniform:
        # real nodes always live between them.
        self.head = Node('head', 'head')
        self.tail = Node('tail', 'tail')
        self.head.next_node = self.tail
        self.tail.prev_node = self.head
        self.current_size = 0
        # A plain dict is enough here: recency order is tracked by the
        # linked list, not by the hash table.
        self.hash_table = {}

    def remove_node(self, node):
        # Unlink a node from the list in O(1).
        node_after = node.next_node
        node_before = node.prev_node
        node_before.next_node = node_after
        node_after.prev_node = node_before

    def add_node(self, node):
        # Insert a node just before the tail sentinel,
        # i.e. at the most recently used end.
        prev_node_tail = self.tail.prev_node
        prev_node_tail.next_node = node
        node.prev_node = prev_node_tail
        node.next_node = self.tail
        self.tail.prev_node = node

    def get(self, node_key: int) -> int:
        if node_key not in self.hash_table:
            return -1
        node = self.hash_table[node_key]
        node_val = node.value
        # Move the node to the most recently used position.
        self.remove_node(node)
        self.add_node(node)
        return node_val

    def put(self, node_key: int, new_val: int) -> None:
        if node_key in self.hash_table:
            # Update the value and mark the node as most recently used.
            node = self.hash_table[node_key]
            node.value = new_val
            self.remove_node(node)
            self.add_node(node)
        else:
            if self.current_size >= self.capacity:
                # Evict the least recently used node, which always
                # sits right after the head sentinel.
                lru_node = self.head.next_node
                del self.hash_table[lru_node.node_key]
                self.remove_node(lru_node)
                self.current_size -= 1
            new_node = Node(node_key, new_val)
            self.hash_table[node_key] = new_node
            self.add_node(new_node)
            self.current_size += 1
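A quick run-through (the keys and values here are arbitrary):
cache = LRUCache(2)
cache.put(1, 10)
cache.put(2, 20)
print(cache.get(1))    # 10 -- key 1 is now the most recently used
cache.put(3, 30)       # over capacity: evicts key 2, the LRU entry
print(cache.get(2))    # -1 -- key 2 was evicted
print(cache.get(3))    # 30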
So that's our simple LRU Cache. 😀