LRU (Least Recently Used) Cache | Algorithm & Implementation

LRU (Least Recently Used) is a cache replacement algorithm that evicts the least recently used item when the cache is full. In other words, when the cache reaches its maximum capacity, it removes the item that has gone unused the longest to make space for new items.
It works by tracking recency with a doubly linked list: when an item in the cache is accessed, it is moved to the front of the list, and when the cache exceeds its capacity, the item at the end of the list is removed.

Note:
We’re using a doubly linked list for our cache instead of an array or singly linked list for the following reasons:
1- An array won’t work because we need a data structure that supports insertion and removal in O(1) time, without shifting the remaining elements after each insertion or deletion.
2- A singly linked list won’t work either, because removing a node requires a pointer to its predecessor; given only a pointer to the node itself, we would have to traverse the list to find that predecessor, which is inefficient for an LRU cache. (There is a trick to delete a node from a singly linked list using only a pointer to that node, by copying the next node’s data into it, but a doubly linked list is still preferred, not least because the trick fails for the last node.)
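To illustrate point 2, here is a minimal sketch (the names `DNode` and `unlink` are illustrative, not part of the implementation below) showing why a doubly linked list allows removal in O(1) given only a pointer to the node: both neighbors are directly reachable, so no traversal is needed.

```java
// Minimal doubly linked list node; illustrative only.
class DNode {
    int value;
    DNode prev, next;
    DNode(int value) { this.value = value; }
}

public class UnlinkDemo {
    // Unlink `node` in O(1): its predecessor and successor are reachable
    // directly through prev/next, so no list traversal is needed.
    static void unlink(DNode node) {
        if (node.prev != null) node.prev.next = node.next;
        if (node.next != null) node.next.prev = node.prev;
        node.prev = node.next = null;
    }

    public static void main(String[] args) {
        DNode a = new DNode(1), b = new DNode(2), c = new DNode(3);
        a.next = b; b.prev = a;
        b.next = c; c.prev = b;

        unlink(b); // remove the middle node without traversing from the head

        System.out.println(a.next.value); // 3
        System.out.println(c.prev.value); // 1
    }
}
```

With a singly linked list, `unlink` would need the predecessor pointer, which only a traversal from the head can supply.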

Code Implementation

import java.util.HashMap;

class LRUCache {
    private int capacity;
    private HashMap<Integer, Node> map;
    private Node head;
    private Node tail;

    class Node {
        int key;
        int value;
        Node prev;
        Node next;

        public Node (int key, int value) {
            this.key = key;
            this.value = value;
        }
    }

    public LRUCache (int capacity) {
        this.capacity = capacity;
        this.map = new HashMap<>();
        this.head = new Node(0, 0); // dummy head sentinel
        this.tail = new Node(0, 0); // dummy tail sentinel
        head.next = tail;
        tail.prev = head;
    }

    // Returns the value for key, or -1 if absent; marks the entry as most recently used.
    public int get (int key) {
        Node node = map.get(key);

        if (node == null) {
            return -1;
        }

        remove(node);
        add(node);
        return node.value;
    }

    // Inserts or updates key; evicts the least recently used entry if over capacity.
    public void put (int key, int value) {
        Node node = map.get(key);

        if (node != null) {
            remove(node);
            node.value = value;
            add(node);
        } else {
            node = new Node(key, value);
            map.put(key, node);
            add(node);

            if (map.size() > capacity) {
                Node removed = tail.prev;
                remove(removed);
                map.remove(removed.key);
            }

        }
    }

    // Inserts node right after the head sentinel (the most recently used position).
    private void add (Node node) {
        Node headNext = head.next;
        head.next = node;
        node.prev = head;
        node.next = headNext;
        headNext.prev = node;
    }

    // Unlinks node from the list in O(1).
    private void remove (Node node) {
        Node prev = node.prev;
        Node next = node.next;
        prev.next = next;
        next.prev = prev;
    }
    
    public String toString () {
        StringBuilder sb = new StringBuilder();
        Node current = head.next;
        while (current != tail) {
            sb.append(current.key).append(": ").append(current.value).append("\n");
            current = current.next;
        }

        return sb.toString();
    }
}

public class Main {
    public static void main (String[] args) {
        LRUCache cache = new LRUCache(2);

        cache.put(1, 100);
        cache.put(2, 200);
        System.out.println("Current Cache Items: \n" + cache.toString());

        cache.get(1);
        System.out.println("Current Cache Items: \n" + cache.toString());

        cache.put(3, 300);
        System.out.println("Current Cache Items: \n" + cache.toString());

        cache.get(2);
        System.out.println("Current Cache Items: \n" + cache.toString());

        cache.put(4, 400);
        System.out.println("Current Cache Items: \n" + cache.toString());
    }
}

Output

Current Cache Items: 
2: 200
1: 100

Current Cache Items: 
1: 100
2: 200

Current Cache Items: 
3: 300
1: 100

Current Cache Items: 
3: 300
1: 100

Current Cache Items: 
4: 400
3: 300

Note:
1- A newly added node is inserted at the head of the cache (the doubly linked list).
2- Cache items are printed in order from most recently used to least recently used.

This Java implementation uses a HashMap to store the key-value pairs and a doubly linked list to maintain the order of the cache items. The node nearest the head sentinel is the most recently used item, and the node nearest the tail sentinel is the least recently used.
In the get method, we first check whether the key exists in the HashMap. If it does, we remove the node from its current position in the linked list and re-add it at the head to mark it as the most recently used item. Finally, we return the node's value.
In the put method, we first check whether the key already exists in the HashMap. If it does, we remove the node from its current position, update its value, and add it back at the head of the list. If the key does not exist, we create a new node and add it to both the head of the list and the HashMap. If the size of the HashMap then exceeds the capacity, we remove the least recently used item (the node just before the tail sentinel) from both the HashMap and the linked list.
This implementation has a time complexity of O(1) for both get and put operations, as all the operations involved in maintaining the cache are constant-time operations.

Code Implementation using Java's LinkedList class

import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;

class LRUCache {

    private final int capacity;
    private final Map<Integer, Integer> map;
    private final LinkedList<Integer> list;

    public LRUCache (int capacity) {
        this.capacity = capacity;
        this.map = new HashMap<>();
        this.list = new LinkedList<>();
    }

    public int get (int key) {
        if (map.containsKey(key)) {
            list.remove(Integer.valueOf(key)); // Integer.valueOf avoids the remove(int index) overload; the scan is O(n)
            list.addFirst(key);
            return map.get(key);
        } else {
            return -1;
        }
    }

    public void put (int key, int value) {

        if (map.containsKey(key)) {
            list.remove(Integer.valueOf(key));
        } else if (map.size() == capacity) {
            int last = list.removeLast();
            map.remove(last);
        }

        map.put(key, value);
        list.addFirst(key);
    }

    public void displayCache () {
        System.out.println("Current Cache Items:");
        for (Integer key : this.list) {
            System.out.println(key + ": " + this.map.get(key));
        }
        System.out.print("\n");
    }
}


public class Main {

    public static void main (String[] args) {
        int capacity = 3;
        LRUCache cache = new LRUCache(capacity);

        cache.put(1, 100);
        cache.put(2, 200);
        cache.displayCache();

        cache.put(4, 400);
        cache.displayCache();

        cache.put(2, 500);
        cache.displayCache();

        cache.put(5, 600);
        cache.displayCache();
    }
}

Output

Current Cache Items:
2: 200
1: 100

Current Cache Items:
4: 400
2: 200
1: 100

Current Cache Items:
2: 500
4: 400
1: 100

Current Cache Items:
5: 600
2: 500
4: 400

