Which Data Structure Is Used for Implementing LRU Cache in Java?


Larry Thompson


If you are working with large amounts of data in your Java application and need to manage a cache efficiently, the Least Recently Used (LRU) cache algorithm can be a great solution. An LRU cache stores a limited number of items and evicts the least recently used item when the cache is full.

But what data structure should you use to implement this algorithm? Let’s explore some options.

LinkedHashMap

The LinkedHashMap class in Java offers the simplest way to build an LRU cache. It extends the HashMap class and maintains a doubly linked list running through its entries, and it can be constructed to iterate in access order rather than insertion order. In access-order mode, whenever you read or update an entry, that entry moves to the end of the linked list, making it the most recently used item.

To turn a LinkedHashMap into an LRU cache, you override its removeEldestEntry() method. This method is invoked by put (and putAll) after a new entry is inserted, and it decides whether the eldest (least recently used) entry should be removed. By default it returns false, meaning entries are never removed automatically.

To create an LRU cache with a maximum capacity of 100:


// Arguments: 100 = initial capacity, 0.75F = load factor, true = access-order iteration
// (explicit type arguments are required here: the diamond operator cannot be
// combined with an anonymous class before Java 9)
Map<String, String> lruCache = new LinkedHashMap<String, String>(100, 0.75F, true) {
    @Override
    protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
        return size() > 100;  // evict the eldest entry once the size exceeds 100
    }
};
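To see the eviction in action, here is a small self-contained sketch of the same idea. The tiny capacity of three, the class name LruDemo, and the key names are just for illustration:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruDemo {
    public static void main(String[] args) {
        // Capacity of 3 so the eviction is easy to observe
        Map<String, String> cache = new LinkedHashMap<String, String>(16, 0.75F, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                return size() > 3;
            }
        };

        cache.put("a", "1");
        cache.put("b", "2");
        cache.put("c", "3");
        cache.get("a");       // touch "a" so it becomes the most recently used
        cache.put("d", "4");  // evicts "b", now the least recently used entry

        System.out.println(cache.keySet());  // prints [c, a, d]
    }
}
```

Note that iteration order runs from least to most recently used, which is why "c" appears first and the freshly inserted "d" last.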

Doubly Linked List + HashMap

If you prefer more control over your data structure implementation or if you are working on a project where you can’t use the LinkedHashMap class, you can implement an LRU cache using a combination of a doubly linked list and a HashMap.

The doubly linked list stores the data items in usage order, while the HashMap maps each key to its list node for O(1) lookup. Each node in the doubly linked list represents an entry in the cache: entries near the head of the list are the least recently used, and entries near the tail are the most recently used.

Here is a basic implementation:


class Node {
    String key;
    String value;
    Node prev;
    Node next;

    public Node(String key, String value) {
        this.key = key;
        this.value = value;
    }
}

class LRUCache {
    private final int capacity;
    private final Map<String, Node> cache;
    private Node head;
    private Node tail;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        cache = new HashMap<>();
        head = new Node(null, null);
        tail = new Node(null, null);
        head.next = tail;
        tail.prev = head;
    }

    public void put(String key, String value) {
        Node node = cache.get(key);
        if (node != null) {
            node.value = value;       // key already cached: update and mark as recent
            moveToTail(node);
            return;
        }
        if (cache.size() == capacity) {
            Node lru = head.next;     // least recently used entry sits after the head sentinel
            unlink(lru);
            cache.remove(lru.key);
        }
        Node fresh = new Node(key, value);
        cache.put(key, fresh);
        insertBeforeTail(fresh);      // most recently used entries live next to the tail
    }

    public String get(String key) {
        Node node = cache.get(key);
        if (node == null) {
            return null;              // cache miss
        }
        moveToTail(node);             // mark as most recently used
        return node.value;
    }

    private void unlink(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void insertBeforeTail(Node node) {
        node.prev = tail.prev;
        node.next = tail;
        tail.prev.next = node;
        tail.prev = node;
    }

    private void moveToTail(Node node) {
        unlink(node);
        insertBeforeTail(node);
    }
}

Conclusion

Implementing an LRU cache in Java requires careful consideration of which data structure to use. The LinkedHashMap provides a convenient built-in solution that handles most of the complexities for you. However, if you need more control or cannot use LinkedHashMap for some reason, implementing your own LRU cache using a combination of a doubly linked list and a HashMap can be a suitable alternative.

Remember, the choice of data structure depends on your specific requirements and constraints. Evaluate the trade-offs between ease of use, performance, and maintainability before making a decision.

Now that you understand the options available for implementing an LRU cache in Java, you can choose the one that best suits your needs and efficiently manage cache in your applications.
