Implementing an LRU Cache in Java with LinkedHashMap

🚀 LinkedHashMap = Built-in LRU Cache in Java

If you’ve ever implemented an LRU Cache in interviews, this is your shortcut 👇

What is it?
LinkedHashMap maintains insertion order (or access order) using a hash table plus a doubly linked list threaded through its entries. With one small tweak, it behaves exactly like an LRU Cache.

Why use it?
• No need to manually manage a DLL + HashMap
• O(1) get & put operations
• Clean, interview-friendly implementation

How it works (LRU mode)
Pass accessOrder = true to the constructor → recently accessed entries move to the end of the iteration order. Override removeEldestEntry so the entry at the front (the least recently used) is evicted once the cache exceeds capacity.

Example 👇

import java.util.LinkedHashMap;
import java.util.Map;

class LRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LRUCache(int capacity) {
        super(capacity, 0.75f, true); // accessOrder = true
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the least recently used entry
    }
}

Usage 👇

LRUCache<Integer, String> cache = new LRUCache<>(3);
cache.put(1, "A");
cache.put(2, "B");
cache.put(3, "C");
cache.get(1);                       // access → key 1 becomes most recent
cache.put(4, "D");                  // over capacity → evicts key 2 (the LRU)
System.out.println(cache.keySet()); // prints [3, 1, 4]

Flow 🧠
1️⃣ Insert → entry goes to the end
2️⃣ Access → entry moves to the end
3️⃣ Capacity exceeded → eldest entry at the front is removed (LRU)

Result
An efficient LRU cache in just ~10 lines of code ✅

Rule of Thumb
👉 If the interviewer asks for an LRU cache → first explain the classic DLL + HashMap design (a sketch follows below), then optimize it with LinkedHashMap.
👉 If you are preparing for Java backend interviews, connect & follow - I share short, practical backend concepts regularly.

#Java #DSA #BackendDevelopment #SystemDesign #CodingInterview
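Bonus 👇 For completeness, here is a minimal hand-rolled sketch of the DLL + HashMap approach the rule of thumb mentions (ManualLRUCache and its helper names are my own illustrative choices, not from the JDK): a HashMap gives O(1) node lookup, and a doubly linked list with sentinel head/tail nodes tracks recency (head side = least recent, tail side = most recent).

import java.util.HashMap;
import java.util.Map;

class ManualLRUCache<K, V> {
    // Doubly linked list node; the list orders entries from LRU (head side) to MRU (tail side).
    private static class Node<K, V> {
        K key;
        V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> map = new HashMap<>();
    private final Node<K, V> head = new Node<>(null, null); // sentinel before the LRU entry
    private final Node<K, V> tail = new Node<>(null, null); // sentinel after the MRU entry

    ManualLRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node<K, V> node = map.get(key);
        if (node == null) return null;
        moveToTail(node); // access → becomes most recent
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = map.get(key);
        if (node != null) {           // existing key → update and mark as most recent
            node.value = value;
            moveToTail(node);
            return;
        }
        if (map.size() == capacity) { // full → evict the least recently used entry
            Node<K, V> lru = head.next;
            unlink(lru);
            map.remove(lru.key);
        }
        node = new Node<>(key, value);
        map.put(key, node);
        linkBeforeTail(node);
    }

    private void unlink(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void linkBeforeTail(Node<K, V> node) {
        node.prev = tail.prev;
        node.next = tail;
        tail.prev.next = node;
        tail.prev = node;
    }

    private void moveToTail(Node<K, V> node) {
        unlink(node);
        linkBeforeTail(node);
    }
}

Every operation is one HashMap lookup plus a constant number of pointer updates, so get and put stay O(1). In an interview, walk through this version first, then point out that LinkedHashMap does exactly this bookkeeping for you.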
