Java HashMap Internals: .put() Method Explanation

Internal Working of HashMap in Java — How .put() Works

When .put(key, value) is called for the first time, HashMap internally performs the following steps:

Step 1 — Array of Buckets is Created
An array of 16 buckets (index 0 to 15) is initialized by default. Each bucket acts as a slot to store data.

Step 2 — Hash Calculation
.put() internally calls putVal(), passing it the result of hash(key). The hash() method calls key.hashCode() and then spreads the higher bits of that value into the lower ones (h ^ (h >>> 16)), so that small tables still benefit from the full hash.

Step 3 — Index Generation
Inside putVal(), a bitwise AND is performed between the hash and (n - 1), where n is the table size (16 by default):
index = hash & (n - 1)
Because n is always a power of two, this is equivalent to hash % n and yields the exact bucket index where the key-value pair will be stored.

Step 4 — Node Creation
A Node is created and linked to the calculated bucket index. Each Node contains:
🔹 hash — the hash value
🔹 key — the key
🔹 value — the value
🔹 next — pointer to the next node

What happens when .put() is called again?

The same hash calculation runs, and two scenarios are possible:

Case 1 — Same Index, Same Key (Update)
If the generated index points to an existing node whose key equals the new key, the value is simply replaced with the new one.

Case 2 — Same Index, Different Key (Collision)
If the index is the same but the keys differ, a collision occurs. The new node is linked to the previous node via the next pointer, forming a linked list inside that bucket. In Java 8+, once a bucket's list grows past 8 nodes (and the table has at least 64 buckets), it is converted into a red-black tree to keep lookups fast.

That's how HashMap handles collisions and maintains data internally — simple but powerful.

#Java #HashMap #DataStructures #BackendDevelopment #Programming #TechLearning #JavaDeveloper
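The hash-and-index steps above can be sketched in plain Java. This is a minimal stand-alone mirror of the logic (class and method names here are invented for illustration; the real logic lives inside java.util.HashMap's private hash() and putVal()):

```java
// Sketch of how HashMap derives a bucket index from a key's hashCode().
// hash(): spread the high bits into the low bits, as OpenJDK's HashMap.hash() does.
// indexFor(): bitwise AND with (n - 1), valid because n is a power of two.
public class HashIndexDemo {
    static int hash(Object key) {
        int h;
        // null keys are allowed in HashMap; they always hash to 0 (bucket 0)
        return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
    }

    static int indexFor(Object key, int n) {
        return hash(key) & (n - 1); // n = number of buckets, e.g. 16 by default
    }

    public static void main(String[] args) {
        int n = 16; // default capacity
        System.out.println("\"hello\" -> bucket " + indexFor("hello", n));

        // "Aa" and "BB" are a classic String hashCode() collision:
        // both hash to 2112, so they always land in the same bucket.
        System.out.println("Aa".hashCode() == "BB".hashCode());            // true
        System.out.println(indexFor("Aa", n) == indexFor("BB", n));        // true
    }
}
```

The "Aa"/"BB" pair shows why Case 2 below exists: two different keys can share a bucket, and HashMap must chain them rather than overwrite.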


Good breakdown, especially the bitwise index calculation part; that's where a lot of people miss what's really happening. One thing that becomes important in real systems is how collisions behave under load, especially before treeification kicks in. Poor hash distribution or bad key design can quietly degrade performance even if everything looks fine at small scale.
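The bad-key-design point is easy to demonstrate. Here is a hypothetical key class (BadKey is an invented name for this sketch) with a deliberately constant hashCode(): every entry collides into one bucket, so lookups degrade toward a linked-list scan (or a red-black tree search once the bin treeifies in Java 8+), yet the map still behaves correctly:

```java
import java.util.HashMap;
import java.util.Map;

public class BadKeyDemo {
    // Worst-case key: hashCode() is constant, so all instances share one bucket.
    static final class BadKey {
        final int id;
        BadKey(int id) { this.id = id; }
        @Override public int hashCode() { return 42; } // constant: every key collides
        @Override public boolean equals(Object o) {
            return o instanceof BadKey && ((BadKey) o).id == id;
        }
    }

    public static void main(String[] args) {
        Map<BadKey, Integer> map = new HashMap<>();
        for (int i = 0; i < 1000; i++) {
            map.put(new BadKey(i), i); // all 1000 entries pile into the same bucket
        }
        // Correctness is preserved; only performance suffers.
        System.out.println(map.size());               // 1000
        System.out.println(map.get(new BadKey(500))); // 500
    }
}
```

At small sizes this looks fine in testing, which is exactly how the problem stays hidden until the map grows.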
