Hello Everyone, Day 19 / #100DaysOfCode

LeetCode 1292 — Maximum Side Length of a Square with Sum ≤ Threshold (Medium)

Problem (brief): Given an m × n matrix of non-negative integers and a threshold, find the maximum side length of a square submatrix whose sum is ≤ threshold.

Why this was tricky (for me): Matrix problems still slow me down. Not because they’re impossible—but because without the right preprocessing, everything turns into brute-force chaos. Once again, the editorial clarified the first key idea.

Core Idea: Use a 2D prefix sum matrix to compute the sum of any square in O(1) time. Let P[i][j] store the sum of the submatrix (1,1) → (i,j). Then any square sum can be computed as:

sum = P[x2][y2] - P[x1-1][y2] - P[x2][y1-1] + P[x1-1][y1-1]

Optimized Enumeration Strategy: Naively, we’d use three nested loops:
- Top-left corner (i, j)
- Side length c

But two important optimizations help:
1. Monotonicity: All values are non-negative. If a square of size c exceeds the threshold, any larger square at the same position will also fail → break early.
2. Global Best Tracking: If we’ve already found a valid square of size ans, there’s no point checking sizes ≤ ans again. Start directly from ans + 1.

These two observations significantly reduce unnecessary checks.

Complexity:
- Time: O(m · n · min(m, n)) in the worst case, much less in practice thanks to the pruning
- Space: O(m · n) for the prefix sums

Takeaway: This wasn’t about inventing a clever trick—it was about recognizing when prefix sums + pruning turn brute force into something practical. Matrix problems are still uncomfortable for me, but at least now I know why the solution works—not just that it works.

#Leetcode #DSA #Java
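The steps above can be sketched in Java as follows (class and method names are my own; this is a minimal illustration of the prefix sum plus the two prunings, not the post's exact code):

```java
class MaxSquare {
    // LeetCode 1292: largest square side with sum <= threshold,
    // using a 2D prefix sum plus the two prunings described above.
    static int maxSideLength(int[][] mat, int threshold) {
        int m = mat.length, n = mat[0].length;
        int[][] p = new int[m + 1][n + 1]; // p[i][j] = sum of mat[0..i-1][0..j-1]
        for (int i = 1; i <= m; i++)
            for (int j = 1; j <= n; j++)
                p[i][j] = mat[i - 1][j - 1] + p[i - 1][j] + p[i][j - 1] - p[i - 1][j - 1];

        int ans = 0;
        for (int i = 1; i <= m; i++) {
            for (int j = 1; j <= n; j++) {
                // Global best tracking: only try sides larger than the current answer.
                for (int c = ans + 1; i + c - 1 <= m && j + c - 1 <= n; c++) {
                    int x2 = i + c - 1, y2 = j + c - 1;
                    int sum = p[x2][y2] - p[i - 1][y2] - p[x2][j - 1] + p[i - 1][j - 1];
                    if (sum > threshold) break; // monotonicity: bigger squares only grow
                    ans = c;
                }
            }
        }
        return ans;
    }

    public static void main(String[] args) {
        int[][] mat = {{1,1,3,2,4,3,2}, {1,1,3,2,4,3,2}, {1,1,3,2,4,3,2}};
        System.out.println(maxSideLength(mat, 4)); // prints 2 (LeetCode example)
    }
}
```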
Max Square Side Length in Matrix with Sum ≤ Threshold
Day 33 / #100DaysOfCode

LeetCode 3013 — Divide an Array Into Subarrays With Minimum Cost II (Hard)

Problem (short): Given an array nums, divide it into k subarrays such that:
- The first subarray must start at index 0
- The distance between the starts of the 2nd and kth subarrays is at most dist
- The total cost (sum of the starting elements of all subarrays) is minimized

This is the Hard follow-up of Part I, and the constraints make brute force impossible.

Key Challenge: Once k becomes variable, we are no longer choosing just 2 elements. We must dynamically maintain the minimum sum of (k-2) valid starting elements inside a moving window. This is where simple sorting or greedy breaks down.

Core Idea / Observation:
- nums[0] is always included
- For each possible window position, we need:
  • the smallest (k-2) elements from a sliding window
  • fast insertion and deletion
  • fast access to the smallest elements

This naturally leads to a two-multiset (two TreeMap) strategy.

Approach:
1. Fix nums[0] as the start of the first subarray
2. Use two TreeMaps:
  • st1: stores the smallest (k-2) elements (contributes to the sum)
  • st2: stores the remaining elements
3. Maintain balance using:
  • automatic shifting between maps
  • invariant: st1 always contains exactly the (k-2) smallest elements
4. Slide the window while respecting the dist constraint
5. At each step:
  • compute current cost = nums[0] + sum(st1) + current element
  • update the minimum

I did take help from the editorial here — to understand how to maintain the invariant correctly. This problem is more about data-structure discipline than raw logic.

Complexity:
- Time: O(n log n)
- Space: O(k) (due to the TreeMaps)

#Leetcode #DSA #Java
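The heart of this approach is the two-TreeMap balancing step. Below is a standalone Java sketch of just that primitive — my own illustration, not the post's code (the class name and the counts-as-values multiset encoding are assumptions): it maintains the running sum of the k smallest elements under insertions and removals.

```java
import java.util.TreeMap;

// Illustrative primitive: sum of the k smallest elements in a changing multiset.
// TreeMap values are multiplicities, since Java has no built-in multiset.
class KSmallestSum {
    private final int k;
    private final TreeMap<Integer, Integer> small = new TreeMap<>(); // the k smallest
    private final TreeMap<Integer, Integer> large = new TreeMap<>(); // everything else
    private int smallCount = 0;
    private long smallSum = 0;

    KSmallestSum(int k) { this.k = k; }

    void add(int x) {
        small.merge(x, 1, Integer::sum);
        smallCount++; smallSum += x;
        rebalance();
    }

    // Caller must only remove values currently present.
    void remove(int x) {
        if (small.containsKey(x)) {
            dec(small, x); smallCount--; smallSum -= x;
        } else {
            dec(large, x);
        }
        rebalance();
    }

    long sum() { return smallSum; } // valid once at least k elements are present

    private void rebalance() {
        while (smallCount > k) {              // demote largest of 'small'
            int v = small.lastKey();
            dec(small, v); smallCount--; smallSum -= v;
            large.merge(v, 1, Integer::sum);
        }
        while (smallCount < k && !large.isEmpty()) { // promote smallest of 'large'
            int v = large.firstKey();
            dec(large, v);
            small.merge(v, 1, Integer::sum); smallCount++; smallSum += v;
        }
    }

    private static void dec(TreeMap<Integer, Integer> m, int x) {
        if (m.merge(x, -1, Integer::sum) == 0) m.remove(x);
    }

    public static void main(String[] args) {
        KSmallestSum window = new KSmallestSum(2);
        window.add(5); window.add(1); window.add(3);
        System.out.println(window.sum()); // prints 4 (1 + 3)
    }
}
```

The full solution then slides this window under the dist constraint and adds nums[0] plus the current element at each step.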
🚨 Your favourite collection might be your biggest bottleneck.

We’ve all been there — we start out with ArrayList. It’s simple, fast, and works well for most cases. But as your application grows, that one-size-fits-all approach quietly becomes the reason for mysterious slowdowns.

The Java Collections Framework (JCF) isn’t a buffet of similar options — it’s a power toolset designed for specific trade-offs. Knowing which one to pick can make your system faster and your code cleaner.

Here’s a quick reality check 👇

ArrayList ➡️ Great for fast random access and cheap appends; but costly middle inserts or deletes.
LinkedList ➡️ Best for frequent head/tail updates; but worse cache locality and higher memory cost.
HashMap / HashSet ➡️ O(1) average lookups; but no ordering, and hash collisions can bite you if keys aren’t well-designed.
LinkedHashMap ➡️ Preserves insertion order — perfect for caches or predictable APIs.
TreeMap / TreeSet ➡️ Keeps keys sorted; excellent when order matters more than raw speed.

⚙️ A few pro tips:
- Use ArrayList for “read-mostly” lists. If you’re iterating more than mutating, it’s your go-to.
- Avoid LinkedList unless you really need it. On most modern JVMs it’s slower in practice due to poor cache performance.
- Use EnumMap or EnumSet when keys are enums — highly memory- and speed-efficient.
- Initialize collection sizes upfront. This avoids resizing overhead, especially in tight loops.
- Prefer immutability — List.of() and Map.of() are your friends for static data.

Choosing the right collection isn’t a micro-optimization — it’s a design decision that compounds as your system scales.

So, what do you think? 👉 Which Java collection is overused or misused most in your experience?

#Java #CollectionsFramework #DataStructures #Performance #SoftwareEngineering #CleanCode
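Two of these tips, presizing and the immutable factories, are easy to demonstrate in a few lines (the class name and sample data are mine):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

class CollectionTips {
    public static void main(String[] args) {
        // Presize when the final size is known: avoids repeated internal array copies.
        List<Integer> ids = new ArrayList<>(10_000);
        for (int i = 0; i < 10_000; i++) ids.add(i);

        // Immutable factories for static data: compact and safe to share.
        List<String> roles = List.of("ADMIN", "USER");
        Map<String, Integer> limits = Map.of("free", 10, "pro", 100);

        boolean rejected = false;
        try {
            roles.add("GUEST"); // List.of produces an unmodifiable list
        } catch (UnsupportedOperationException e) {
            rejected = true;
        }
        System.out.println(rejected); // prints true
    }
}
```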
Day 38: Efficiency through String Balancing! ⚖️

Problem 1653: Minimum Deletions to Make String Balanced

Today’s challenge was to find the minimum deletions needed to ensure no 'b' comes before an 'a' in a string. It’s a classic problem that tests your ability to think about state transitions across a sequence.

I implemented a prefix/suffix counting strategy. Instead of brute-forcing every deletion possibility, I pre-calculated the total count of 'a's and then iterated through the string while keeping a running count of 'b's. At each position, the sum of 'b's seen so far (to be deleted if they stay) and 'a's remaining (to be deleted if they appear later) gave me the cost to balance at that specific "split point."

By tracking the minimum sum across all possible split points, I achieved a clean O(n) solution with just two passes. It’s a great reminder that sometimes "balancing" a problem is just about counting what’s on either side of the fence!

#LeetCode #Java #StringAlgorithms #Optimization #CodingChallenge #ProblemSolving #Algorithms
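The counting strategy described above fits in a few lines of Java (my own sketch; the method name follows the LeetCode signature):

```java
class BalanceString {
    // Two passes: count the 'a's, then scan once. At each split point the cost is
    // (number of 'b' seen so far) + (number of 'a' remaining to the right).
    static int minimumDeletions(String s) {
        int aRemaining = 0;
        for (int i = 0; i < s.length(); i++)
            if (s.charAt(i) == 'a') aRemaining++;

        int bSeen = 0;
        int best = aRemaining; // split before index 0: delete every 'a'
        for (int i = 0; i < s.length(); i++) {
            if (s.charAt(i) == 'a') aRemaining--; else bSeen++;
            best = Math.min(best, bSeen + aRemaining);
        }
        return best;
    }

    public static void main(String[] args) {
        System.out.println(minimumDeletions("aababbab"));  // prints 2
        System.out.println(minimumDeletions("bbaaaaabb")); // prints 2
    }
}
```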
𝗗𝗮𝘆 𝟯𝟭/𝟲𝟬: 𝗪𝗵𝘆 𝗔𝗿𝗿𝗮𝘆𝘀 𝗙𝗮𝗶𝗹 𝗮𝘁 𝗦𝗰𝗮𝗹𝗲 🔗

Most developers stick to arrays because they are easy. But in high-frequency backend systems, "easy" can be expensive. If you need to insert a record into the middle of an array of 1,000 elements, you may have to shift up to 1,000 elements. That is O(n) waste.

Today, I’m moving into the second half of my 60-day challenge by mastering linked lists.

𝗧𝗵𝗲 𝗔𝗿𝗰𝗵𝗶𝘁𝗲𝗰𝘁𝘂𝗿𝗮𝗹 𝗦𝗵𝗶𝗳𝘁: Arrays are contiguous. Linked lists are distributed. By using nodes and references, we trade random access for cheap insertion and deletion.

𝗪𝗵𝘆 this matters in a Java environment:
- O(1) Efficiency: Inserting at the head is constant time, regardless of list size.
- No Memory Resizing: Unlike ArrayList, we don't have to "copy and double" the capacity.
- Heap Mastery: It’s a great way to understand how the JVM actually stores objects.

𝗧𝗵𝗲 𝗥𝗲𝗮𝗹-𝗪𝗼𝗿𝗹𝗱 𝗖𝗵𝗮𝗹𝗹𝗲𝗻𝗴𝗲: The hardest part isn't the code—it's the "mental pointer". In an array, each element has an index. In a list, you only have a reference. One wrong pointer assignment today taught me exactly how nodes get lost: if you lose the reference to a node before relinking it, it's gone (in Java it becomes garbage rather than a true memory leak, but the data is just as lost).

𝗠𝗶𝗹𝗲𝘀𝘁𝗼𝗻𝗲 𝗦𝘁𝗮𝘁𝘂𝘀: ✅ Month 1: Patterns (Sliding Window, Two-Pointer, Prefix Sum).

#Java #BackendEngineering #60DaysOfCode #DSA #SystemDesign #SoftwareEngineer #LinkedList #Programming
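The O(1) head insertion described above is a one-liner once the node structure exists. A minimal sketch (all names are mine, not from the post):

```java
class IntList {
    static final class Node {
        final int value;
        Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    private Node head; // lose this reference before relinking and the chain is orphaned
    private int size;

    // O(1) regardless of list length: no shifting, no capacity copy.
    void addFirst(int value) {
        head = new Node(value, head); // new node points at the old head
        size++;
    }

    int first() { return head.value; }
    int size() { return size; }

    public static void main(String[] args) {
        IntList list = new IntList();
        list.addFirst(10);
        list.addFirst(20); // list is now 20 -> 10
        System.out.println(list.first()); // prints 20
    }
}
```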
🚀 Day 9 of #100DaysOfCode

Solved Search in Rotated Sorted Array II on LeetCode using Modified Binary Search (Handling Duplicates) 🔎

🧠 Key insight: With duplicates present, it’s sometimes impossible to identify the sorted half directly. If nums[left] == nums[mid], we safely move left++ to shrink the search space.

⚙️ Approach:
🔹Apply binary search
🔹Handle duplicates by skipping equal boundary values
🔹Identify the sorted half and check if the target lies within it
🔹Continue the search accordingly

⏱️ Time Complexity: O(log n) average, O(n) worst case (due to duplicates)
📦 Space Complexity: O(1)

#100DaysOfCode #LeetCode #DSA #BinarySearch #Java #ProblemSolving #LearningInPublic #CodingJourney
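The approach above, sketched in Java (a standard formulation of this binary search; class name is mine):

```java
class RotatedSearch {
    // LeetCode 81: search in a rotated sorted array that may contain duplicates.
    static boolean search(int[] nums, int target) {
        int left = 0, right = nums.length - 1;
        while (left <= right) {
            int mid = left + (right - left) / 2;
            if (nums[mid] == target) return true;
            if (nums[left] == nums[mid]) {
                left++; // can't tell which half is sorted; shrink safely
            } else if (nums[left] < nums[mid]) { // left half is sorted
                if (nums[left] <= target && target < nums[mid]) right = mid - 1;
                else left = mid + 1;
            } else { // right half is sorted
                if (nums[mid] < target && target <= nums[right]) left = mid + 1;
                else right = mid - 1;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(search(new int[]{2, 5, 6, 0, 0, 1, 2}, 0)); // prints true
        System.out.println(search(new int[]{2, 5, 6, 0, 0, 1, 2}, 3)); // prints false
    }
}
```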
⚡ 𝗗𝗮𝘆 𝟴𝟳 𝗼𝗳 𝗠𝘆 𝟭𝟬𝟬 𝗗𝗮𝘆𝘀 𝗼𝗳 𝗗𝗦𝗔 𝗖𝗵𝗮𝗹𝗹𝗲𝗻𝗴𝗲!

𝘛𝘰𝘥𝘢𝘺’𝘴 𝘱𝘳𝘰𝘣𝘭𝘦𝘮 𝘸𝘢𝘴 𝘢 𝘨𝘳𝘦𝘢𝘵 𝘳𝘦𝘮𝘪𝘯𝘥𝘦𝘳 𝘵𝘩𝘢𝘵 𝘤𝘰𝘯𝘴𝘵𝘳𝘢𝘪𝘯𝘵𝘴 𝘰𝘧𝘵𝘦𝘯 𝘥𝘦𝘧𝘪𝘯𝘦 𝘵𝘩𝘦 𝘳𝘦𝘢𝘭 𝘤𝘩𝘢𝘭𝘭𝘦𝘯𝘨𝘦. 𝙏𝙝𝙞𝙣𝙠𝙞𝙣𝙜 𝙗𝙚𝙮𝙤𝙣𝙙 𝙗𝙪𝙞𝙡𝙩-𝙞𝙣𝙨!

📌 Problem Solved:
1️⃣ 𝗟𝗲𝗲𝘁𝗰𝗼𝗱𝗲 𝟯𝟲𝟳: 𝗩𝗮𝗹𝗶𝗱 𝗣𝗲𝗿𝗳𝗲𝗰𝘁 𝗦𝗾𝘂𝗮𝗿𝗲 (𝗘𝗮𝘀𝘆) ➡️ Without using built-in functions like sqrt().

✨ Key Learnings:
🔹 Since using 𝘀𝗾𝗿𝘁() 𝘄𝗮𝘀 𝗻𝗼𝘁 𝗮𝗹𝗹𝗼𝘄𝗲𝗱, the solution required logical thinking instead of shortcuts.
🔹 A clean approach is 𝗕𝗶𝗻𝗮𝗿𝘆 𝗦𝗲𝗮𝗿𝗰𝗵 between 1 and num, checking whether mid * mid == num.
🔹 This reduces the 𝘁𝗶𝗺𝗲 𝗰𝗼𝗺𝗽𝗹𝗲𝘅𝗶𝘁𝘆 to ✅ 𝗢(𝗹𝗼𝗴 𝗻) 𝗶𝗻𝘀𝘁𝗲𝗮𝗱 𝗼𝗳 𝗢(𝗻)
🔹 It is important to 𝗵𝗮𝗻𝗱𝗹𝗲 𝗶𝗻𝘁𝗲𝗴𝗲𝗿 𝗼𝘃𝗲𝗿𝗳𝗹𝗼𝘄 while calculating mid * mid (use long if needed).

🧠 Big Takeaway: When shortcuts are removed, 𝗮𝗹𝗴𝗼𝗿𝗶𝘁𝗵𝗺𝗶𝗰 𝘁𝗵𝗶𝗻𝗸𝗶𝗻𝗴 takes over. 𝗕𝗶𝗻𝗮𝗿𝘆 𝗦𝗲𝗮𝗿𝗰𝗵 isn’t just for arrays — it applies anywhere there’s a 𝘀𝗲𝗮𝗿𝗰𝗵 𝘀𝗽𝗮𝗰𝗲.

Day 87 completed — 𝘁𝗵𝗶𝗻𝗸𝗶𝗻𝗴 𝗯𝗲𝘆𝗼𝗻𝗱 𝗯𝘂𝗶𝗹𝘁-𝗶𝗻𝘀! 💪🔥

#100DaysOfCode #DSA #Java #LeetCode #BinarySearch #ProblemSolving #InterviewPreparation #LearningInPublic #Developers #MathLogic #NumberTheory
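A compact Java version of the binary search described above, including the long arithmetic that guards mid * mid against overflow (class name is mine):

```java
class PerfectSquare {
    // LeetCode 367: binary search on the candidate root; long avoids mid*mid overflow.
    static boolean isPerfectSquare(int num) {
        long lo = 1, hi = num;
        while (lo <= hi) {
            long mid = lo + (hi - lo) / 2;
            long sq = mid * mid;
            if (sq == num) return true;
            if (sq < num) lo = mid + 1;
            else hi = mid - 1;
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(isPerfectSquare(16));         // prints true
        System.out.println(isPerfectSquare(14));         // prints false
        System.out.println(isPerfectSquare(2147395600)); // prints true (46340^2)
    }
}
```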
🚀 #100DaysOfCode – Day 12

Solved Search in Rotated Sorted Array II on LeetCode 🔄🔍

🧠 Key insight: When duplicates are present, it’s not always possible to directly identify the sorted half. If nums[left] == nums[mid], we can safely move left++ to reduce the search space and continue.

⚙️ Approach:
🔹Use modified binary search
🔹Handle duplicates by skipping equal boundary elements
🔹Identify the sorted half
🔹Check if the target lies in that range and adjust pointers accordingly

⏱️ Time Complexity:
Average: O(log n)
Worst case (many duplicates): O(n)

📦 Space Complexity: O(1)

#100DaysOfCode #LeetCode #DSA #BinarySearch #Java #ProblemSolving #LearningInPublic #CodingJourney
Day 10/100

Q 12 – Remove Duplicates from Sorted Array (DSA)

Today’s problem was all about in-place array manipulation and the two-pointer technique.

📌 Problem Statement
Given a sorted integer array, remove duplicates in-place so that each unique element appears only once, and return the count of unique elements.

🧠 Key Insight
Since the array is already sorted, duplicates appear next to each other. Using two pointers, we can overwrite duplicates and keep all unique elements at the beginning of the array without using extra space.

⚙️ Approach
- Use one pointer to track the last unique element
- Traverse the array with another pointer
- Update the array when a new unique element is found

⏱ Complexity
Time: O(n)
Space: O(1)

💡 What I Learned
- How sortedness simplifies duplicate problems
- Practical use of the two-pointer technique
- Writing space-optimized solutions

🔁 On to the next challenge! Consistency > Motivation 💪

#100DaysOfDSA #DSA #Arrays #TwoPointers #Java #ProblemSolving #LeetCode #CodingJourney #DailyLearning
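The approach above can be sketched in a few lines (my own illustration; k marks the end of the unique prefix while i scans ahead):

```java
class RemoveDuplicates {
    // LeetCode 26: in-place dedup of a sorted array; returns the unique count.
    static int removeDuplicates(int[] nums) {
        if (nums.length == 0) return 0;
        int k = 1; // nums[0..k-1] holds the unique prefix
        for (int i = 1; i < nums.length; i++) {
            if (nums[i] != nums[k - 1]) {
                nums[k++] = nums[i]; // overwrite the next duplicate slot
            }
        }
        return k;
    }

    public static void main(String[] args) {
        int[] nums = {0, 0, 1, 1, 1, 2, 2, 3, 3, 4};
        System.out.println(removeDuplicates(nums)); // prints 5; nums now starts 0,1,2,3,4
    }
}
```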
🚀 Day 67 of #100DaysOfCode

Today’s problem focused on 2D prefix-sum optimization — Range Sum Query 2D – Immutable (NumMatrix) 📊

At first glance, this problem looks like repeated matrix traversal, but the real win comes from preprocessing smartly.

📌 Problem Summary
You’re given a 2D matrix and multiple queries asking for the sum of elements inside a sub-rectangle. Brute force would be too slow for repeated queries.

🧠 My Approach: Prefix Sum (Row-wise)
- Precompute prefix sums for each row
- For every query, calculate the sum in O(rows) time using prefix subtraction
- Avoid recalculating sums for every query

This transforms repeated heavy computation into a fast query process ⚡

⚙️ Time & Space Complexity
Preprocessing: O(rows × cols)
Each Query: O(rows)
Space: O(rows × cols)

🔥 Key Learning
Prefix sums are not limited to 1D arrays — they scale beautifully to 2D problems and are extremely powerful for range queries.

Another solid step forward in mastering optimization techniques 💪 On to Day 68 🚀

#Day67 #100DaysOfCode #Java #LeetCode #PrefixSum #DSA #ProblemSolving #CodingJourney
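The row-wise prefix idea above can be sketched like this (a minimal version of the NumMatrix class; field and variable names are mine):

```java
class NumMatrix {
    private final int[][] rowPrefix; // rowPrefix[r][c] = sum of row r, columns [0, c)

    NumMatrix(int[][] matrix) {
        int rows = matrix.length, cols = matrix[0].length;
        rowPrefix = new int[rows][cols + 1];
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++)
                rowPrefix[r][c + 1] = rowPrefix[r][c] + matrix[r][c];
    }

    // O(rows) per query: one subtraction per row of the rectangle.
    int sumRegion(int row1, int col1, int row2, int col2) {
        int sum = 0;
        for (int r = row1; r <= row2; r++)
            sum += rowPrefix[r][col2 + 1] - rowPrefix[r][col1];
        return sum;
    }

    public static void main(String[] args) {
        NumMatrix nm = new NumMatrix(new int[][]{
            {3, 0, 1, 4, 2}, {5, 6, 3, 2, 1}, {1, 2, 0, 1, 5},
            {4, 1, 0, 1, 7}, {1, 0, 3, 0, 5}});
        System.out.println(nm.sumRegion(2, 1, 4, 3)); // prints 8 (LeetCode example)
    }
}
```

(The full 2D prefix sum, as in LeetCode 1292, would bring each query down to O(1); the row-wise variant trades a little query speed for a simpler precompute.)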
🔥 Day 83 of #100DaysOfCode

Today’s problem: LeetCode — Minimum Size Subarray Sum 🎯

📌 Problem Summary
Given an integer target and an array nums, return the minimum length of a contiguous subarray whose sum is greater than or equal to target. If no such subarray exists → return 0.

Example:
target = 7, nums = [2,3,1,2,4,3]
Output → 2 (because [4,3] sums to 7)

🧠 Approach: Sliding Window (Two Pointers)
This is a classic variable-size sliding window problem.

⚙️ Strategy:
- Use two pointers: l (left) and r (right)
- Expand the window by moving r, adding each element to total
- When total >= target: update the result, then shrink the window from the left, subtracting from total

🔁 Core Logic:
for each r:
    total += nums[r]
    while total >= target:
        update min length
        total -= nums[l]
        l++

⏱ Time Complexity: O(n)
💾 Space Complexity: O(1)

💡 What I Learned
- Sliding window is extremely powerful for subarray problems.
- The key is knowing when to expand and when to shrink.
- Many medium/hard problems are just variations of this pattern.

Today’s runtime: ⚡ 1ms (99% faster)

Consistency builds mastery. On to Day 84 🚀

#100DaysOfCode #LeetCode #SlidingWindow #Java #DSA #InterviewPrep #CodingJourney
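The pseudocode above translates directly to Java (class name is mine):

```java
class MinSubArray {
    // LeetCode 209: shortest contiguous subarray with sum >= target.
    static int minSubArrayLen(int target, int[] nums) {
        int l = 0, total = 0, best = Integer.MAX_VALUE;
        for (int r = 0; r < nums.length; r++) {
            total += nums[r];                 // expand the window
            while (total >= target) {         // window is valid: try to shrink
                best = Math.min(best, r - l + 1);
                total -= nums[l++];           // drop the leftmost element
            }
        }
        return best == Integer.MAX_VALUE ? 0 : best;
    }

    public static void main(String[] args) {
        System.out.println(minSubArrayLen(7, new int[]{2, 3, 1, 2, 4, 3})); // prints 2
    }
}
```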