Turning a confusing problem into an optimized solution 🚀

Today, I worked on a string problem: Longest Repeating Character Replacement

Example: "AABABBA", k = 1 → Output = 4 (AAAA or BBBB)

Instead of directly jumping to the optimized solution, I explored multiple approaches:

🔹 Brute Force Approach
- Try all possible substrings
- Count the frequency of characters
- Check validity with: length - maxFreq <= k
- Time Complexity: O(n³)

🔹 Optimized Approach (Sliding Window)
- Use two pointers (left & right)
- Expand the window by adding characters
- Shrink the window when it becomes invalid
- Reuse previous computations instead of restarting
- Time Complexity: O(n)

Here’s my implementation in JavaScript:

function longestSubstringSame(s, k) {
  let map = {};
  let left = 0;
  let maxFreq = 0;
  let maxLen = 0;

  for (let right = 0; right < s.length; right++) {
    map[s[right]] = (map[s[right]] || 0) + 1;
    maxFreq = Math.max(maxFreq, map[s[right]]);

    // Window is invalid: more than k characters would need replacing
    if ((right - left + 1) - maxFreq > k) {
      map[s[left]]--;
      left++;
    }

    maxLen = Math.max(maxLen, right - left + 1);
  }

  return maxLen;
}

console.log(longestSubstringSame("AABABBA", 1)); // 4

💡 Key Takeaways:
- Don’t recompute everything from scratch
- Sliding Window helps optimize repeated work
- Understanding the logic behind conditions is crucial

Currently improving my skills in Data Structures & Algorithms and building a strong problem-solving mindset. Open to feedback and suggestions!

#DSA #JavaScript #ProblemSolving #SoftwareDevelopment #LearningInPublic #AccioJob
More Relevant Posts
Recently, I was building an image extraction tool and ran into a challenge that many of us face with modern websites.

The Problem
Today’s websites rely heavily on JavaScript, so a lot of content loads dynamically. Because of that, the usual scraping methods (simple HTTP requests + HTML parsing) often miss the actual data.

What I Did
To handle this, I started using Selenium to simulate a real browser. This way, the page loads just like it would for a user, and I could access the actual content.

But that was only part of the solution. Once I had the data, there was a lot of noise: icons, placeholders, UI elements, things I didn’t really need. So I improved the filtering logic and focused on specific URL patterns to extract only useful, high-quality images.

The Result
• Cleaner and more relevant image data
• Better handling of dynamic content
• A more reliable extraction process

Would love to hear from you: How do you handle scraping from dynamic websites or dealing with protected media?

#WebScraping #Automation #Python #DataEngineering
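A minimal sketch of this pattern, written with Selenium's Node bindings (selenium-webdriver) to match the JavaScript used elsewhere on this page rather than the author's actual Python code; the regex filter is a hypothetical stand-in for the URL-pattern logic described above:

// A rough sketch, assuming Chrome and the selenium-webdriver package are installed.
const { Builder, By, until } = require("selenium-webdriver");

async function extractImageUrls(pageUrl) {
  const driver = await new Builder().forBrowser("chrome").build();
  try {
    await driver.get(pageUrl);
    // Wait for dynamically loaded images to appear in the DOM
    await driver.wait(until.elementsLocated(By.css("img")), 10000);
    const images = await driver.findElements(By.css("img"));
    const urls = [];
    for (const img of images) {
      const src = await img.getAttribute("src");
      // Hypothetical filter: keep real photo files, skip icons/placeholders
      if (src && /\.(jpe?g|png|webp)(\?|$)/i.test(src) && !/icon|sprite|placeholder/i.test(src)) {
        urls.push(src);
      }
    }
    return urls;
  } finally {
    await driver.quit();
  }
}

extractImageUrls("https://example.com").then((urls) => console.log(urls));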
Every developer who has built a web scraper knows this pain: Your scraper works perfectly. Then the website moves one <div>. And everything breaks. Welcome to the endless loop of fixing selectors.

A new Python tool called Scrapling is getting a lot of attention for exactly this reason. Instead of relying only on CSS selectors, it “remembers” the element. Scrapling stores a fingerprint of the element: its tag, attributes, neighbors, and structure. So when a website layout changes, it can relocate the element automatically. Meaning your scraper doesn’t instantly explode every time a site sneezes.

It also packs some surprisingly powerful features:
– Stealth fetchers to avoid bot detection
– Built-in proxy rotation
– Async spiders for large crawls
– Browser fetchers when JavaScript rendering is needed

Basically: BeautifulSoup simplicity, Scrapy-style crawling, Playwright-level dynamic fetching, all in one library. That’s why developers are suddenly paying attention.

Because most scraping projects don’t fail from scale… They fail from maintenance. The real cost of scraping isn’t writing the scraper. It’s fixing it every time the page changes. Tools like this shift scraping from “babysitting fragile scripts” to “running resilient data pipelines.”

Curious how many data teams are still maintaining broken scrapers every week. How do you handle scraper maintenance today?

#Python #WebScraping #DataEngineering #AI #OpenSource
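To make the fingerprint idea concrete, here is a toy illustration in plain browser JavaScript. This is my own hypothetical sketch of the concept, not Scrapling's actual API: record a few structural facts about an element, then score candidates against them later.

// Hypothetical sketch: capture a simple structural fingerprint of a DOM element
function fingerprint(el) {
  return {
    tag: el.tagName,
    classes: [...el.classList],
    parentTag: el.parentElement ? el.parentElement.tagName : null,
  };
}

// After a layout change, find the candidate that best matches the fingerprint
function relocate(fp, root = document) {
  let best = null;
  let bestScore = -1;
  for (const el of root.querySelectorAll(fp.tag)) {
    let score = 0;
    if (el.parentElement && el.parentElement.tagName === fp.parentTag) score++;
    score += fp.classes.filter((c) => el.classList.contains(c)).length;
    if (score > bestScore) {
      bestScore = score;
      best = el;
    }
  }
  return best;
}

Real tools use far richer fingerprints (attributes, siblings, text), but the principle is the same: match on structure, not on a single brittle selector.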
Turning a simple problem into an optimized solution 🚀

Today, I worked on a string problem: Count the frequency of each character in a string.

Example: "geeksforgeeks" → [[g,2], [e,4], [k,2], [s,2]]

Instead of directly jumping to code, I explored multiple approaches:

🔹 Brute Force Approach
- Count each character separately
- Time Complexity: O(n²)

🔹 Optimized Approach (HashMap)
- Traverse the string once
- Store frequency in a map
- Time Complexity: O(n)

Here’s my implementation in JavaScript:

function countDuplicates(str) {
  const freq = {};
  for (let char of str) {
    freq[char] = (freq[char] || 0) + 1;
  }
  return Object.entries(freq)
    .map(([key, value]) => key + value)
    .join("");
}

console.log(countDuplicates("geeksforgeeks")); // "g2e4k2s2f1o1r1"

💡 Key Takeaways:
- Choosing the right data structure improves efficiency
- Understanding time complexity is crucial in problem solving
- Always try to optimize after getting a working solution

Currently improving my skills in Data Structures & Algorithms and building a strong problem-solving mindset. Open to feedback and suggestions!

#DSA #JavaScript #ProblemSolving #SoftwareDevelopment #LearningInPublic #Acciojob
Ever changed a variable in JavaScript only to realize you accidentally broke the original data too? 🤦♂️ That’s the classic Shallow vs. Deep Copy trap.

Here is the "too long; didn't read" version:

1. Shallow Copy (The Surface Level)
When you use the spread operator [...arr] or {...obj}, you’re only copying the top layer.
The catch: If there are objects or arrays inside that object, they are still linked to the original.
Use it for: Simple, flat data.

2. Deep Copy (The Full Clone)
This creates a 100% independent copy of everything, no matter how deep the nesting goes.
The easy way: const copy = structuredClone(original);
The old way: JSON.parse(JSON.stringify(obj)); (Works, but it’s buggy with dates and functions.)

The Rule of Thumb: If your object has "layers" (objects inside objects), go with a Deep Copy. If it’s just a basic list or object, a Shallow Copy is faster and cleaner.

Keep your data immutable and your hair un-pulled. ✌️

#Javascript #WebDev #Coding #ProgrammingTips
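A quick sketch of the trap in action (assuming a modern runtime where structuredClone is available, i.e. Node 17+ or a current browser):

const original = { name: "Ada", skills: { js: true } };

// Shallow copy: the top level is new, but nested objects are still shared
const shallow = { ...original };
shallow.skills.js = false;
console.log(original.skills.js); // false: mutating the copy touched the original

// Deep copy: fully independent, so the source object stays intact
const fresh = { name: "Ada", skills: { js: true } };
const deep = structuredClone(fresh);
deep.skills.js = false;
console.log(fresh.skills.js); // true: the original is untouched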
I recently implemented my first Linked List from scratch in JavaScript — no libraries, just raw pointer manipulation. This wasn’t just about writing code; it forced me to understand how data structures actually work under the hood.

Here’s what I built:
- A custom Node structure with value and next pointers
- Core operations: addAtHead, addAtTail, addAtIndex, deleteAtIndex, get

What made this interesting wasn’t the syntax — it was the edge cases and pointer management:
- Handling insertions at head vs tail
- Maintaining consistent length
- Avoiding broken references during insertion/deletion
- Fixing off-by-one errors in traversal

Biggest takeaway: Linked Lists are simple in theory, but brutally unforgiving if your pointer logic is even slightly off.

This exercise sharpened how I think about:
- Data structure integrity
- Boundary conditions
- Writing predictable, bug-resistant logic

Still refining and testing edge cases — but this was a solid step forward.

#JavaScript #DataStructures #LinkedList #Coding #SoftwareEngineering #LearningInPublic
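For anyone curious what that structure looks like, here is a minimal sketch of the operations listed above (my own illustration, not the author's exact code):

class Node {
  constructor(value) {
    this.value = value;
    this.next = null;
  }
}

class MyLinkedList {
  constructor() {
    this.head = null;
    this.length = 0;
  }

  // Return the value at index, or -1 if out of bounds
  get(index) {
    if (index < 0 || index >= this.length) return -1;
    let curr = this.head;
    for (let i = 0; i < index; i++) curr = curr.next;
    return curr.value;
  }

  addAtHead(value) {
    const node = new Node(value);
    node.next = this.head;
    this.head = node;
    this.length++;
  }

  addAtTail(value) {
    this.addAtIndex(this.length, value);
  }

  // Insert before the node at index; index === length appends at the tail
  addAtIndex(index, value) {
    if (index < 0 || index > this.length) return;
    if (index === 0) return this.addAtHead(value);
    let prev = this.head;
    for (let i = 0; i < index - 1; i++) prev = prev.next;
    const node = new Node(value);
    node.next = prev.next;
    prev.next = node;
    this.length++;
  }

  deleteAtIndex(index) {
    if (index < 0 || index >= this.length) return;
    if (index === 0) {
      this.head = this.head.next;
    } else {
      let prev = this.head;
      for (let i = 0; i < index - 1; i++) prev = prev.next;
      prev.next = prev.next.next; // unlink the node at index
    }
    this.length--;
  }
}

Note how both insertion paths funnel through addAtHead/addAtIndex: keeping one code path per pointer update is what makes the head vs tail edge cases manageable.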
Problem: Design HashMap

My Approach (Simple Implementation 🛠️):
This problem focuses on designing a HashMap from scratch without using built-in map structures. I implemented it using a JavaScript object {}.

👉 put(key, value) → Insert or update a key-value pair
👉 get(key) → Return the value if the key exists, else -1
👉 remove(key) → Delete the key if present

Core Logic:
- Store data in an object
- Use direct key access for O(1) operations
- Use delete to remove entries

Complexity:
- put: O(1)
- get: O(1)
- remove: O(1)

What can be improved? 👀
👉 This relies on built-in hashing
👉 A deeper implementation would involve:
- Buckets (array of lists)
- Handling collisions
- Custom hash functions

Learning takeaway: Understanding how maps work internally is crucial.

#LeetCode #DSA #HashMap #DataStructures #Algorithms #JavaScript #ProblemSolving #CodingJourney #LearnInPublic
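A minimal sketch of the object-backed approach described above; the post doesn't include the actual code, so this is one possible shape of it:

class MyHashMap {
  constructor() {
    this.store = {}; // plain object as the backing store
  }

  // Insert or update a key-value pair
  put(key, value) {
    this.store[key] = value;
  }

  // Return the value if the key exists, else -1
  get(key) {
    return key in this.store ? this.store[key] : -1;
  }

  // Delete the key if present
  remove(key) {
    delete this.store[key];
  }
}

const map = new MyHashMap();
map.put(1, 10);
console.log(map.get(1)); // 10
map.remove(1);
console.log(map.get(1)); // -1

One caveat: object keys are coerced to strings, which is fine for the integer keys in this problem but is exactly where a bucket-based implementation with a custom hash function would differ.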
Here are 37 JavaScript topics organized by section:

Execution, Scope & Memory
1. Execution context & call stack
2. var, let, const (scope + hoisting + TDZ)
3. Lexical scope & scope chain
4. Closures (behavior, not definition)
5. Shadowing & illegal shadowing
6. Garbage collection basics & memory leaks

Functions & this
7. Function declarations vs expressions
8. this binding rules (default, implicit, explicit, new)
9. call, apply, bind
10. Arrow functions vs normal functions
11. Currying & partial application
12. Higher-order functions

Async JavaScript
13. Event loop (call stack, microtasks, task queue)
14. Promises & chaining
15. async / await (error handling & sequencing)
16. Race conditions & stale closures
17. Timers (setTimeout, setInterval) vs microtasks
18. Promise utilities (all, allSettled, race, any)

Data, References & ES6+
19. == vs ===, truthy / falsy & type coercion deep dive
20. Object & array reference behavior
21. Deep vs shallow copy
22. Destructuring, rest & spread
23. Map, Set, WeakMap, WeakSet

Prototypes & OOP
24. Prototype chain & Object.create()
25. class syntax vs prototype under the hood
26. Inheritance patterns

Error Handling
27. try/catch with async/await edge cases
28. Custom error types
29. Unhandled promise rejections

Modules
30. ES Modules (import/export) vs CommonJS (require)
31. Tree shaking concept
32. Dynamic imports — import()

Iterators & Generators
33. Symbol.iterator & iterable protocol
34. Generator functions (function*)
35. Connecting generators to RxJS mental model

Browser & Runtime Fundamentals
36. Event bubbling, capturing, delegation, preventDefault vs stopPropagation
37. DOM vs Virtual DOM, Reflow vs repaint, Web storage, Polyfills

#angular #javascript #html #css #webdeveloper #angularDeveloper
JSON.parse and JSON.stringify
Pure functions improve testability and composability.
#javascript #json #serialization #data

──────────────────────────────

Key Rules
• Avoid mutating shared objects inside utility functions.
• Write small, focused functions with clear input-output behavior.
• Use const by default and let when reassignment is needed.

💡 Try This
const nums = [1, 2, 3, 4];
const evens = nums.filter((n) => n % 2 === 0);
console.log(evens); // [2, 4]

❓ Quick Quiz
Q: What is the practical difference between let and const?
A: Both are block-scoped; const prevents reassignment of the binding.

🔑 Key Takeaway
Modern JavaScript is clearer and safer with immutable-first patterns.

──────────────────────────────

Small JavaScript bugs keep escaping to production and breaking critical user flows. Debugging inconsistent runtime behavior steals time from feature delivery.
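To ground the "avoid mutating shared objects" rule, here is a small illustrative sketch (my own example, not from the original post):

// Impure: mutates the array that was passed in, so callers can be surprised
function addItemImpure(list, item) {
  list.push(item);
  return list;
}

// Pure: returns a new array and leaves the input untouched
function addItemPure(list, item) {
  return [...list, item];
}

const base = [1, 2];
const next = addItemPure(base, 3);
console.log(base); // [1, 2] (unchanged)
console.log(next); // [1, 2, 3]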