🚀 Understanding Recursive Traversal: Why Is It Synchronous and Blocking? 🤔

When we talk about recursive traversal, whether navigating trees, graphs, or other data structures, it's worth understanding why the process is inherently synchronous and blocking. 🔍

Here's the gist: recursive calls happen on a single call stack, and each call waits for its deeper calls to return before continuing. This step-by-step process guarantees order, but it also means the thread is blocked until every sub-call completes. Recursion is a control-flow mechanism, not a concurrency technique.

React's reconciliation algorithm in versions before Fiber worked exactly this way: a recursive, call-stack-based, depth-first traversal of the virtual DOM tree. The entire render and commit process blocked the browser's main thread until it finished, causing UI jank on large trees. There was no built-in mechanism to pause or yield the traversal, making it blocking and synchronous by design.

Have you experienced blocking recursive code before? Share your stories below! 💬

#ReactJS #JavaScript #FrontendDevelopment #SoftwareEngineering #RecursiveAlgorithms #CallStack #ReactFiber #AsyncProgramming #WebPerformance #DeveloperExperience #CodeOptimization #TechInsights #Programming #OpenSource
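As a quick illustration (a minimal sketch, not React's actual source), here is a synchronous recursive depth-first traversal in JavaScript. Nothing else can run on the thread until the outermost call returns:

```javascript
// Minimal illustration: a synchronous, call-stack-based DFS.
// Each recursive call must finish its entire subtree before returning,
// so the calling thread is blocked for the whole walk.
function traverse(node, visit) {
  visit(node);                      // process the current node
  for (const child of node.children) {
    traverse(child, visit);         // waits for the whole subtree
  }
}

// Build a small tree: root -> [a -> [c], b]
const tree = {
  value: 'root',
  children: [
    { value: 'a', children: [{ value: 'c', children: [] }] },
    { value: 'b', children: [] },
  ],
};

const visited = [];
traverse(tree, (n) => visited.push(n.value));
console.log(visited); // depth-first order: ['root', 'a', 'c', 'b']
```

This is exactly the shape React's pre-Fiber reconciler had; Fiber replaced the implicit call stack with its own linked structure so traversal could be paused and resumed.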
More Relevant Posts
📌 #61 DailyLeetCodeDose

Today's problem: 146. LRU Cache – 🟡 Medium

This task looks simple until you remember the O(1) requirement. A map alone is not enough; we also need to track recency.

The key idea is combining a hash map with a doubly linked list: the map gives instant access, the list keeps items ordered by usage. Every get or put moves the node to the front, making it the most recently used. When capacity is exceeded, the last node is removed: that's the true LRU.

Clean design, strict O(1), and a great reminder that the right data structure is often the whole solution.

https://lnkd.in/epVtDYKH

#DailyLeetCodeDose #LeetCode #JavaScript #Algorithms #ProblemSolving #Coding
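The classic solution pairs a hash map with an explicit doubly linked list. As a sketch of the same O(1) idea, JavaScript's Map iterates in insertion order, so the linked list can stay implicit (this is an alternative encoding, not the linked-solution's code):

```javascript
// LRU cache sketch backed by a single Map.
// Map keys iterate in insertion order, so the first key is always the
// least recently used; deleting and re-inserting a key moves it to the back.
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return -1;
    const value = this.map.get(key);
    this.map.delete(key);        // move key to most-recently-used position
    this.map.set(key, value);
    return value;
  }
  put(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // evict the least recently used entry: the first key in iteration order
      this.map.delete(this.map.keys().next().value);
    }
  }
}

const cache = new LRUCache(2);
cache.put(1, 1);
cache.put(2, 2);
cache.get(1);              // touches 1, so 2 becomes least recently used
cache.put(3, 3);           // capacity exceeded: evicts key 2
console.log(cache.get(2)); // -1
console.log(cache.get(1)); // 1
```

In an interview you would usually still be expected to build the doubly linked list by hand, since that is what proves the O(1) reasoning.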
🚀 𝗡𝗲𝘄 𝗣𝗥 𝗠𝗲𝗿𝗴𝗲𝗱

Just tackled a fun logical challenge: finding the intersection of two arrays. The goal was to identify elements present in both input arrays.

I approached this in JavaScript. My strategy was to iterate through the first array and check for the existence of each element in the second. To optimize that lookup, I used a Set, which provides average O(1) time complexity for membership checks.

🐞 𝗗𝗲𝗯𝘂𝗴𝗴𝗶𝗻𝗴 𝗣𝗿𝗼𝗰𝗲𝘀𝘀: I found dry runs and visualizing the data flow particularly helpful. Stepping through the code with a debugger let me pinpoint exactly where my logic diverged from the expected output.

📚 𝗞𝗲𝘆 𝗟𝗲𝗮𝗿𝗻𝗶𝗻𝗴: using a Set for lookups gives a significant performance improvement over nested loops or Array.prototype.includes inside a loop.

Check out the implementation and contribute to the discussion here: https://lnkd.in/dvQbUFGK

How do you typically ⚙️ 𝗔𝗽𝗽𝗿𝗼𝗮𝗰𝗵 array intersection problems?

📦 Repo: https://lnkd.in/dvQbUFGK

#Algorithm #JavaScript #ProblemSolving #DataStructures #Set #CodingChallenge #Developer #Tech #InterviewQuestion #LogicalThinking
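A minimal sketch of the approach described (the actual implementation lives in the linked repo; this is just an illustration of the Set-based lookup):

```javascript
// Array intersection via Set: building `inB` once makes each membership
// check average O(1), so the whole pass is O(n + m) instead of the
// O(n * m) of nested loops or includes-in-a-loop.
function intersection(a, b) {
  const inB = new Set(b);
  // A second Set deduplicates repeated matches from `a`.
  return [...new Set(a.filter((x) => inB.has(x)))];
}

console.log(intersection([1, 2, 2, 3], [2, 3, 4])); // [2, 3]
```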
𝐈𝐟 𝐲𝐨𝐮𝐫 𝐓𝐲𝐩𝐞𝐒𝐜𝐫𝐢𝐩𝐭 𝐠𝐞𝐧𝐞𝐫𝐢𝐜𝐬 𝐟𝐞𝐞𝐥 𝐥𝐢𝐤𝐞 𝐠𝐥𝐨𝐫𝐢𝐟𝐢𝐞𝐝 𝐚𝐧𝐲𝐬, 𝐲𝐨𝐮'𝐫𝐞 𝐦𝐢𝐬𝐬𝐢𝐧𝐠 𝐚 𝐭𝐫𝐢𝐜𝐤.

Writing reusable functions with generics is powerful, but leaving your type parameters too broad can defeat the purpose of TypeScript. How often do you find yourself writing obj[key as any] or obj[key as keyof T] to appease the compiler, feeling like you've lost type safety?

The fix is often simple: constrain your generic type parameter to enforce type safety at compile time.

Instead of:

```typescript
function getPropertyBad<T>(obj: T, key: string) {
  return obj[key as keyof T]; // 'as keyof T' is an assertion, not a guarantee
}
```

Do this:

```typescript
function getPropertyGood<T, K extends keyof T>(obj: T, key: K): T[K] {
  return obj[key]; // Type-safe! K is guaranteed to be a key of T
}

interface Product {
  id: string;
  name: string;
  price: number;
}

const product: Product = { id: 'p1', name: 'Widget', price: 29.99 };

const productName = getPropertyGood(product, 'name');   // productName is string
const productPrice = getPropertyGood(product, 'price'); // productPrice is number

// getPropertyGood(product, 'description');
// Argument of type '"description"' is not assignable to
// parameter of type '"id" | "name" | "price"'.
// The compiler catches typos or non-existent keys immediately!
```

This pattern is a game-changer for building robust utility functions, custom React hooks, or any helper that needs to access object properties dynamically without sacrificing type safety. You get auto-completion and compile-time error checking, making your code much more maintainable and refactor-friendly.

Are there specific generic constraints you find yourself using repeatedly in your projects?

#TypeScript #FrontendDevelopment #React #WebDevelopment #SoftwareEngineering
𝐀𝐫𝐞 𝐲𝐨𝐮 𝐬𝐭𝐢𝐥𝐥 𝐥𝐞𝐭𝐭𝐢𝐧𝐠 𝐓𝐲𝐩𝐞𝐒𝐜𝐫𝐢𝐩𝐭 𝐝𝐞𝐟𝐚𝐮𝐥𝐭 𝐭𝐨 `any` 𝐰𝐡𝐞𝐧 𝐚𝐜𝐜𝐞𝐬𝐬𝐢𝐧𝐠 𝐨𝐛𝐣𝐞𝐜𝐭 𝐩𝐫𝐨𝐩𝐞𝐫𝐭𝐢𝐞𝐬 𝐝𝐲𝐧𝐚𝐦𝐢𝐜𝐚𝐥𝐥𝐲? 𝐓𝐡𝐞𝐫𝐞'𝐬 𝐚 𝐛𝐞𝐭𝐭𝐞𝐫 𝐰𝐚𝐲 𝐭𝐨 𝐬𝐭𝐚𝐲 𝐭𝐲𝐩𝐞-𝐬𝐚𝐟𝐞.

One common challenge in TS is creating generic functions that can access properties of an object without sacrificing compile-time type safety. Many resort to `any` or complex overloads, losing the benefits of TypeScript.

The trick is combining generics, `keyof`, and `extends` to tell the compiler exactly what to expect. Here's a simple pattern for a type-safe `getProperty` function:

```typescript
function getProperty<T, K extends keyof T>(obj: T, key: K): T[K] {
  return obj[key];
}

interface User {
  id: number;
  name: string;
  email: string;
}

const user: User = { id: 1, name: 'Alice', email: 'alice@example.com' };

const userName = getProperty(user, 'name'); // Type of userName is 'string' - inferred correctly!
const userId = getProperty(user, 'id');     // Type of userId is 'number' - perfect!

// getProperty(user, 'address');
// Compiler error! 'address' is not assignable to type 'keyof User'.
// This is exactly what we want.
```

This pattern ensures that `key` is always a valid property of `T`, and the return type is correctly inferred as `T[K]`. No more runtime surprises or `any` casts! It's clean, reusable, and powerfully type-safe.

What's your go-to TypeScript trick for maintaining type safety with dynamic data? Share in the comments!

#TypeScript #FrontendDevelopment #SoftwareEngineering #React #WebDev
What if JSON wasn't just a data format, but a programming language?

That question led me to build lambda-json: a Turing-complete, homoiconic language that lives entirely within valid JSON syntax.

Every day I see JSON flowing between services: config files, API payloads, state objects. But it's always treated as inert data. What if the data could compute itself?

So I built a Lisp-inspired interpreter in JavaScript where a single JSON object can contain both your code and your data, and return a computed result. Lambdas, conditionals, higher-order functions, recursion: all expressed as valid JSON.

It's been one of those projects that fundamentally changed how I think about the boundary between code and data, and how that boundary shows up in frontend architecture every day.

Link to the repo in the comments.

What's the side project that changed how you think about your day job? I'd love to hear.

#OpenSource #JavaScript #ProgrammingLanguages #SoftwareEngineering #FunctionalProgramming
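To show the flavor of the idea, here is a toy evaluator for a made-up Lisp-style JSON encoding. This is purely illustrative: the forms (`"lambda"`, `"call"`, `"+"`) and conventions below are invented for this sketch and are not lambda-json's actual syntax:

```javascript
// Toy evaluator for a hypothetical Lisp-in-JSON encoding:
//   ["lambda", [params], body]  -> a closure
//   ["call", fn, ...args]       -> function application
//   ["+", a, b]                 -> addition
//   strings are variable references, numbers are literals.
function evaluate(expr, env = {}) {
  if (typeof expr === 'number') return expr;      // literal
  if (typeof expr === 'string') return env[expr]; // variable lookup
  const [op, ...rest] = expr;
  if (op === 'lambda') {
    const [params, body] = rest;
    return (...args) => {
      const scope = { ...env };
      params.forEach((p, i) => (scope[p] = args[i]));
      return evaluate(body, scope);
    };
  }
  if (op === 'call') {
    const [fn, ...args] = rest;
    return evaluate(fn, env)(...args.map((a) => evaluate(a, env)));
  }
  if (op === '+') return evaluate(rest[0], env) + evaluate(rest[1], env);
  throw new Error(`unknown form: ${op}`);
}

// One JSON value that is simultaneously code and data:
const program = ['call', ['lambda', ['x', 'y'], ['+', 'x', 'y']], 2, 3];
console.log(evaluate(program)); // 5
```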
✨ Partitioning in Distributed Systems

As systems scale, handling massive amounts of data and traffic becomes a real challenge, and that's where partitioning comes into play.

In today's post, I've explained partitioning in distributed systems in a simple and structured way, helping you understand how large systems split data across multiple machines to improve scalability, performance, and reliability. If you're curious about how big tech systems handle millions of users without crashing, this concept is a must-know. 👇

Have you explored distributed system concepts before, or is this your first step into system design?

Follow Muhammad Nouman for more useful content

#learningoftheday #1000daysofcodingchallenge #FrontendDevelopment #WebDevelopment #JavaScript #React #Next #CodingCommunity #DistributedSystems #SystemDesign #Scalability #BackendDevelopment #TechLearning
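The core mechanism can be sketched in a few lines: hash each record's key and use the hash to pick a partition. The hash function and key names below are illustrative, not taken from any particular system:

```javascript
// Hash partitioning sketch: route each key to one of N partitions.
function hashCode(key) {
  let h = 0;
  for (const ch of key) {
    h = (h * 31 + ch.charCodeAt(0)) | 0; // simple 32-bit rolling hash
  }
  return Math.abs(h);
}

function partitionFor(key, numPartitions) {
  return hashCode(key) % numPartitions;
}

// The same key always lands on the same partition, while different
// keys spread across the cluster.
console.log(partitionFor('user:42', 4) === partitionFor('user:42', 4)); // true
```

Worth noting: plain modulo hashing reshuffles most keys whenever the partition count changes, which is why production systems typically reach for consistent hashing or a fixed set of virtual partitions instead.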
Debugging at small scale is annoying; debugging at scale is expensive.

When a frontend codebase grows, abstraction doesn't just add structure: it adds distance between cause and effect.

- A state update triggers an effect.
- That effect updates derived state.
- That derived state triggers another effect.
- A memoized function masks the real dependency.

Now a simple bug requires tracing a chain of indirection.

- The issue isn't React.
- The issue isn't hooks.
- The issue is layered runtime abstraction.

At scale, debugging complexity grows faster than feature complexity. You don't just ask "Why is this value wrong?" You ask:

- "Which lifecycle triggered this?"
- "Which dependency changed?"
- "Why did this re-render twice?"
- "Which abstraction is hiding the real data flow?"

Every layer of indirection adds mental hops, and mental hops slow teams down.

This is where architectural philosophy matters. In compile-first systems like Svelte, the dependency graph is explicit:

- State is declared with $state().
- Derived relationships use $derived().
- Side effects are isolated with $effect().

The compiler understands the flow. That reduces hidden coupling and makes debugging closer to tracing plain JavaScript rather than tracing a rendering engine. When systems scale, clarity becomes more valuable than flexibility.

Tomorrow, we move into something foundational: reactivity, explained without magic words. No mysticism. No framework folklore. Just a clear look at how data actually flows.

#Svelte #Svelte5 #FrontendDevelopment #SoftwareArchitecture #ReactJS #WebEngineering #Maintainability #CompiledSpeed #SvelteWithSriman
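The "explicit dependency graph" idea can be sketched in plain JavaScript. This is an illustration of the concept only, not how Svelte's compiler actually implements runes:

```javascript
// Minimal explicit reactivity: state knows its subscribers, and a derived
// value re-runs only when a declared dependency changes. The dependency
// graph is visible in the code, not hidden in a rendering engine.
function state(initial) {
  let value = initial;
  const subs = new Set();
  return {
    get: () => value,
    set(next) {
      value = next;
      subs.forEach((fn) => fn()); // notify every declared dependent
    },
    subscribe: (fn) => subs.add(fn),
  };
}

function derived(deps, compute) {
  let value = compute();
  const recompute = () => (value = compute());
  deps.forEach((d) => d.subscribe(recompute)); // dependencies listed explicitly
  return { get: () => value };
}

const count = state(1);
const doubled = derived([count], () => count.get() * 2);
count.set(5);
console.log(doubled.get()); // 10
```

When a value is wrong here, the trace is short: follow the `deps` array back to the source. That is the debugging property the post is arguing for.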
TypeScript just rewrote its compiler in Go. A 40-second build now takes 4 seconds.

In March 2025, Anders Hejlsberg announced the TypeScript team is porting tsc to native Go. The project, tsgo, delivers up to 10× faster type-checking on real codebases.

Most developers don't realize what happens when they run tsc. The compiler re-parses the entire project in a single JavaScript thread. No parallelism. No native speed. Just V8 doing its best with a million-line compiler codebase.

```shell
# Current tsc: single-threaded, JIT-compiled
$ time tsc --noEmit
# 650-file monorepo: 39.6 seconds
# Cannot parallelize type resolution
```

The Go port changes the equation:

```shell
# tsgo: multi-threaded, natively compiled
$ time tsgo --noEmit
# Same monorepo: 17.5s cold, 1.3s warm
# Parallel parsing across all CPU cores
```

Why Go instead of Rust? TypeScript's compiler relies on deeply shared mutable data structures with circular references. Rust's ownership model would have forced a full architecture redesign. Go's garbage collector handles this naturally: maximum speed with minimum risk.

A developer benchmarking tsgo on a 650-file SvelteKit monorepo measured:

- Cold check: 17.5s vs 39.6s (2.3× faster)
- Warm incremental: 1.3s vs 39.4s (30× faster)
- Iterative rebuild: 2.5s vs 39.8s (16× faster)

When this doesn't apply:

- Projects under 50 files won't feel a meaningful difference
- Language Server integration follows a separate migration timeline
- Pipelines bottlenecked by I/O or bundling won't see the full 10× gain

Microsoft has tested tsgo internally, and the preview is already available. Early ecosystem tools like svelte-check-rs are building directly on the native compiler. This isn't an experiment: the TypeScript compiler will ship as a native Go binary.

What's your biggest tsc pain point: type-checking speed, editor lag, or CI build times?

#TypeScript #WebDev #DeveloperTools #Performance
👉 Binary Tree Maximum Path Sum Using DFS 🔥

The challenge is to find the maximum sum of any path in a binary tree, where a path can start and end at any node (not necessarily passing through the root).

✔ Traverse the tree using depth-first search (DFS)
✔ Ignore negative paths using Math.max(0, childSum)
✔ Maintain a global maximum path sum
✔ Return only one side (left or right) to the parent to maintain a valid path

#DataStructures #Algorithms #BinaryTree #JavaScript #TypeScript #DSA #CodingPractice #ProblemSolving #SoftwareEngineering #FrontendDeveloper #LearningInPublic #100DaysOfCode
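Those four steps translate almost line for line into code. A sketch (assuming a node shape of { val, left, right }):

```javascript
// Binary Tree Maximum Path Sum via DFS.
// dfs() returns the best *downward* path from a node (one side only),
// while `best` tracks the best *full* path seen anywhere in the tree.
function maxPathSum(root) {
  let best = -Infinity;

  function dfs(node) {
    if (!node) return 0;
    // Negative subtree sums would only hurt the path: clamp them to 0.
    const left = Math.max(0, dfs(node.left));
    const right = Math.max(0, dfs(node.right));
    // A path may bend through this node, using both sides...
    best = Math.max(best, node.val + left + right);
    // ...but the parent can extend only one side of it.
    return node.val + Math.max(left, right);
  }

  dfs(root);
  return best;
}

//      -10
//      /  \
//     9   20
//         / \
//        15  7
const tree = {
  val: -10,
  left: { val: 9, left: null, right: null },
  right: {
    val: 20,
    left: { val: 15, left: null, right: null },
    right: { val: 7, left: null, right: null },
  },
};
console.log(maxPathSum(tree)); // 42 (the path 15 -> 20 -> 7)
```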
🚀 Built a Full-Stack Pipeline Builder with Real-Time Graph Analysis!

I'm excited to share my latest project: a Modular Pipeline Builder that allows users to construct complex logic workflows through an intuitive drag-and-drop interface. As a CSE student, I wanted to dive deep into graph theory and scalable frontend architecture. This project was a great challenge in balancing UI/UX with rigorous backend validation.

Key Technical Highlights:

- Modular Architecture: Built a reusable BaseNode React component, allowing for easy expansion of node types (logic gates, timers, databases, and more).
- Dynamic Variable Handling: Implemented real-time regex parsing in the Text Node to dynamically generate input handles for {{variable}} syntax.
- Graph Validation: Developed a FastAPI backend that uses Kahn's Algorithm (BFS) to perform Directed Acyclic Graph (DAG) checks, preventing infinite loops in the workflow.
- State Management: Managed complex node/edge interactions using Zustand for a clean, decoupled state.

Check out the demo below to see the DAG validation and modular node system in action!

#WebDevelopment #ReactJS #FastAPI #GraphTheory #SoftwareEngineering #Zustand #FullStack #Python #ReactFlow
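The DAG check at the heart of that validation can be sketched like this. The project's backend is FastAPI (Python); the JavaScript version below just illustrates Kahn's algorithm, with an assumed edge shape of { source, target }:

```javascript
// Kahn's algorithm (BFS topological sort) used as a DAG check:
// if every node can be dequeued, the graph has no cycle.
function isDag(nodeIds, edges) {
  const indegree = new Map(nodeIds.map((id) => [id, 0]));
  const adjacent = new Map(nodeIds.map((id) => [id, []]));
  for (const { source, target } of edges) {
    adjacent.get(source).push(target);
    indegree.set(target, indegree.get(target) + 1);
  }

  // Start from nodes with no incoming edges.
  const queue = nodeIds.filter((id) => indegree.get(id) === 0);
  let processed = 0;
  while (queue.length > 0) {
    const id = queue.shift();
    processed++;
    for (const next of adjacent.get(id)) {
      indegree.set(next, indegree.get(next) - 1);
      if (indegree.get(next) === 0) queue.push(next);
    }
  }
  // A cycle leaves some nodes with positive in-degree forever.
  return processed === nodeIds.length;
}

console.log(isDag(['a', 'b', 'c'],
  [{ source: 'a', target: 'b' }, { source: 'b', target: 'c' }])); // true
console.log(isDag(['a', 'b'],
  [{ source: 'a', target: 'b' }, { source: 'b', target: 'a' }])); // false
```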