The Most Underused Node.js Feature That Fixes Slow APIs: Worker Threads

Most Node.js performance issues don't come from networking. They come from CPU-heavy tasks choking the event loop.

If your API is "randomly slow", freezes under load, or your /health endpoint looks fine while users scream, you're probably blocking the event loop without realizing it.

And the funny thing? Node.js already shipped the solution years ago, but very few developers use it: Worker Threads. A simple way to move CPU-bound work off the main thread so your API stays fast and responsive.

Why Worker Threads Matter
Hashing? Move it to a worker.
Image/PDF processing? Worker.
Large JSON parsing? Worker.
ML inference? Worker.
Heavy loops or calculations? Worker.

Your API should never freeze because of a CPU task. Worker Threads make sure it doesn't.

Why Most People Ignore Them
Because "Node.js is single-threaded" is the lie we all grew up with. The truth? Node is single-threaded for JavaScript but multi-threaded under the hood, and Worker Threads let you tap into that power safely.

My Go-To Pattern
Use the main thread only for:
• I/O
• Routing
• Lightweight logic

Push all heavy lifting to:
• Worker pools
• Dedicated worker scripts
• Background processes

When Should You Use Worker Threads?
Use them when your bottleneck is:
• CPU
• Parsing
• Encryption
• Data crunching
• Anything with a long synchronous execution time

Don't use them for:
• Standard DB/API calls
• Basic controller logic
• Pure I/O

The biggest benefit? Instead of scaling out to more servers early ($$$), you squeeze maximum performance out of one.

Have you used Worker Threads in production yet? If yes, what kind of tasks did you offload? If not, what's stopping you from trying them?

#NodeJS #JavaScript #WebDevelopment #Backend #PerformanceOptimization #FullStackDeveloper #SoftwareEngineering #TechInsights #Developers #NodejsUAE
Ayman uddin’s Post
🚨 Most Node.js apps are async, yet they still block the event loop.

Everyone knows async/await. Very few understand what actually blocks Node.js in production:

🧠 CPU-bound work does not care about async.

🔹 Common mistakes I still see in real systems:
• Password hashing done inside API handlers
• JSON.stringify on very large payloads
• PDF or image processing during requests
• Heavy data aggregation written in JavaScript instead of the database

⚠️ Even with async/await, Node executes this work on the main thread. When that happens, the event loop is blocked and latency increases.

❌ Things many developers miss:
• Async does not mean non-blocking
• Promises do not move work off the main thread
• Node is single-threaded by design

✅ The correct way to think about it:
• Async is enough for I/O-bound work
• CPU-bound work must run in worker_threads, background jobs, or queues

🛠 How I handle this in production:
• APIs stay thin and only orchestrate
• Heavy logic runs in workers or scheduled jobs
• The event loop stays free and latency stays predictable

🔥 If your Node API slows down under load, this is often the reason. Not MongoDB. Not Nginx. Not traffic.

Most people learn Node.js. Very few learn how the Node.js runtime actually works.

🤝 Let's connect if this resonates.

#NodeJS #BackendEngineering #EventLoop #SystemDesign #ScalableSystems #PerformanceEngineering #JavaScript #APIDevelopment #WebPerformance #EngineeringMindset
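The core claim above is easy to verify: marking a function `async` does not move its CPU work off the main thread. In this sketch, the loop runs to completion before a timer queued earlier ever gets a chance to fire.

```javascript
// `async` does not make CPU work non-blocking: the loop below runs
// synchronously on the main thread, so the timer callback queued
// *before* it can only fire *after* the loop finishes.
const order = [];

setTimeout(() => order.push("timer"), 0);

async function heavy() {
  order.push("loop start");
  let sum = 0;
  for (let i = 0; i < 5_000_000; i++) sum += i; // CPU-bound work
  order.push("loop done");
  return sum;
}

heavy(); // despite `async`, this body runs to completion right here
// At this point `order` is ["loop start", "loop done"]; "timer" comes later.
```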
Node.js is single-threaded. Yet it handles thousands of concurrent requests. How?

I still see many developers equate async with multithreading. They're not the same thing.

Node.js runs your JavaScript on a single main thread. There is no pool of threads executing your logic in parallel. So what's really happening?

The secret lies in non-blocking I/O and the Event Loop. When Node.js performs I/O (network calls, file reads, DB queries), it doesn't block the main thread to wait for the result. Instead, it delegates those operations to the underlying system (the kernel) and keeps moving. Once the work is done, the system signals Node and the callback is queued. The Event Loop then picks it up, quickly and efficiently, without ever blocking execution.

That's why Node can handle thousands of concurrent connections: not because it does many things at once, but because it never waits.

• Async helps you wait efficiently.
• Multithreading helps you work simultaneously.

Different tools. Different problems.

Understanding this distinction matters. It affects how you design APIs, how you handle CPU-intensive tasks, and how you reason about scalability.

Async ≠ Multithreading.
Mutation in Next.js: The Game Changer for Data Operations!

Today I explored how mutations using Server Actions in Next.js make our apps faster, cleaner, and more scalable. No more juggling between API routes, client-server round trips, and complex state syncing; Server Actions simplify everything at once.

What is a Mutation in Next.js?
Mutation = updating data on the server (create, update, delete). In Next.js, we can do this directly inside Server Actions, without setting up an entire backend endpoint. Just put the "use server" directive at the top of the function, call it from a form's action, and the server handles everything securely.

Why Server Actions are a Power Move
✨ No API routes required: write backend logic directly alongside components
⚡ Reduced network round trips: the server executes immediately, faster updates
🔒 More secure: logic stays on the server, no sensitive code on the client
📦 Less boilerplate: fewer files, cleaner architecture
💾 Built-in revalidation: the UI updates automatically after a mutation
🌍 Perfect for forms and CRUD apps: just call the action and mutate data

Mini Example 👇

"use server"
export async function addTodo(formData) {
  const todo = formData.get("todo");
  await db.todo.create({ text: todo });
}

Call it directly from your form. No API routes. No axios. Just pure DX.

➤ Next.js is not just evolving, it's redefining the full-stack workflow. If you're still shipping API routes for small mutations, you're missing the fun.

(Day 15 of my Next.js learning series)

#Nextjs #Reactjs #JavaScript #FrontendDevelopment #WebDevelopment
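To see the data flow end to end, here is a runnable sketch of the same action. Note that `db` is a purely illustrative in-memory stand-in for whatever data layer the real action would call (e.g. Prisma), and the `"use server"` directive only has meaning inside a Next.js app.

```javascript
"use server"; // in a real Next.js app, this marks the function as a Server Action

// In-memory stand-in for a real database client; illustrative only.
const db = {
  todos: [],
  todo: {
    async create({ text }) {
      const row = { id: db.todos.length + 1, text };
      db.todos.push(row);
      return row;
    },
  },
};

// The mutation: receives the submitted FormData and writes on the server.
async function addTodo(formData) {
  const todo = formData.get("todo");
  await db.todo.create({ text: todo });
}
```

In the component, a form like `<form action={addTodo}>` with an input named "todo" invokes this directly, with no API route in between.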
Node.js doesn't slow down because it's single-threaded... It slows down because people don't understand how it actually runs.

Yes, your JavaScript runs on a single call stack. ➡️ One thing at a time. That part is true. But Node.js was never designed to do everything there.

When your code touches files, networks, or timers, it crosses a boundary: through native bindings, that work is handed off to libuv ⚙️

And libuv is very intentional...

🌐 Network I/O operations?
→ Delegated straight to the OS kernel
→ No threads, no blocking
→ The kernel notifies Node only when data is ready

📂 File system and CPU-heavy operations?
→ Sent to a thread pool
→ Default size: 4 threads

Here's the part most people miss 👇
When those threads finish, callbacks don't run immediately... They get queued for the Event Loop 🔄

The Event Loop runs in phases, across cycles:
→ timers → I/O callbacks → poll → check
One phase at a time, each tick.

So when you trigger thousands of file reads, you're not just competing for threads; you're feeding the Event Loop a growing queue that must be drained gradually. Nothing crashes; it just slows down.

Now add execution priority...
• Synchronous code always runs first
• Microtasks jump the queue (process.nextTick before Promises)
• Only then does the loop move forward

This is why Node.js feels ⚡ blazing fast when used correctly… Once this architecture clicks, you stop writing "working" code and start writing scalable code. You know what blocks the system. You know what scales effortlessly.

If this changed how you think about Node, drop a "⚙️" in the comments.

P.S. Huge thanks to ByteMonk 🙏 These two deep dives genuinely helped refine my mental model of Node.js internals and the Event Loop...
▶️ https://lnkd.in/gehGCT5G
▶️ https://lnkd.in/gpSggK2a
If you care about how Node actually works under the hood, these are must-watches!
Node.js Architecture Explained Simply

Understanding Node.js architecture is crucial for building scalable applications. Node.js is often described as single-threaded, but it's more complex than that.

The Event Loop is the mechanism that allows Node.js to perform non-blocking I/O operations despite being single-threaded. It cycles through specific phases, including the timers phase, pending callbacks, the poll phase, the check phase, and close callbacks. It's essential to understand how Node.js schedules tasks, including the difference between microtasks and macrotasks.

To scale Node.js applications, move CPU-intensive work off the event loop using worker threads, or delegate it to a separate service. The cluster module lets you fork multiple Node processes that share the same server port, and load balancers can distribute traffic across multiple instances of your application.

Common mistakes to avoid in Node.js development include blocking the event loop with heavy computation, using synchronous APIs in production code, and not handling promise rejections.

By understanding the Event Loop and respecting the single thread, you can build fast, efficient, and scalable systems.

Source: https://lnkd.in/ggZbEA88

#NodeJS #EventLoop #AsyncCode #Scalability #JavaScript #WebDevelopment #SoftwareEngineering #PerformanceOptimization #CodingBestPractices
🚀 Is Node.js really single-threaded? Let's clear up the confusion!

One of the most common misconceptions is: "Node.js is single-threaded, so it can't handle heavy workloads." Well… yes and no. Let's break it down in a simple way 👇

🔹 1. JavaScript in Node.js is single-threaded
The part that runs your JS code, the event loop, works on a single thread. So your functions, callbacks, and async/await code all run on one main thread.

🔹 2. But Node.js itself is NOT fully single-threaded
Behind the scenes, Node uses:
• The libuv thread pool (4 threads by default) for tasks like file system operations, crypto, and DNS lookups
• Worker Threads (optional) for CPU-intensive tasks
• Cluster mode to spawn multiple Node processes that use multiple CPU cores

So while your JavaScript runs on one thread, Node.js uses multiple threads under the hood to keep your app fast and non-blocking.

🔹 3. Why this design?
Because:
• Most web workloads are I/O-heavy, not CPU-heavy
• A single-threaded event loop means no context switching
• An asynchronous architecture supports thousands of concurrent connections

This is why Node.js shines in real-time apps, APIs, chat systems, streaming, etc.

🔹 4. When Node.js struggles
Node.js can slow down when you run long, CPU-heavy tasks on the main thread, or otherwise block the event loop. The solution?
👉 Move heavy logic to Worker Threads
👉 Use clustering
👉 Offload CPU tasks to microservices

💡 Conclusion: Node.js executes JavaScript on a single thread, but the Node.js runtime is multi-threaded, thanks to libuv and additional workers. So next time someone says "Node is slow because it's single-threaded," you can confidently tell them: "It's not that simple!" 😄
What Actually Happens When a Request Hits a Node.js Server

Node.js is often described as single-threaded and event-driven. That description is correct, but it hides the most important details that matter in production systems.

When an HTTP request reaches a Node.js server, JavaScript is not the first thing that runs. The operating system receives the packet, processes it through the TCP/IP stack, and determines which process owns the listening port. Once the socket becomes readable, Node.js is notified.

At this point, libuv takes control. libuv is not a helper library: it implements the event loop. It is responsible for I/O polling, timers, and scheduling callbacks. JavaScript does not actively wait for anything. Only after libuv determines that work is ready does execution move to V8, where your Express middleware and route handlers run.

V8's responsibility is limited to executing JavaScript and managing memory and garbage collection. It does not handle networking or I/O. This separation of responsibilities is fundamental to Node.js scalability, and it also explains most production issues.

If JavaScript performs CPU-heavy work inside a request handler, the event loop cannot progress. Other requests do not get scheduled. Throughput drops, latency spikes, and the system appears "slow" even though the infrastructure is healthy. Async code does not make CPU work non-blocking. It only prevents blocking when the work can be delegated to libuv or the operating system.

Understanding this boundary between the OS, libuv, and V8 is the difference between writing Node.js code that works and building Node.js systems that scale. Most Node.js performance problems are not framework issues. They are execution model misunderstandings.

#NodeJS #BackendEngineering #SystemDesign #PerformanceEngineering #Scalability
📄 Day 9: Pagination & filtering in Node + Express

Today I hit a common backend problem. What happens when your API has too much data? Returning everything at once works… until it doesn't 😅 So today was about pagination and filtering.

1️⃣ Pagination: send data in chunks
Instead of returning all records, send a small set.

const page = Number(req.query.page) || 1;
const limit = Number(req.query.limit) || 10;
const skip = (page - 1) * limit;

const users = await User.find().skip(skip).limit(limit);

Cleaner responses. Faster APIs.

2️⃣ Filtering: return only what's needed
Let users filter results using query params.

const { role } = req.query;
const users = await User.find(role ? { role } : {});

Same endpoint. Smarter results.

3️⃣ Why this matters
Pagination and filtering:
• reduce load on the server
• improve frontend performance
• make APIs scalable
Small change. Big impact.

🔑 Key takeaway
Good APIs don't send more than needed. They send exactly what's useful.

Day 10 will be about sorting, searching, and indexing for speed.

#️⃣ #nodejs #expressjs #pagination #backenddevelopment #learninginpublic #javascript #api
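The skip/limit arithmetic is framework-independent, so it can be checked without a database. A plain-array sketch of the same logic (the helper name and response shape are illustrative):

```javascript
// Same pagination math as the Mongoose example, applied to a plain
// array so the skip/limit arithmetic is easy to verify.
function paginate(items, { page = 1, limit = 10 } = {}) {
  const skip = (page - 1) * limit;
  return {
    page,
    limit,
    total: items.length,
    data: items.slice(skip, skip + limit),
  };
}

// 25 fake users: page 2 with limit 10 should hold users 11..20,
// and page 3 only the remaining 5.
const users = Array.from({ length: 25 }, (_, i) => ({ id: i + 1 }));
const page2 = paginate(users, { page: 2, limit: 10 });
```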
⚡️ Advanced JavaScript Feature: Promise.race()

If you're building modern applications that deal with APIs, databases, microservices, or external systems, you've probably hit scenarios where speed matters more than which promise succeeds. That's exactly where Promise.race() becomes a powerful tool.

🔥 What Promise.race() Does
It settles as soon as the first promise in the array settles, whether that promise fulfills or rejects. It's a race: the earliest result wins.

📌 Example

const p1 = fetch("https://slow-api.com/data"); // slow
const p2 = fetch("https://fast-api.com/data"); // fast
const p3 = new Promise((_, reject) =>
  setTimeout(() => reject("Timeout!"), 3000)
);

Promise.race([p1, p2, p3])
  .then(result => console.log("Winner:", result))
  .catch(err => console.error("Failed:", err));

If the timeout promise rejects first, the race ends with a failure. If the fast API returns first, you get the response immediately.

🎯 Why This Matters
• Perfect for implementing timeouts
• Improves responsiveness
• Helps avoid long waits on slow or stuck APIs
• A simple but powerful pattern for real-time apps

#javascript #frontend #webdevelopment #apiintegration #frontenddeveloper
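The timeout use case above can be packaged into a reusable helper. A sketch (the helper name is illustrative; clearing the timer afterwards is the detail people usually forget):

```javascript
// Reject if `promise` does not settle within `ms`, using Promise.race.
// Clears the timer afterwards so it doesn't keep the process alive.
function withTimeout(promise, ms, message = "Timeout!") {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(message)), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Fast work wins the race...
const ok = withTimeout(Promise.resolve("data"), 1000);
// ...while work that never settles loses to the timeout.
const slow = withTimeout(new Promise(() => {}), 10).catch((e) => e.message);
```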
Day 43 of #100DaysOfCode

Angular HTTP Interceptors: The Global Powerhouse for JWT & Errors

If you find yourself manually adding headers to every HttpClient call or writing the same try/catch block for every API request, you need HTTP Interceptors. Think of an interceptor as a "security gate" or a "middleman" that sits between your Angular app and your backend. Every request going out and every response coming in passes through it.

1. Handling JWT Tokens Automatically
Instead of injecting a token in every service, an interceptor can intercept the outgoing request, clone it, and inject the Authorization header globally.

2. Global Error Logging & Handling
Interceptors are also the perfect place to catch HTTP errors (like 401 Unauthorized or 500 Server Error) before they reach your components.

Why is this a "Clean Code" must-have?
• DRY (Don't Repeat Yourself): You write your auth logic once. Every new service you create automatically becomes "secure" without extra code.
• Centralized logic: If your backend changes its token format or error structure, you only change code in one file.
• Separation of concerns: Your data services stay focused on what data to fetch, while the interceptor handles the how (security, logging, retries).

#Angular #TypeScript #WebDevelopment #CleanCode #RxJS #SoftwareArchitecture
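The same idea is framework-agnostic: an interceptor is just a function that wraps the request pipeline. A plain-JavaScript sketch of the pattern (an Angular implementation would use its interceptor API instead; all names and the fake handler here are illustrative):

```javascript
// Framework-agnostic sketch of the interceptor pattern: each
// interceptor receives the request and a `next` function, and may
// modify the request before passing it along the chain.
function authInterceptor(request, next) {
  const token = "jwt-token-from-storage"; // illustrative placeholder
  return next({
    ...request,
    headers: { ...request.headers, Authorization: `Bearer ${token}` },
  });
}

function errorInterceptor(request, next) {
  return next(request).catch((err) => {
    console.error("HTTP error:", err.message); // centralized logging
    throw err; // re-throw so callers can still react
  });
}

// Compose interceptors around a terminal handler (the actual HTTP call).
function buildClient(interceptors, handler) {
  return interceptors.reduceRight(
    (next, interceptor) => (req) => interceptor(req, next),
    handler
  );
}

// A fake terminal handler standing in for fetch/HttpClient.
const send = async (req) => ({ status: 200, echoedHeaders: req.headers });

const client = buildClient([authInterceptor, errorInterceptor], send);
const response = client({ url: "/api/user", headers: {} });
```

Every request made through `client` now carries the Authorization header and flows through the error handler, which is exactly the DRY benefit the post describes.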
Your posts regarding Node.js help me a lot... To be honest, this is a new concept that I came across.