Today I took a deeper dive into Node.js internals, specifically exploring how the require() function actually works behind the scenes.

Earlier, I used require() simply to import modules. But today, I went beyond surface-level usage and understood how Node.js executes code internally when we load a module.

What I explored:

🔹 How require() loads a module, step by step
1. Resolves the file path
2. Checks the module cache
3. Wraps the code inside a function
4. Executes the file in its own module scope
5. Returns module.exports

🔹 The Module Wrapper Function
Node.js does not execute your file directly. It wraps it internally like this:

```javascript
(function (exports, require, module, __filename, __dirname) {
  // Your actual code here
});
```

This explains:
• Why variables are not global by default
• How module.exports works
• Why __dirname and __filename are available

🔹 Module Caching
Once a module is loaded, Node.js caches it. If you require() the same file again, it does not execute again — it returns the cached version. This improves performance and prevents duplicate execution.

🔹 Execution Flow
When Node starts:
• It creates a main module
• It executes top-level code
• Each require() creates a new module context
• Modules maintain their own scope

💡 Key realization: require() is not just an import statement — it is a complete module-loading system involving resolution, wrapping, execution, and caching.

Understanding this gave me clarity on:
• Scope isolation in Node.js
• Performance behavior
• Circular dependencies
• How large backend applications are structured

This deep dive helped me see Node.js not just as a runtime, but as a carefully engineered system designed for modular and scalable applications.

Node_Git_Repo: https://lnkd.in/gz5523ZV

#NodeJS #JavaScript #Require #ModuleSystem #BackendDevelopment #LearningJourney
Bharat Mani Kumar Thota’s Post
More Relevant Posts
🚀 New Blog Published: Node.js Internals Explained

While learning backend development through the cohort by Hitesh Choudhary, I started exploring how Node.js actually works internally. A detailed explanation by Piyush Garg about the Node.js internals really helped me understand how things work behind the scenes, so I decided to document my learning in a blog.

In this article I explain:
• How the V8 engine executes JavaScript
• How libuv enables asynchronous operations
• How the Node.js event loop works
• Node.js internal architecture, with diagrams

🔗 Read the full blog here: https://lnkd.in/dhiWGXTC

Feedback from the community would be really valuable 🙌

#nodejs #javascript #backenddevelopment #webdevelopment #developers
If you're still treating npm as the only game in town, you're missing the modern upgrade the JavaScript ecosystem has been waiting for. Here's why I'm all-in on this approach, especially when setting up CI/CD pipelines.

The Parallels (They're More Similar Than You Think)
1. Both are open-source package registries for sharing JavaScript and TypeScript code.
2. npm's enormous success (2M+ packages, billions of downloads) is exactly what JSR builds on — it's not a rival, it's the next evolution.
3. You can mix packages from both in the same project.
4. Both support package.json, lockfiles, and the entire Node.js/Bun ecosystem.

Think of it like TypeScript vs JavaScript: one is the superset that makes the other better.

So… Which One Is Better?

1. Core design
npm: JavaScript-first; supports both CommonJS (CJS) and ESM
JSR: TypeScript-native and ESM-only (no CJS support)

2. Publishing
npm: You publish compiled JavaScript plus manually written .d.ts type declarations
JSR: You publish raw TypeScript — it automatically generates types, documentation, and transpilation for multiple runtimes

3. Runtime support
npm: Primarily Node.js (works elsewhere with extra config)
JSR: Native support for Node.js, Deno, Bun, Cloudflare Workers, and browsers — zero extra setup

4. Installation
npm: Standard npm install / pnpm add / yarn add
JSR: npx jsr add @scope/package, or modern package managers with the jsr: specifier / npm registry mirror

5. CI/CD friendliness
npm: Pure npm commands; lockfiles work as-is
JSR: Full npm compatibility via the @jsr scope + the https://npm.jsr.io registry — works seamlessly with plain npm ci

6. Developer experience & security
npm: Manual everything (docs, types, publishing steps)
JSR: Auto-generated API docs, immutable packages, OIDC authentication, and a global edge CDN for fast installs

In my case, I have created libraries exclusively for npm or JSR, but the main thing about JSR is that I publish once to JSR and it works everywhere.
In CI/CD you just add one line to .npmrc (@jsr:registry=https://npm.jsr.io) or run npx jsr add once, and you're done. No extra steps, no lockfile drama; plain npm ci works perfectly.

You don't have to choose sides. JSR is effectively npm's superset — the best of both worlds. If you're starting a new library, API, or even a side project this year, publish to JSR. Your users on npm will thank you (and your CI will be cleaner).
🚀 Building scalable REST APIs? NestJS is worth your attention.

NestJS is a progressive Node.js framework that brings structure, scalability, and TypeScript-first development to your backend — inspired by Angular's architecture.

Here's how clean a NestJS controller looks:

```typescript
// users.controller.ts
@Controller('users')
export class UsersController {
  constructor(private readonly usersService: UsersService) {}

  @Get()
  findAll() {
    return this.usersService.findAll();
  }

  @Post()
  create(@Body() createUserDto: CreateUserDto) {
    return this.usersService.create(createUserDto);
  }
}
```

Why NestJS stands out for API development:
✅ Modular architecture — controllers, providers, modules
✅ Built-in guards, pipes, interceptors & middleware
✅ First-class TypeScript support
✅ Works with TypeORM, Prisma, and Mongoose out of the box
✅ GraphQL, WebSockets & microservices support built in
✅ OpenAPI/Swagger integration with zero hassle

Official docs to bookmark:
📖 Getting Started → https://lnkd.in/gKB5w_Em
📖 Controllers → https://lnkd.in/gzuGx3_N
📖 Authentication → https://lnkd.in/gWSRusng
📖 Database → https://lnkd.in/gSgyh9Hy

If you're building serious backend APIs with Node.js, NestJS gives you the structure that Express never could.

What backend framework are you currently using? 👇

#NestJS #NodeJS #TypeScript #BackendDevelopment #API #WebDevelopment
Nodejs Development Company

About Node.js®
As an asynchronous, event-driven JavaScript runtime, Node.js is designed to build scalable network applications. In the "hello world" example (linked below), many connections can be handled concurrently. Upon each connection, the callback is fired; if there is no work to be done, Node.js will sleep.

https://lnkd.in/gMQ_NE6J
🚀 Node.js is more than just JavaScript.

Most developers use Node.js daily, but very few understand what happens when a request hits the server. How can a single-threaded runtime handle thousands of requests?

The answer lies in:
⚡ libuv
⚡ The Event Loop
⚡ Express middleware pipelines

In my latest deep dive, I explain:
✅ How libuv gives JavaScript async superpowers
✅ The 5 phases of the Event Loop
✅ Why setImmediate matters for I/O tasks
✅ How the Express middleware pipeline actually works

Once you understand this, you stop guessing how your backend works and start designing systems better.

Thanks to Hitesh Choudhary Sir, Piyush Garg, Jay Kadlag

📖 Read the full article: https://lnkd.in/dngtRqk6

#NodeJS #BackendDevelopment #JavaScript #ExpressJS #WebDevelopment
This article explores the complexities of extracting text from PDFs, particularly for JavaScript developers. I found it interesting that choosing the right library can significantly impact the efficiency of your solution. What strategies have you used to tackle similar challenges in your projects?
Understanding Node.js goes well beyond knowing how to write JavaScript on the server. The real value comes from understanding why it works the way it does.

The third article in a five-part series on RESTful APIs covers the internals that make Node.js a strong choice for building APIs and server-side applications:

• The event loop architecture, and how it enables asynchronous, non-blocking operations on a single thread
• The progression from callbacks to promises to async/await, and how each layer of abstraction connects back to the same underlying queue system
• npm and package-management best practices
• Buffers and streams for efficient data processing
• The Node.js I/O API that gives JavaScript access to operating-system resources that browsers deliberately restrict

For developers building with Node.js or evaluating it for a project, this is the kind of depth that separates confident usage from guesswork.

Read the full article: https://lnkd.in/dEs4UFRx

#WebDevelopment #NodeJS #JavaScript
JavaScript ecosystem in 2026 — but from an Angular + Nx perspective.

A lot of conversations are centered around Vite, Turborepo, Bun, etc. But in large-scale Angular monorepos the reality looks slightly different. Here's how I see it:

📦 Package management - pnpm is the most practical choice for monorepos. Strict dependency resolution and disk efficiency really matter at scale.

⚡ Bundling & build tooling - in Angular projects, the CLI and Nx abstractions define the build pipeline. We're seeing more esbuild/SWC integrations under the hood; performance is clearly the direction.

⚙️ TypeScript first - in Angular ecosystems, TypeScript is not optional; it's foundational. Fast compilers (SWC, esbuild) are becoming build accelerators, but type safety remains the core.

🏃‍♂️ Runtime - Node.js is still the default in enterprise environments. Stability > experimentation.

✨ Linting & formatting - Rust-based tools (Oxlint, Biome) are extremely promising performance-wise, especially in large Nx workspaces. Faster linting directly impacts CI time.

🧪 Testing - unit and integration testing stay close to Angular tooling. For E2E, Playwright is becoming the serious alternative to Cypress in enterprise setups.

📂 Monorepo orchestration - this is where Nx shines. Dependency graphs, task orchestration, caching, affected builds: these are not "nice to have" features in enterprise projects. They're mandatory.

🎨 Styling - Tailwind continues to dominate, even inside Angular projects. Utility-first styling scales surprisingly well when paired with a strong design system.

→ My principle in 2026: I don't optimize for hype. I optimize for:
- predictable builds
- fast CI
- strict architecture boundaries
- developer productivity at scale

In Angular monorepos, Nx remains one of the strongest strategic choices.
JavaScript tools ecosystem in 2026 + my recommendations:

📦 Package management — npm remains the default, with pnpm gaining significant traction for its efficient disk usage and strict dependency resolution. Yarn (especially Yarn Berry with PnP) is still used but has lost momentum relative to pnpm.
→ I recommend pnpm. Yarn v6 is being rewritten in Rust and seems faster than pnpm, so keep a lookout!

⚡ Bundlers — Vite has become the de facto standard for most new projects, using esbuild for dev and Rollup under the hood for production builds (Rolldown in future for everything). webpack is still widespread in legacy codebases. Turbopack (from Vercel) has replaced webpack in Next.js apps.
→ I recommend Vite if you're not using Next.js. VoidZero (the company behind Vite) has built other tools, and their ecosystem has great synergy.

⚙️ Transpilers / compilers — SWC (Rust-based) has largely replaced Babel in modern toolchains. TypeScript's own tsc handles type-checking, while SWC and oxc handle the actual transpilation for speed. Babel still exists but is increasingly a legacy choice.
→ I recommend TypeScript, which is being rewritten in Go and is blazing fast.

🏃‍♂️ Runtimes — Node.js is the incumbent. Deno (from Node's original creator) offers a more secure, standards-aligned alternative. Bun is a newer all-in-one runtime that also bundles, transpiles, and manages packages, positioning itself as a Node replacement.
→ I personally use Node.js, but the others are great as well.

✨ Linting & formatting — ESLint remains standard for linting, though Biome (Rust-based) is gaining ground as a unified linter + formatter. Prettier is still the dominant formatter, but Biome is a credible alternative. Oxlint and Oxfmt by VoidZero are getting ready for prime time.
→ I'm moving all my projects to Oxlint and Oxfmt. Rust-based tooling is much faster, and they integrate better with VoidZero's tools.

🧪 Testing — Vitest has overtaken Jest for new projects due to its Vite integration and speed. Playwright has become the go-to for E2E testing, largely displacing Cypress. Testing Library remains the standard for component-level tests.
→ I recommend Vitest and Playwright.

📂 Monorepo tools — Turborepo and Nx are the main options for managing monorepos at scale, handling task orchestration, caching, and dependency graphs.
→ I primarily use Turborepo; however, Vite might also start doing monorepo task orchestration, so keep a lookout.

🎨 CSS tooling — Tailwind CSS dominates utility-first styling. CSS Modules, styled-components, and CSS-in-JS solutions (like Panda CSS and vanilla-extract) fill other niches. PostCSS remains a common processing layer, and Lightning CSS (Rust-based) is an emerging alternative.
→ I use Tailwind. Tailwind v4 uses Lightning CSS under the hood.

What do you use?

———
♻ Repost to help others discover
📕 Save the post so you don't miss it
💡 Follow me Yangshun Tay and my company GreatFrontEnd for more
When building APIs with Node.js and Express.js, the way you call APIs can significantly impact performance. A common mistake many developers make is calling APIs one by one (sequentially) when the requests are independent.

1️⃣ Sequential API calls (one by one)

In this approach, each API waits for the previous one to complete:

```javascript
const user = await getUser();
const orders = await getOrders();
const payments = await getPayments();
```

Here, every request blocks the next one. If each API takes 500ms, the total time becomes 500ms + 500ms + 500ms = 1500ms. This increases response time and slows down your backend.

2️⃣ Parallel API calls (all at once)

If the APIs are independent, you can run them in parallel using Promise.all():

```javascript
const [user, orders, payments] = await Promise.all([
  getUser(),
  getOrders(),
  getPayments(),
]);
```

Now all requests run simultaneously, so the total time becomes roughly ~500ms instead of 1500ms.

Why this matters: optimizing API calls can dramatically improve
• Backend performance
• API response time
• User experience
• Server efficiency

Simple rule: use sequential calls only when one API depends on another. Otherwise, use parallel execution with Promise.all().

Small backend optimizations like this can make a huge difference at scale.

#NodeJS #ExpressJS #BackendDevelopment #JavaScript #API #WebDevelopment #SoftwareEngineering
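The timing arithmetic above can be checked with stand-in APIs. Since getUser/getOrders/getPayments are hypothetical, this sketch fakes each one with a 50ms timer; the parallel run should finish in roughly a third of the sequential time:

```javascript
// fakeApi stands in for an independent network call (hypothetical name).
const fakeApi = (name, ms = 50) =>
  new Promise((resolve) => setTimeout(() => resolve(name), ms));

let timings; // filled in when compare() finishes

async function compare() {
  // Sequential: each await blocks the next call, so times add up (~150ms).
  let t = Date.now();
  await fakeApi('user');
  await fakeApi('orders');
  await fakeApi('payments');
  const sequentialMs = Date.now() - t;

  // Parallel: all three timers run together, so the total is the slowest one (~50ms).
  t = Date.now();
  const [user, orders, payments] = await Promise.all([
    fakeApi('user'),
    fakeApi('orders'),
    fakeApi('payments'),
  ]);
  const parallelMs = Date.now() - t;

  console.log({ sequentialMs, parallelMs, user, orders, payments });
  return { sequentialMs, parallelMs };
}

compare().then((r) => (timings = r));
```

One caveat worth knowing: Promise.all rejects as soon as any input rejects; use Promise.allSettled when you want every result regardless of individual failures.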