I used to run npm install everywhere without thinking twice… until I started working on real projects 👀

That's when I discovered npm ci — and honestly, it changed how I handle builds. Here's the simple difference 👇

👉 npm install (npm i)
Good for development. It installs dependencies and can even update your lock file. Super flexible, but not always predictable.

👉 npm ci
This one is strict. It installs exactly what's in your package-lock.json, deletes node_modules first, and gives you a clean, consistent setup every time.

And if you're building for production:

npm ci --omit=dev

No dev dependencies, faster installs, smaller builds 🔥

💡 What I follow now:
While coding → npm i
While deploying / Docker / CI → npm ci

Small change, but it saves you from those "it works on my machine" headaches 😅

#NodeJS #DevOps #Docker #JavaScript #WebDevelopment
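For the production case, here is a minimal sketch of how this lands in a Dockerfile (the base image tag and the `server.js` entry point are illustrative, not from the original post):

```dockerfile
# Illustrative production image; adjust the tag and entry point for your app.
FROM node:20-alpine
WORKDIR /app

# Copy only the manifests, then do a strict, lockfile-driven install
# that skips devDependencies.
COPY package.json package-lock.json ./
RUN npm ci --omit=dev

COPY . .
CMD ["node", "server.js"]
```

Because `npm ci` refuses to run without a `package-lock.json`, a build like this fails fast if someone forgets to commit the lock file.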
npm install vs npm ci: Simplifying Builds
More Relevant Posts
We use Git every day across the tech industry, but very few engineers actually understand how it works under the hood.

So I built my own version of Git from scratch in TypeScript and Node.js: git-ts

Instead of wrapping existing libraries, I implemented Git's core mechanics directly:
- Writing and reading objects using the same binary format as native Git
- Hashing file contents with SHA-1 and compressing them with zlib
- Parsing and updating the staging area (the index file)
- Managing branches by manipulating raw references
- Traversing commit history with graph algorithms (BFS) to support operations like log and fast-forward merge

The result: you can create a commit with git-ts, and the native Git CLI will recognize it seamlessly. That constraint (full compatibility with Git's internal data structures) was the most challenging and rewarding part of the project.

This pushed me far outside typical web development:
- Working with binary buffers instead of JSON
- Understanding Git as a content-addressable filesystem
- Implementing low-level file system operations and graph traversal logic

One realization that stuck with me: a Git branch is just a text file pointing to a 40-character hash.

If you want to take a look (or roast it), here's the repo: https://lnkd.in/dH9-UEUF

#git #typescript #nodejs #software #VersionControl #DevTools
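The content-addressable part is easy to verify from any shell, no git-ts required: a blob's ID is just the SHA-1 of a tiny `"blob <size>\0"` header followed by the file bytes. A quick sketch using the well-known blob ID of the 6-byte string "hello\n":

```shell
# A Git blob ID is sha1("blob <size>\0" + contents).
# "hello\n" is 6 bytes, so the header is "blob 6".
printf 'blob 6\0hello\n' | sha1sum
# Git computes the same thing:
#   echo "hello" | git hash-object --stdin
# Both print ce013625030ba8dba906f756967f9e9ca394464a.

# And a branch really is just a 40-character hash in a text file:
#   cat .git/refs/heads/main
```

Compression with zlib only affects how the object is stored on disk; the ID is computed over the uncompressed bytes, which is what makes the trick above work.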
⚡ 𝗻𝗽𝗺 𝗶𝗻𝘀𝘁𝗮𝗹𝗹 𝘃𝘀 𝗻𝗽𝗺 𝗰𝗶 — Quick Dev Tip

Most developers use these daily, but using the right one actually matters 👇

🔧 𝗻𝗽𝗺 𝗶𝗻𝘀𝘁𝗮𝗹𝗹
• Best for development
• Installs from package.json
• Can update dependencies

⚡ 𝗻𝗽𝗺 𝗰𝗶
• Best for CI/CD & production
• Installs exact versions from package-lock.json
• Fast, clean & consistent every time

💡 𝗦𝗶𝗺𝗽𝗹𝗲 𝗿𝘂𝗹𝗲:
👉 Local work → npm install
👉 Builds & deployments → npm ci

Avoid "works on my machine" issues by choosing the right command 🚀

#nodejs #javascript #npm #webdev #devtips
Wait… if you're still using git clone like this:

git clone <repo-url>

then you're probably downloading way more than you actually need 👀

Better approach? ✨

git clone --depth 1 <repo-url>

Honestly, it's one of those small things that just makes sense once you know it.

When you normally clone a repo, Git pulls everything: the entire history, every commit, every change since day one. With --depth 1 (a "shallow clone"), you only get the latest version of the code. No history. No extra baggage. Just what you need.

Which means:
- Way faster cloning
- Less disk space 🧠
- A much cleaner setup when you just want to use the project

If you've never tried this before, go ahead and give it a shot ;)

Follow Sakshi Jaiswal ✨ for more quality content like this.

#Frontend #React #Sakshi_Jaiswal #FullstackDevelopment #javascript #TechTips #Git #Clone #flags #Webdev
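A quick way to see the difference for yourself with a throwaway local repo (paths here are temporary; note that Git ignores `--depth` for plain local paths, so the clone uses a `file://` URL):

```shell
# Build a throwaway repo with two commits, then shallow-clone it.
tmp=$(mktemp -d)
git init -q "$tmp/src"
git -C "$tmp/src" -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "first"
git -C "$tmp/src" -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "second"

# --depth is ignored for plain local paths, so use a file:// URL
git clone -q --depth 1 "file://$tmp/src" "$tmp/shallow"

git -C "$tmp/shallow" rev-list --count HEAD   # prints 1, not 2
```

And if you later decide you want the history after all, Git can backfill it: `git fetch --unshallow`.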
Most React projects don't fail because of React. They fail because the setup around the code is weak.

No tests. No proper CI. No clear commit flow. No real quality checks before shipping.

So I put together a React + TypeScript + Vite starter with the things I think should be there from day one:
- strict TypeScript
- unit and E2E tests
- linting and formatting
- commit conventions and git hooks
- CI with GitHub Actions
- automatic deploy to GitHub Pages

Nothing flashy. Just a cleaner, more professional baseline for building React projects properly from the start.

Repo: https://lnkd.in/e_eweNAe
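For anyone assembling a similar baseline, a minimal sketch of the kind of GitHub Actions workflow such a starter might run on every push (the script names `lint`, `test`, and `build` are assumptions, not the repo's actual config):

```yaml
# .github/workflows/ci.yml (illustrative)
name: CI
on: [push, pull_request]
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: npm
      - run: npm ci          # reproducible install from the lockfile
      - run: npm run lint
      - run: npm test
      - run: npm run build
```

Using `npm ci` here (rather than `npm install`) is what makes the pipeline deterministic: the build fails loudly if the lock file and manifest ever drift apart.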
🚀 npm install vs npm ci

Most developers use `npm install` daily… but in production & CI/CD, `npm ci` is the real hero ⚡

Let's break it down 👇

---

🔹 1. npm install

👉 What it does:
* Installs dependencies from `package.json`
* If `package-lock.json` exists → tries to match it
* But can update versions based on range rules like `^` and `~`

👉 Example: if the version is `"^1.2.0"`, it may install `1.3.0` instead of exactly `1.2.0`.

⚠️ Problem:
* Different developers may get different versions
* "Works on my machine" issues 😅

---

🔹 2. npm ci

👉 What it does:
* Installs EXACT versions from `package-lock.json` 🔒
* Deletes `node_modules` before installing
* Does NOT update anything

👉 Key features:
✅ Faster than npm install
✅ Fully consistent installs
✅ No surprises

---

💡 When to use what?

👉 Use `npm install`
* While developing
* When adding new packages

👉 Use `npm ci`
* In CI/CD pipelines
* For production builds
* When you want exact reproducibility

---

🔥 One-line summary:
npm install → flexible but inconsistent
npm ci → strict, fast & reliable

#NodeJS #Angular #WebDevelopment #JavaScript #DevTips #NPM #SoftwareEngineering
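To make the `^` / `~` point concrete, a sketch of a manifest using both range styles (package names and versions are illustrative):

```json
{
  "dependencies": {
    "example-lib": "^1.2.0",
    "another-lib": "~2.29.0"
  }
}
```

Here `^1.2.0` allows any version `>=1.2.0 <2.0.0`, while `~2.29.0` allows only `>=2.29.0 <2.30.0`. Neither pins an exact version; only `package-lock.json` records what was actually resolved, which is why `npm ci` installs from the lock file and errors out if it is missing or out of sync with `package.json`.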
I wasted hours on slow Docker builds before I understood one thing: layer order is everything.

Here's a visual that finally made it click for me 👇

When you run docker build, Docker executes each Dockerfile instruction and saves the result as an immutable snapshot — a layer. Change something early in the file? Every layer after it rebuilds. No cache. No shortcuts.

This is why the "right" Dockerfile pattern looks like this:
1. FROM base image
2. COPY package.json (just the manifest)
3. RUN npm install ← cached unless deps change
4. COPY . . ← your source code
5. RUN npm run build

If you flip steps 2 and 4, every code change triggers a full npm install. On a large project that's minutes, not seconds.

I built an interactive layer visualizer to make this tangible — link in comments.

What's the Docker mistake you wish someone had shown you earlier?

#Docker #DockerTips #Dockerfile #DevOps #CloudNative #WebDevelopment #SoftwareEngineering #BuildInPublic #LearningInPublic #NodeJS #FrontendDev #ReactJS
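The five steps above, written out as an actual Dockerfile (the base image tag and the `build` script name are illustrative):

```dockerfile
FROM node:20-alpine                        # 1. base image
WORKDIR /app

COPY package.json package-lock.json ./     # 2. just the manifests
RUN npm install                            # 3. cached unless the manifests change

COPY . .                                   # 4. source code: cache is invalidated
                                           #    only from this layer down
RUN npm run build                          # 5. rebuilds on code changes, but the
                                           #    install layer above stays cached
```

Swapping steps 2 and 4 puts `COPY . .` above the install, so any source edit changes that layer's input and forces `npm install` to rerun on every build.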
I used to just npm install everything without thinking twice. It's what every tutorial used. It came with Node. Why question it?

Then I started working on a real backend project and actually had to think about it. So here's what I figured out about picking a package manager:

🔹 npm
The default. Works everywhere, zero setup friction. If you're starting out or on a team that already knows it, there's honestly nothing wrong with npm. The industry uses it more than people admit.

🔹 yarn
Came in when npm was slow and messy, and introduced lockfiles properly. Still solid, especially Yarn Berry with PnP. But its edge has shrunk as npm improved over the years.

🔹 pnpm
My current pick for new projects. It uses a shared store so packages aren't duplicated across projects. Faster installs, less disk space, and strict by default, so no phantom dependencies sneaking in.

🔹 bun
Insanely fast. It's a runtime and package manager in one. Great for solo projects and experiments, but team familiarity and ecosystem maturity are real concerns when you're working in production.

The question I kept coming back to: what if you inherit a codebase on npm but later realize pnpm or bun would have been a better fit?

Honestly, I would not migrate mid-project unless there is a real pain point: slow CI builds, disk pressure on a monorepo, phantom dependency bugs creeping in. The migration cost has to justify the gain.

But on a new project? I'm picking intentionally now, not just going with the default.

Senior devs, how do you make this call on real teams? Do you migrate or stick with what's already there?

#nodejs #typescript #webdev #backenddevelopment #javascript
Most production bugs I've seen around environment variables aren't caused by missing values. They're caused by misunderstanding what environment variables actually are.

A few things that have burned teams I know:

- Environment variables are an OS primitive. Every process gets a flat key-value copy of its parent's environment. A copy, not a reference. What your child process changes, your parent never sees.

- Your .env file does nothing on its own. Something has to read it. And most tools, including dotenv, will NOT overwrite a variable that already exists. So if your shell profile already exports DATABASE_URL, your .env is silently ignored. This is probably the most common "works on my machine" culprit.

- Docker starts with a clean slate. It does not inherit your shell environment. If you're not explicitly passing variables with -e or --env-file, the container doesn't know they exist.

- React's process.env is not real runtime config. The bundler replaces every reference with a literal string at build time. That value is now hardcoded in the JavaScript your users download. If you put a secret in REACT_APP_anything or NEXT_PUBLIC_anything, it is not a secret anymore.

- Build time and runtime are fundamentally different. A frontend bundle bakes values in at build time. An Express server reads the actual process environment when it starts. You can change a backend variable and restart; you cannot do that with a bundled frontend without rebuilding.

Deleted a committed secret in the next commit? The key is still in git history. git show will find it. The only fix is to rotate the credential.

The mental model that fixes most of this: secrets are always runtime, never build time, never committed to source code.

#SoftwareEngineering #DevOps #WebDevelopment #BackendDevelopment #JavaScript #Docker #NodeJS #EngineeringLeadership
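The first point ("a copy, not a reference") takes only a few lines of shell to demonstrate:

```shell
# Each process receives a *copy* of its parent's environment.
export GREETING="from-parent"

# The child process sees the copy...
sh -c 'echo "child sees: $GREETING"'

# ...but anything the child changes never propagates back up.
sh -c 'GREETING="from-child"; export GREETING'
echo "parent still has: $GREETING"   # prints: parent still has: from-parent
```

The same one-way flow is why `docker run -e GREETING=... image` works but a container can never reach back and modify your shell's environment.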
🚀 npm vs pnpm — The Real Difference (From a Developer's Daily Life)

If you're still using npm by default, this might save you gigabytes of space and hours of install time 👇

🧠 The problem (real scenario)
I was managing 5+ Laravel + Vite projects on my system. With npm:
• Each project had its own node_modules
• ~200MB per project
👉 Total ≈ 1GB+ used 😬

⚡ Then I switched to pnpm
• Dependencies are stored once, globally
• Projects just use lightweight links
👉 Same 5+ projects:
• Global store ≈ 200MB
• Projects ≈ minimal space
👉 Total ≈ 250MB 🔥

📦 What actually changed?
npm:
• Copies dependencies into every project ❌
• Slower installs
• More disk usage
pnpm:
• Stores dependencies once (global store) ✅
• Links them into projects
• Faster installs ⚡
• Saves massive space 💾

🔁 Real dev workflow difference
With npm: npm install # slow… every time
With pnpm: pnpm install # fast ⚡ (reuses the cache)

🧠 Key insight
pnpm doesn't install dependencies globally — it stores them globally and reuses them smartly 🚀

When pnpm shines most:
• Multiple projects (Laravel, React, Node apps)
• Monorepos
• CI/CD pipelines
• Teams working on the same stack

⚠️ One rule
👉 Don't mix npm and pnpm in the same project
👉 Stick to one package manager

💡 My takeaway
Switching to pnpm gave me:
• ⚡ Faster installs
• 💾 Less disk usage
• 🧠 Cleaner dependency management

👇 Your turn: are you still using npm, or have you switched to pnpm?

#webdevelopment #javascript #nodejs #laravel #vite #frontend #backend #devtools #programming
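The "lightweight links" pnpm uses are largely ordinary hard links out of its global content-addressable store, and the mechanism itself can be sketched in plain shell (the `%h` link-count format below is GNU `stat`; the file names are illustrative):

```shell
store=$(mktemp -d)

# "Install" a dependency file into the global store once...
echo "module.exports = 42" > "$store/dep.js"

# ...then link it into two projects instead of copying it.
mkdir -p "$store/project-a" "$store/project-b"
ln "$store/dep.js" "$store/project-a/dep.js"
ln "$store/dep.js" "$store/project-b/dep.js"

# Three directory entries, but only one copy of the bytes on disk:
stat -c '%h' "$store/dep.js"   # link count: 3
```

That's why adding a sixth project barely moves the disk-usage needle: new projects mostly add links, not bytes.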
Stop typing "fix stuff" as your commit message.

I got tired of doing this before every push:
→ Run Prettier manually
→ Check git status
→ Figure out what to stage
→ Stare at the screen thinking of a commit message
→ Type something lazy like "fix stuff" and move on

So I built something to fix it.

Introducing PushPrep: an npm CLI tool that handles your entire pre-push workflow in one command.

Here's what it does:
✅ Formats your code with Prettier automatically
✅ Shows staged vs unstaged files clearly
✅ Interactive file staging (all files, or pick specific ones)
✅ Gemini AI generates 3 smart commit messages for you
✅ You pick one → it commits → you push

You just run git push after. That's it. No more lazy commit messages. No more skipping Prettier. No more breaking your flow state before shipping.

Install it in 2 steps:

npm install -g pushprep
pushprep config --key YOUR_GEMINI_API_KEY

Then just run pushprep inside any git project. (Free Gemini API key → https://lnkd.in/gh2f8hCC)

Not sure what to do next? Just run:

pushprep --help

It shows every available command with real usage examples — from saving your API key to running the full workflow. Clean, fast, no digging through docs.

📦 npm: https://lnkd.in/gUawP8j8

Would love your feedback — bugs, ideas, feature requests, anything. Drop them in the comments or open an issue.

#npm #nodejs #javascript #opensource #buildinpublic #developer #webdevelopment #programming #git #cli