🐳 Dockerfile Optimization Tip for Faster Builds

I recently saw a YouTube short teaching Dockerfiles like this 👆:

⚠️ Old / less optimal version:

```dockerfile
FROM python:3.9-slim
WORKDIR /app
COPY . .
RUN pip install -r requirements.txt
CMD ["python", "app.py"]
```

✅ It works, but it isn't efficient.

Problem: Docker builds images in layers, caching each instruction. In the old version:
- COPY . . copies all project files early.
- Any tiny change (even to the README or a comment) invalidates this layer.
- That forces pip install -r requirements.txt to rerun on every build, slowing down development.

✅ Correct / optimized version:

```dockerfile
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "app.py"]
```

Why it's better:
- Copying requirements.txt first lets Docker cache the install layer.
- Only the final COPY . . layer rebuilds when you change code.
- Frequent code tweaks don't trigger an unnecessary dependency reinstall, so iterative builds are much faster.

💡 Small changes like this make a big difference when you're frequently rebuilding images during development. For anyone learning DevOps, Python, or containerization, mastering Docker caching and layers is essential.

#Docker #Python #DevOps #CI_CD #Containerization #SoftwareEngineering #DockerTips #PythonDev #BestPractices #LearningEveryday
Kiran Kumar V’s Post
More Relevant Posts
Wanna learn Docker? 📦 Let's write your first Dockerfile in 5 steps 👨‍💻 ↓

What is a Dockerfile?
A Dockerfile is a text file that entirely defines the environment where your Python application runs, for example your ML training script.

From a Dockerfile you can create a Docker image:

$ docker build -t my-image .

And from a Docker image you can spin up a Docker container:

$ docker run my-image

Let me show you how to write a Dockerfile for your ML training scripts.

Example
This is your initial folder structure, with an empty Dockerfile:

ml-project
↳ Dockerfile
↳ requirements.txt
↳ src
  ↳ train.py

Let's write our Dockerfile, in 5 steps.

Step 1️⃣ Select your base layer
↳ FROM python:3.10.3-slim-buster
This is the foundation upon which you build subsequent layers. If you use Python, I recommend an official Python image, like python:3.11-slim.

Step 2️⃣ Set environment variable defaults
↳ ENV PYTHONUNBUFFERED=1
In this case, setting PYTHONUNBUFFERED ensures all the Python code's output shows up in real time on our Docker console.

Step 3️⃣ Copy source files into the container
↳ COPY . app/
Copy the requirements.txt file and your source code files from your local disk into the container.

Step 4️⃣ Install Python dependencies
↳ RUN python3 -m pip install --no-cache-dir -r requirements.txt
pip installs all the Python dependencies your code needs to run.

Step 5️⃣ Set the entrypoint
↳ ENTRYPOINT ["python3", "src/train.py"]
This is the default command to run when you spin up your container. In this case, it kicks off your training job.

BOOM! Your training script is now ready for production 🚀

----
Hi there! It's Pau Labarta Bajo 👋 Every day I share free, hands-on content on production-grade ML, to help you build real-world ML products. Follow me and click on the 🔔 so you don't miss what's coming next.

#machinelearning #realtimedata #mlops #realworldml #docker
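Assembled into one file, the five steps above look roughly like this. This is a sketch only: I've added a WORKDIR instruction, which the post's steps don't mention, and copy into it so that the relative paths in steps 4 and 5 resolve:

```dockerfile
# Step 1: base layer
FROM python:3.10.3-slim-buster

# Step 2: unbuffered output so logs show up in real time
ENV PYTHONUNBUFFERED=1

# Not in the post's steps: work inside /app so relative paths below resolve
WORKDIR /app

# Step 3: copy requirements.txt and source files into the container
COPY . .

# Step 4: install Python dependencies
RUN python3 -m pip install --no-cache-dir -r requirements.txt

# Step 5: default command, kicks off the training job
ENTRYPOINT ["python3", "src/train.py"]
```

Build and run with `docker build -t ml-project .` and `docker run ml-project`.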
🐍 Python's Dependency Management: Mastering Venvs, Pip, & the MODERN Ecosystem! 📦✨ Ever felt lost in Python's dependency jungle? 🐍 You're not alone! Mastering dependency management is CRUCIAL for robust, reproducible, and collaborative projects. Forget 'it works on my machine!' 👋 At its core, it's about ENVIRONMENT ISOLATION. This is where virtual environments (like venv) shine brightest. They create dedicated spaces for your project dependencies, preventing conflicts and ensuring your projects remain self-contained. Always start with a venv! ✨ `pip` is our package installer, and `requirements.txt` lists our direct dependencies. But this simple approach has its limits. Ever struggled with TRANSITIVE DEPENDENCIES or ensuring exact reproducibility across environments? 🤔 That's where the old way shows its cracks. Enter the MODERN tools: `Poetry` and `Pipenv`! 🚀 These tools go beyond `pip` by introducing: ➡️ Lock Files: Pinning ALL dependencies (direct & transitive) to exact versions, ensuring IDENTICAL builds every time. ➡️ Dependency Resolution: Smartly handling conflicts. ➡️ Simplified Workflow: Managing virtual environments, installing, and packaging, all in one place. Using `pyproject.toml` (PEP 518/621) as the central configuration for your project is the way forward! It unifies build systems, dependencies, and metadata. Are you still relying solely on `pip freeze > requirements.txt` or have you embraced tools like Poetry? Share your workflow and best practices! 👇 #Python #PythonDev #DependencyManagement #SoftwareDevelopment #DevOps
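As a sketch of what that central configuration looks like, here is a minimal Poetry-managed `pyproject.toml`. The package name, author, and dependency versions are placeholders, not from the post:

```toml
[tool.poetry]
name = "my-project"                  # placeholder project name
version = "0.1.0"
description = "Example project metadata"
authors = ["Jane Doe <jane@example.com>"]

[tool.poetry.dependencies]
python = "^3.11"
requests = "^2.31"                   # a direct dependency; transitive deps get pinned in poetry.lock

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

Running `poetry install` resolves the full dependency graph and writes `poetry.lock`, so every environment built from this project gets identical versions.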
#30DaysOfContainers — Day 4/30

Why Docker Images Are So Lightweight — The Secret of Layers

The first time I built a Docker image, it felt like magic. A whole environment, packaged neatly into 200 MB… But here's the real secret: Docker images aren't just big ZIP files. They're built in layers.

Let's break it down: every instruction in your Dockerfile adds a new layer.

Example: here's what happens

FROM python:3.9 → base layer
COPY requirements.txt → new layer
RUN pip install → another layer
COPY . . → another one
CMD ["python", "app.py"] → final instruction

Each layer only stores what changed from the previous one.

Why it matters:
• Reusability: common layers are shared across images (so your python:3.9 base isn't downloaded every time).
• Speed: when you rebuild an image, Docker reuses unchanged layers, making builds blazing fast.
• Efficiency: storage and network use drop dramatically.

Keep frequently changing files (like source code) toward the bottom of your Dockerfile. That way, Docker caches everything above them and rebuilds only what's needed.

#Docker #DevOps #Containers #SoftwareEngineering #Dockerfile #30DaysOfContainers
Your documentation examples are broken. You just don't know it yet.

After the third email from someone saying "your example doesn't work," I realized I had a problem. I'd refactored a function, updated all my tests, pushed to production. Everything worked perfectly. Except the docstring still showed the old API. The code was fine. The documentation was garbage.

So I spent a Saturday building a tool to catch this automatically.

What I built:
- Scans Python files for docstring examples (the ones with >>> markers)
- Runs them to check for errors
- Reports what broke
- 100 lines of code, zero dependencies

What I learned:
- Documentation is code. Test it like code.
- Python's AST module is more powerful than most people realize.
- Simple tools are underrated: this one does one thing well.
- The standard library has everything you need (no pip install required).

The first version crashed on Unicode characters. It took an hour to debug because I forgot to specify UTF-8 encoding. One parameter. That's the kind of lesson you only learn by actually building things.

I ran this on my production code and found 7 broken examples in 5 minutes. All from refactorings where I changed function signatures but forgot to update docstrings. This tool would have caught every single one before they hit users.

The stats:
- 150 lines total (including error handling)
- 4 hours to build
- 0 external dependencies
- 7 bugs found immediately
- 0 angry emails since deployment

If you write Python and your docstrings have examples, you probably have broken ones. I did, and I wrote the code.

The complete build breakdown, all the code, and what went wrong along the way.

The bottom line: ship simple tools that solve real problems. Don't wait for the perfect solution. Build something that works, document what it doesn't do, and improve it later if you actually need to.

#Python #SoftwareDevelopment #Coding #Programming #DevTools #TechnicalWriting #Documentation #SoftwareEngineering #100DaysOfCode #LearnToCode
I just finished creating an automated file-verification workflow that is useful for any company that uses GitHub repos for its work. This project showed me how powerful automation can be when you combine the right tools. By linking GitHub Actions, Python, and CloudWatch, I created a system that enforces repository standards without anyone having to manually check each pull request. The Python script handles the validation logic, GitHub Actions runs it automatically at the right times, and CloudWatch captures everything for audit purposes. The biggest lesson I learned was that debugging CI/CD pipelines requires patience, especially when dealing with things like timestamp formatting and shell quote parsing that work fine locally but break in the workflow environment. Each error taught me something about how these systems interact, and troubleshooting the CloudWatch logging step gave me hands-on experience with AWS CLI commands and IAM permissions. What I built here is scalable too. If Level Up Bank wanted to check for additional files or add more validation rules, I could easily expand the Python script's required-files list or add new validation logic. The same workflow structure would handle it without major changes. This pipeline demonstrates a practical approach to maintaining code quality standards across multiple repositories while keeping leadership informed through centralized logging — exactly what modern platform engineering teams need.
🚀 Day 6 of My Python & AI Journey – Exploring Data Structures + Mastering Git & GitHub! Today marks Day 6 of my Python learning series, where I explored one of the most essential topics — Data Structures 🧩. From Lists and Tuples to Sets and Dictionaries, I practiced how to efficiently store, modify, and iterate through data in Python. Alongside coding, I also focused on enhancing my understanding of Git & GitHub — the backbone of modern software development 💻. Version control is not just about pushing code; it’s about collaborating smartly, managing projects efficiently, and maintaining clean workflows. To deepen this skill, I’ve written an upcoming article on Medium Platform: 🎯 “Mastering Git & GitHub for Python Projects: A Practical Guide” This guide walks through the installation process, key commands, workflows, and practical examples of how to use Git and GitHub effectively in Python projects. If you’re learning Python or working on personal projects, this will help you manage your code like a professional developer. 💬 I’d love to hear your thoughts or suggestions that could help improve my learning journey. Also, if you’re an IT professional interested in Python, AI, or DevOps trends, let’s connect — I’m open for evening or weekend discussions! 🔗 GitHub Repository: https://lnkd.in/edMxe9nV 📝 Medium Article: https://lnkd.in/e2fD8AVU #Python #AI #GitHub #Git #VersionControl #CodingJourney #DataStructures #LearningInPublic #MediumArticle #PythonProjects #Developers #DevOps
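As a quick illustration of the four built-in structures the post mentions (this example is mine, not from the linked guide):

```python
# List: ordered and mutable
langs = ["Python", "Go", "Rust"]
langs.append("TypeScript")

# Tuple: ordered and immutable, good for fixed records
point = (3, 4)

# Set: unordered, duplicates collapse automatically
tags = {"python", "git", "python"}   # stored as {"python", "git"}

# Dict: key-value mapping
stars = {"Python": 5, "Git": 4}
stars["GitHub"] = 5

# Iterating through the data
for name, rating in stars.items():
    print(name, rating)
```

Choosing the right structure (set membership checks, dict lookups) is usually the difference between clean and clumsy Python.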
Why FastAPI is Becoming the Go-To Framework for Modern Python APIs If you’ve been working with Python for a while, you’ve probably noticed an important shift: APIs are now at the core of most applications. FastAPI is becoming a preferred choice for many developers and teams, and here is why it stands out: 1. High Performance FastAPI is built on ASGI and supports asynchronous programming, which allows it to deliver performance comparable to Node.js and Go. In modern applications, speed is not a luxury anymore; it is a necessity. 2. Built-in Interactive Documentation As soon as you start your FastAPI application, you get clean, interactive documentation generated automatically through Swagger and ReDoc. No additional configuration, no manual setup. 3. Strong Data Validation FastAPI uses Pydantic for data validation and type management. This reduces bugs, improves clarity, and ensures more reliable APIs without extra work. 4. Aligned with Modern Python Standards FastAPI embraces type hints, async functionality, and clean, readable code. It feels natural for developers who appreciate clarity and structure. 5. Scales Effectively Whether you are building a small personal project or a large microservices architecture, FastAPI scales smoothly. It is trusted in production environments by companies such as Microsoft, Uber, and Netflix. If you are a Python developer looking to build APIs with less boilerplate, better structure, and faster development time, FastAPI is worth exploring. #FastAPI #Python #APIs #PythonDevelopers
For a long time, I wrote almost everything in Rust: precise, strict, beautifully unforgiving. I did not build every system in Rust; I preferred it and delegated other stacks when it made sense. Over time, one thing became clear: real technical leadership requires breadth as much as depth. Every modern engineer should think fluently in at least three languages: 🦀 Rust: for building fast, reliable systems. Its strict type system and compiler discipline teach real engineering thinking. 🟦 TypeScript: once just a frontend tool, now the foundation of the growing AI SDK ecosystem (Claude, OpenAI, LangChain, and many others). 🐍 Python: the operating system of applied AI. Most agent frameworks are written in Python, which makes it the easiest path to assemble, run, and iterate on real agent workflows. It is also the default for model training and local ML. How to put this into practice this week: * Rust: ship a tiny CRUD service or CLI with Axum, SQLx, Serde. One health check, one write, one read. * TypeScript: use the Claude SDK to build a simple PDF text analyzer that returns structured JSON. * Python: deploy both using Pulumi for Python. Pulumi is an infrastructure-as-code tool like Terraform that supports many languages; use the Python SDK to define one stack, manage secrets cleanly, and get repeatable builds. Fluency across these three turns you from a single-instrument player into a conductor. You design the score, assign the parts, and ship on tempo. That is the skill set I hire for, develop in my teams, and hold myself accountable to.
🚀 Good ideas need good engineering! While transitioning from academia to industry, I realized that writing great code isn't enough: the additional key step is building it the right way. That insight led me to create 👉 python-project-template, a lightweight but complete framework for professional-grade Python projects. Here's what it includes: 🧩 reproducible environments (pyproject.toml) 🧪 automated testing, linting & typing ⚙️ CI/CD workflows for tests, docs, and releases. But beyond the tools, this project is about mindset. I wanted to deepen my understanding of robust, maintainable software development: how automation, reproducibility, and structure make ideas scalable and collaborative. These are the skills I'm excited to keep building and applying in data-driven R&D and engineering roles. 💬 Your turn: what's one practice or tool that has most improved how you build or share your Python projects? GitHub repository: https://lnkd.in/dxBR4jr6 #Python #SoftwareEngineering #OpenSource #CareerGrowth #CleanCode #DevOps
Leveling up my Python skills through building real projects. I'm working on small Python projects to learn by doing, not just watching tutorials. My goals: - AI: use Python to build LLM/RAG prototypes and simple model services - Automation: write scripts/CLIs to remove manual work in DevOps/Cloud. Latest mini-project: a number guessing game (1–100) - If your guess is too high → "Too high" - If your guess is too low → "Too low" - If the input isn't a valid number between 1 and 100 → "Please enter a valid number". How I'm learning now: I'm using VS Code, building increasingly harder projects, and then containerizing them with Docker. If you have great Python resources for AI or automation, share them in the comments. I would like to try them. #Python #AI #Automation #DevOps #Cloud #VSCode #Docker
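The game's rules above can be sketched as one small, testable function. This is my own structuring, not the author's code, and the "Correct!" message is my addition for the winning case:

```python
def check_guess(guess_text: str, secret: int) -> str:
    """Compare a raw input string against the secret number (1-100)."""
    try:
        guess = int(guess_text)
    except ValueError:
        # Non-numeric input like "abc"
        return "Please enter a valid number"
    if not 1 <= guess <= 100:
        # Numeric, but out of range
        return "Please enter a valid number"
    if guess > secret:
        return "Too high"
    if guess < secret:
        return "Too low"
    return "Correct!"  # winning message, my assumption

print(check_guess("50", 42))  # Too high
```

Keeping the logic in a pure function like this (separate from the input() loop) makes it trivial to unit-test, which is a good habit for the automation goals mentioned above.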