📅 #100DaysOfDevOps – Day 25

Continuing my #100DaysOfDevOps learning journey.

🔹 Day 25 Focus: Docker Basics & Commands

Today I started learning Docker, which is widely used for containerization in DevOps.

🔹 What is Docker?
Docker is a platform for building, packaging, and running applications in containers, making them portable and consistent across different environments.

🔹 Key Concepts
• Image – Blueprint of an application
• Container – Running instance of an image
• Dockerfile – Script to build images
• Docker Hub – Registry to store and share images

🔹 Basic Docker Commands
• docker --version – Check the Docker version
• docker pull image-name – Download an image from Docker Hub
• docker images – List local images
• docker run image – Run a container
• docker run -d -p 1111:80 --name cont-1 image-name – Run a container in the background with port mapping and a name
• docker ps – List running containers
• docker ps -a – List all containers (including stopped ones)
• docker stop container_id – Stop a container
• docker start container_id – Start a stopped container
• docker rm container_id – Remove a container
• docker rmi image_id – Remove an image

Learning Docker is an important step towards building and managing containerized applications. Step by step, moving deeper into DevOps tools and practices 🚀

#100DaysOfDevOps #Docker #DevOps #Containers #CICD #LearningJourney #ContinuousLearning #DevOpsJourney #TechGrowth #KeepLearning
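Strung together, the commands above make a natural first session. The sketch below assumes Docker is installed with the daemon running, and uses the official nginx image with the post's example port and container name:

```shell
docker pull nginx                              # download the official nginx image from Docker Hub
docker images                                  # confirm it appears in the local image list
docker run -d -p 1111:80 --name cont-1 nginx   # detached; host port 1111 -> container port 80
docker ps                                      # cont-1 should show as running
curl http://localhost:1111                     # should return the nginx welcome page
docker stop cont-1                             # stop the container
docker rm cont-1                               # remove the stopped container
docker rmi nginx                               # remove the image once no container uses it
```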
Docker Basics & Commands for DevOps
Day 13/30 – Docker Learning Series
Dockerfile Introduction and Writing Your First Dockerfile

Today I started working with Dockerfiles, a major step toward real DevOps practice. Until now, I was using pre-built images. But in real scenarios, we need to create our own custom images based on application requirements. This is where Dockerfiles come in.

---
What is a Dockerfile?
A Dockerfile is a text file with a set of instructions used to build a Docker image. It automates the process of:
• Setting up the environment
• Installing dependencies
• Copying application code
• Running the application

---
Basic Structure of a Dockerfile
Here is a simple example:

FROM nginx
COPY . /usr/share/nginx/html

---
Common Dockerfile Instructions
• FROM – Defines the base image
• RUN – Executes commands during the build
• COPY – Copies files from the local system into the image
• WORKDIR – Sets the working directory
• CMD – Defines the default command to run

---
Create Your First Dockerfile
Step 1: Create a file named Dockerfile
Step 2: Add the content:

FROM nginx
COPY . /usr/share/nginx/html

Step 3: Build the image:

docker build -t myimage .

---
Run the Custom Image

docker run -d -p 8080:80 myimage

Now your custom container is running.

---
Why Dockerfiles Matter
• Automate image creation
• Ensure consistency across environments
• Make deployments repeatable
• Essential for CI/CD pipelines

---
Key Takeaway
Dockerfiles help you move from merely using containers to building production-ready containerized applications. This is where DevOps practices truly begin.

---
Next: Dockerfile Instructions Deep Dive (RUN, CMD, ENTRYPOINT)

#Docker #DevOps #Containerization #CloudComputing #CICD #Infrastructure #SRE #LearningInPublic #TechLearning #NetworkToDevOps
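Beyond the two-line nginx example, a Dockerfile for a small app would combine several of the instructions listed above. A hedged sketch for a hypothetical Node.js service (the app files and names are illustrative, not from the post):

```dockerfile
FROM node:20-alpine            # base image, pinned to a specific version
WORKDIR /app                   # working directory inside the image
COPY package*.json ./          # copy dependency manifests first for better layer caching
RUN npm install                # install dependencies during the build
COPY . .                       # copy the rest of the application code
CMD ["node", "server.js"]      # default command when a container starts
```

It would be built and run with the same `docker build -t myimage .` and `docker run -d -p 8080:80 myimage` commands shown in the post.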
While learning DevOps, Docker is one tool that keeps showing up everywhere - and today I finally got a clear idea of how it actually works.

What is Docker?
Docker is a platform that helps you package an application along with all its dependencies so it runs the same in any environment.

What is a Docker Image?
A Docker image is like a blueprint or template. It contains everything needed to run an application: code, libraries, dependencies, configs.
Think of it like: a class, or a snapshot.

What is a Container?
A container is a running instance of an image. When you run an image, it becomes a container.
Think of it like: an object created from a class.

How They Work Together:
• You build an image → using a Dockerfile
• You run the image → it creates a container
• You can run multiple containers from one image

Image vs Container (Simple Difference):
• Image = Static (template); Container = Dynamic (running app)
• Image = Read-only; Container = Read + Write (runtime changes)

My Thought:
This concept really cleared up a lot of confusion. Understanding the difference between an image and a container makes Docker feel much easier and more logical. Still learning, but this feels like a solid step into real DevOps.

#Docker #DevOps #Containerization #LearningJourney #CloudComputing #Networking #AWS
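The "multiple containers from one image" point is easy to see on the command line. A sketch, assuming a local Docker daemon and using nginx as the image (container names are illustrative):

```shell
docker run -d --name web-1 -p 8081:80 nginx   # first container from the image
docker run -d --name web-2 -p 8082:80 nginx   # second container from the same image
docker ps --filter "ancestor=nginx"           # both rows show the same IMAGE column
```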
🚀 Docker Fundamentals — The Backbone of Modern DevOps

In today’s fast-paced development world, consistency and scalability are everything. That’s where Docker comes in, making it easy to build, ship, and run applications anywhere.

🔹 What is Docker?
Docker is a containerization platform that allows you to package an application along with its dependencies into a lightweight, portable unit called a container.

🔹 Why does Docker matter?
✅ Eliminates “works on my machine” issues
✅ Faster deployments and rollbacks
✅ Lightweight compared to virtual machines
✅ Ensures consistency across dev, test, and prod

🔹 Key Concepts:
📦 Image – Blueprint of your application
📦 Container – Running instance of an image
📦 Dockerfile – Script to build images
📦 Docker Hub – Registry to store and share images

🔹 Basic Workflow:
1️⃣ Write a Dockerfile
2️⃣ Build an image
3️⃣ Run a container
4️⃣ Push to a registry (optional)

🔹 Simple Commands:
docker build -t my-app .
docker run -d -p 8080:80 my-app
docker ps
docker stop <container_id>

💡 Real-world Insight:
Using Docker in CI/CD pipelines ensures every build runs in the same environment, reducing failures and improving reliability.

🔥 Whether you’re deploying microservices or scaling applications, Docker is a must-have skill in your DevOps toolkit.

#Docker #DevOps #Containerization #CloudComputing #CICD #SoftwareEngineering #Learning #TechCommunity
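Step 4 of the workflow (push to a registry) can be sketched as follows. Here "myuser" is a placeholder Docker Hub account, and this assumes you have already authenticated with `docker login`:

```shell
docker tag my-app myuser/my-app:1.0   # give the local image a registry-qualified name and tag
docker push myuser/my-app:1.0         # upload it to the registry for others to pull
```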
🚀 Day 9 of my DevOps Learning Journey

🐳 Docker Image vs Container
👉 Image = Blueprint (read-only template)
👉 Container = Running instance (live application)

💡 In simple terms: a Docker image is like the code/package, and a container is the execution of that code.

🧠 Think of it like:
📦 Image = App package
⚡ Container = App running on your system

🔥 Key Differences:
✔ An image is immutable (cannot be changed)
✔ A container is mutable (can be modified at runtime)
✔ An image is built once and used many times
✔ A container is created from an image and runs the app
✔ Multiple containers can run from a single image

⚙️ Commands to remember:
👉 docker build → creates an image
👉 docker run → starts a container

🔥 Why it matters:
✔ Strong foundation in Docker concepts
✔ Helps in debugging container issues
✔ Essential for CI/CD pipelines
✔ Enables consistency across environments
✔ Core building block for Kubernetes

#Docker #DevOps #Containers #LearningJourney
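The immutable-image vs mutable-container difference can be observed directly. A sketch, assuming a local Docker daemon and using nginx ("demo" is an illustrative container name):

```shell
docker run -d --name demo nginx              # start a container from the read-only image
docker exec demo touch /tmp/runtime-change   # containers have a writable layer at runtime
docker diff demo                             # lists filesystem changes relative to the image
```

The image itself is untouched; `docker diff` only reports changes in the container's writable layer.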
🚀 Day 26 of 30 - #DevOps

Today I focused on one of the core building blocks of Kubernetes — Pods — and explored how they behave in real scenarios.

Here’s what I implemented and learned 👇
🔹 Created a basic Pod using YAML
🔹 Added annotations to store metadata
🔹 Built a multi-container Pod (multiple containers inside one Pod)
🔹 Worked with environment variables in Pods
🔹 Practiced the kubectl logs and kubectl exec commands
🔹 Debugged real-time errors during setup

💡 Key Takeaways:
• A Pod is the smallest deployable unit in Kubernetes
• Multiple containers inside a Pod share the same network
• Annotations attach extra information to resources
• Environment variables make containers dynamic
• YAML formatting is extremely sensitive (small mistake = error ❌)

⚠️ Challenges I faced:
• Invalid naming issues (RFC 1123 naming rules)
• YAML parsing errors
• Containers exiting immediately (Completed state)
• Incorrect kubectl exec syntax (missing --)

🔥 Biggest Learning:
Kubernetes is easy to run but hard to debug, and that’s where the real learning happens.

📌 Hands-on Work:
✔️ Created and managed Pods
✔️ Verified logs and container behavior
✔️ Explored the container lifecycle

🚀 Slowly moving from just learning commands → understanding how Kubernetes actually works internally

🔗 GitHub: https://lnkd.in/gegA3q9F

#Kubernetes #DevOps #30DaysDevOpsChallenge #LearningInPublic #Minikube #Docker #CloudComputing
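A minimal manifest covering the ideas above (multi-container Pod, an annotation, an environment variable) might look like this sketch; all names, images, and values are illustrative, not taken from the post's repo:

```shell
# Write an illustrative Pod manifest: two containers, one annotation,
# one environment variable. Names and values are made up for the example.
cat > pod.yaml <<'EOF'
apiVersion: v1
kind: Pod
metadata:
  name: demo-pod
  annotations:
    owner: learning-series       # annotations attach extra metadata to the resource
spec:
  containers:
    - name: web                  # container 1
      image: nginx:1.27
    - name: sidecar              # container 2, shares the Pod's network
      image: busybox:1.36
      command: ["sh", "-c", "sleep 3600"]   # keep it running (avoids the Completed state)
      env:
        - name: APP_MODE         # environment variable makes the container dynamic
          value: "debug"
EOF
```

With a cluster available, it would be applied with `kubectl apply -f pod.yaml`, inspected with `kubectl logs demo-pod -c web`, and entered with `kubectl exec -it demo-pod -c sidecar -- sh` (note the `--` the post mentions).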
Most Docker content stops at “run a container.” This one intentionally doesn’t.

In real DevOps environments, Docker is never just a tool; it’s a mindset shift. Once you move past commands and start understanding how systems behave under containers, you begin to think differently about applications, infrastructure, and scale.

This video is built around that transition. Instead of memorizing syntax, we connect how Docker actually fits into production workflows: how services communicate, how environments stay consistent, and how teams design systems that don’t break when they move across stages.

We start with the fundamentals, but not in isolation. Every concept is tied back to why it exists in real systems:
- Why containerization changed deployment thinking
- Why Docker’s architecture matters beyond theory
- Why images are more than build artifacts — they are deployable units of intent

Then we move into what actually defines production readiness:
- Networking that connects real services, not just examples
- Docker Compose as a way to model systems, not scripts
- CI/CD and deployment patterns that reflect how teams ship software today

But the most important layer isn’t technical. It’s decision-making. Because in real projects, knowing what to use matters more than knowing how to use everything. That’s where most learners get stuck, and where engineers start to stand out.

You’ll also hear lessons from real mistakes, confusion points, and the kind of questions that don’t show up in documentation but do show up in interviews and production incidents.

By the end, Docker stops being a topic you “learn” and becomes a lens you think through, where applications are no longer abstract but containerized systems with behavior, limits, and design trade-offs.

This is for anyone who’s ready to move from learning tools… to understanding systems.
📌 Before you start the series:
Fork the repo: https://lnkd.in/gBKPEA3U
Subscribe on YouTube: @techwithher
Notes: https://lnkd.in/gNgwh4eB
https://lnkd.in/ggA2cxct
DOCKER for DevOps | FREE NOTES + Project Handson | TechWithHer | #AyushiSingh
https://www.youtube.com/
#DevOps #CICD #YAML #GitHubActions #Automation

Understanding GitHub Actions YAML (CI/CD made simple)

If you’re new to DevOps, this YAML file might look confusing. It’s actually simple. The file tells GitHub: when something happens → run these steps automatically.

name → what the workflow is called
Just a label you see in GitHub. Example: CI pipeline

on → when the pipeline starts
• push → runs when you push code
• pull_request → runs when you open or update a PR
• branches → which branch triggers it (like main)
Simple: when should this run?

jobs → the work GitHub will do
You can have one or many jobs.

job name (you decide this)
Example: build, test, deploy. You choose the name; it’s not fixed.

runs-on → the machine used
Example: ubuntu-latest. GitHub creates a temporary machine for your pipeline.

steps → list of tasks
These run one by one. Inside steps:
• name → label for the step (what you see in logs)
• uses → use a ready-made action
• run → run your own command

What actually happens:
1. You push code
2. GitHub reads this YAML file
3. It creates a machine
4. It runs each step
5. You see results in the Actions tab

Simple way to remember: When → Where → What
When = on
Where = runs-on
What = steps

This is the foundation of CI/CD. Later you replace the steps with real things like:
• Terraform (init → validate → plan)
• Docker (build → tag → push)
• AWS (deploy → monitor)

Start simple. Then build on it.

CoderCo
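Putting those pieces (name, on, jobs, runs-on, steps) together, a minimal workflow file might look like this sketch; the job and step names are arbitrary labels, as noted above:

```shell
# Create an illustrative GitHub Actions workflow at the path GitHub expects.
mkdir -p .github/workflows
cat > .github/workflows/ci.yml <<'EOF'
name: CI pipeline            # the label shown in the Actions tab
on:                          # WHEN: triggers
  push:
    branches: [main]
  pull_request:
    branches: [main]
jobs:
  build:                     # job name; you choose it
    runs-on: ubuntu-latest   # WHERE: temporary machine GitHub creates
    steps:                   # WHAT: tasks, run one by one
      - name: Check out code
        uses: actions/checkout@v4    # a ready-made action
      - name: Say hello
        run: echo "Hello from CI"    # your own command
EOF
```

Committing and pushing this file to a repository's main branch is what triggers the run.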
🚀 DAY 21 of my DevOps Learning Journey
Today's focus: YAML & GitOps (GitHub Actions) 🔥 Let’s delve in 👇

💡 What is YAML?
YAML stands for “YAML Ain’t Markup Language” (yes, funny name 😄). Originally it meant “Yet Another Markup Language”, but it evolved into something more powerful.
👉 It is a human-readable data serialization format used to write configurations.

---
✨ Why YAML is Popular
YAML is designed to be:
✔️ Easy to read (even for non-devs)
✔️ Simple to write and edit
✔️ Clean and structured using indentation (like Python)

---
🛠️ Where YAML is Used (Real DevOps Use Cases)
🔹 CI/CD Pipelines – used in GitHub Actions (`.yml` files) to automate workflows ⚙️
🔹 Kubernetes – defines Pods, Deployments, and Services 📦
🔹 Docker – used in `docker-compose.yml` for multi-container applications 🐳
🔹 Infrastructure as Code (IaC) – used alongside tools like Terraform to manage infrastructure 🌍

---
📚 YAML Basics (Syntax Rules)
🟡 Indentation matters (spaces, not tabs!)
🟡 Case-sensitive
🟡 Comments start with `#`

---
🧩 Structure Example
```
name: John Doe
age: 30
```

---
📊 Data Types in YAML
🔸 Strings
🔸 Numbers
🔸 Booleans (true/false)
🔸 Null (`null` or `~`)
🔸 Dates (advanced usage)

---
🧱 YAML Structures You Should Know
📌 Lists (arrays)
📌 Dictionaries (key-value pairs)
📌 Nested structures
📌 Multiline strings

---
⚡ Quick Insight
YAML is the backbone of modern DevOps automation, especially when working with GitOps using GitHub Actions. Mastering it means you’re one step closer to building and automating real-world cloud systems ☁️

---
🔥 Lesson Learned: small syntax mistakes (like wrong indentation) can break everything. Attention to detail is key!

---
📈 Day 21 done! We keep learning, we keep building 💪

#DevOps #CloudComputing #YAML #GitOps #GitHubActions #100DaysOfCloud #TechJourney #LearningInPublic 🚀
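The structures and data types listed above can all be seen side by side in one small file. A sketch with made-up keys and values:

```shell
# Write an illustrative YAML file showing dictionaries, scalars of several
# types, a list, and a multiline string.
cat > example.yml <<'EOF'
# dictionary (key-value pairs), nested under "server"
server:
  host: localhost   # string
  port: 8080        # number
  debug: true       # boolean
  owner: ~          # null
# list (array)
regions:
  - us-east-1
  - eu-west-1
# multiline string (| preserves line breaks)
motd: |
  Welcome!
  Indentation uses spaces, never tabs.
EOF
```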
Still deploying code manually? It’s time to automate! 🛑🚀

Whether you are just starting out with DevOps or looking to refine your current pipelines, I’ve put together a comprehensive CI/CD Cheat Sheet to help you streamline your software delivery.

Swipe through the document below to explore:
🔹 The core concepts of Continuous Integration, Delivery, and Deployment
🔹 A visual journey of a robust CI/CD pipeline
🔹 Essential tools to know (Git, Docker, Kubernetes)
🔹 Quick-start syntax for GitHub Actions, GitLab CI/CD, and Jenkins
🔹 Advanced concepts like Secrets Management and Shift-Left Security

Let’s build faster, more reliable software with fewer human errors. 💻

Found this useful? Repost to share it with your network, and let me know in the comments: what is your go-to CI/CD tool right now? 👇

#CICD #DevOps #SoftwareEngineering #Automation #GitHubActions #Jenkins #Kubernetes #TechCareers #SoftwareDevelopment #TechCommunity

To gain a more detailed understanding, I recommend these platforms: w3schools.com, Udemy, Coursera
🚀 8 Docker Best Practices Every DevOps Beginner Should Know

When I started learning DevOps through the Tech with Nana bootcamp, Docker quickly became one of my favorite tools. But beyond just using Docker, I discovered that how you use it matters a lot. Here are 8 essential best practices (and why they matter):

1. Use official Docker images as base images
They’re maintained, trusted, and regularly updated, reducing security risks and unexpected issues.

2. Use specific image versions (avoid 'latest')
Ensures consistency across environments. What works today won’t suddenly break tomorrow: no sudden failures or surprises.

3. Use small-sized images
Smaller images result in faster builds, quicker deployments, and a reduced attack surface.

4. Optimize caching of image layers
Proper layer ordering speeds up rebuilds and saves time during development.

5. Use .dockerignore
Prevents unnecessary files (like .git, logs, node_modules) from bloating your image.

6. Leverage multi-stage builds
Keep your final image clean by separating build dependencies from the runtime.

7. Use the least-privileged user
Avoid running containers as the root user. This is a simple but powerful security practice, and a core principle in the Linux ecosystem.

8. Scan images for vulnerabilities
Identify and fix security issues early, before they reach production. Don’t leave cracks for bad actors to exploit.

Key takeaway: Docker isn’t just about containerizing apps; it’s about doing it securely, efficiently, and reliably.

If you’re transitioning into DevOps (especially from a non-tech background like I did), mastering these fundamentals can really set you apart.

What Docker best practice has made the biggest difference in your workflow?

#DevOps #Docker #CloudComputing #TechWithNana #LearningJourney #BeginnerFriendly
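Practices 1, 2, 3, 6, and 7 combine naturally in a single Dockerfile. A hedged sketch for a hypothetical Go service; the image tags, paths, and user name are illustrative assumptions, not from the post:

```dockerfile
# --- build stage: has the compiler, never ships (practice 6) ---
FROM golang:1.22-alpine AS build   # official image, pinned version (practices 1-2)
WORKDIR /src
COPY . .
RUN go build -o /app .             # assumes a single main package at the repo root

# --- runtime stage: small final image (practices 3 and 6) ---
FROM alpine:3.20
COPY --from=build /app /app        # only the compiled binary is carried over
RUN adduser -D appuser             # create a non-root user (practice 7)
USER appuser                       # do not run as root
CMD ["/app"]
```

Scanning (practice 8) would then be a separate step with a tool such as `docker scout` or Trivy.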
Once the basic commands feel comfortable, try running a real app with a Dockerfile and port mapping; that's where Docker starts to move from memorized commands to real understanding.