Just finished building a hands-on CI/CD learning project with GitHub Actions and Docker. I created a small Node.js app and wired up a full 3-stage pipeline:
✅ Run tests automatically on every push
🐳 Build a Docker image if tests pass
🚀 Deploy to a VPS via SSH if the image builds
No manual SSH. No "deploy and pray 🙏🥹". Just push to main and let the pipeline do the work.
Also wrote detailed notes covering DevOps fundamentals, the CALMS framework, Docker deep dives, and GitHub Actions from scratch — because understanding the why matters as much as the how.
https://lnkd.in/g67akgKU
CI/CD Pipeline with GitHub Actions and Docker
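A three-stage workflow like the one described might be sketched roughly like this. The secret names, image tag, deploy script, and action version pins are illustrative assumptions, not the author's actual setup (appleboy/ssh-action is one commonly used community action for the SSH step):

```yaml
name: ci-cd
on:
  push:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm test

  build:
    needs: test           # only runs if the test job succeeds
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: docker build -t myapp:${{ github.sha }} .

  deploy:
    needs: build          # only runs if the image builds
    runs-on: ubuntu-latest
    steps:
      - uses: appleboy/ssh-action@v1
        with:
          host: ${{ secrets.VPS_HOST }}
          username: ${{ secrets.VPS_USER }}
          key: ${{ secrets.SSH_KEY }}
          script: ./deploy.sh   # hypothetical script already on the VPS
```

The `needs:` keys are what give the pipeline its "only if the previous stage passed" behavior.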
🚨 DevOps Learning: When GitHub Rejects Your Push Because of Large Files

Today I ran into an interesting Git issue while pushing my DevSecOps project to GitHub from an EC2 instance. Everything looked fine locally, but GitHub rejected my push with this error:

GH001: Large files detected

The reason? Some binaries were accidentally committed to the repository:
• argocd-linux-amd64 (205 MB)
• awscliv2.zip (63 MB)
• kubectl (55 MB)

GitHub limits file sizes:
⚠ Recommended: 50 MB
❌ Maximum: 100 MB

So the push failed.

🔧 The Fix
1️⃣ Remove the files from Git tracking:
git rm --cached <file>
2️⃣ Add them to .gitignore
3️⃣ Clean the Git history, because the large files still exist in previous commits:
git filter-branch --force --index-filter 'git rm --cached --ignore-unmatch <file>' --prune-empty --tag-name-filter cat -- --all
4️⃣ Force push the cleaned history:
git push origin main --force

💡 DevOps Best Practice
Never commit binaries like:
• kubectl
• awscli zip files
• ArgoCD binaries
Instead, install them via scripts or package managers in your setup pipeline.

This was a great reminder that Git tracks history, not just current files. Every small issue in DevOps is a learning opportunity 🚀

#DevOps #Git #GitHub #Kubernetes #ArgoCD #Terraform #LearningByDoing
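Steps 1 and 2 can be reproduced end to end in a throwaway repo. This is a minimal sketch: the repo, file name, and file size are made up for illustration, and the history-rewriting step (git filter-branch, or the newer git filter-repo tool) is deliberately left as a comment because it is destructive:

```shell
#!/bin/sh
set -e
# Create a disposable demo repository
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

# Simulate an accidentally committed binary (1 MB of zeros stands in for kubectl)
dd if=/dev/zero of=kubectl bs=1024 count=1024 2>/dev/null
git add kubectl
git commit -qm "oops: commit a binary"

# 1) Remove it from Git tracking (the file stays on disk)
git rm --cached -q kubectl
# 2) Ignore it going forward
echo "kubectl" >> .gitignore
git add .gitignore
git commit -qm "untrack binary, ignore it"

# The binary is no longer tracked, but it still exists in history.
# Purging it from past commits would be a separate, destructive rewrite,
# e.g.: git filter-repo --path kubectl --invert-paths
git ls-files
```

After this, `git ls-files` shows only `.gitignore`, while `kubectl` remains on disk untracked.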
Learning Docker as a DevOps Engineer 🐳

As someone transitioning into DevOps, I knew I needed to really understand containerization—not just follow tutorials, but build something real.

Project: https://lnkd.in/dKuRrRrm

🎯 What I Built:
✅ Multi-container app (Java + Node.js + PostgreSQL + Nexus)
✅ Docker Compose orchestration
✅ Multi-stage builds (800 MB → 250 MB!)
✅ Custom networks for service isolation
✅ Persistent volumes (learned this the hard way)
✅ Deployed to DigitalOcean droplets

💡 Concepts That Clicked:
🔹 Containers ≠ VMs
Completely different paradigm. VMs virtualize hardware; containers virtualize the OS.
🔹 Multi-stage builds
Build dependencies don't belong in production images. My Java app dropped from 800 MB to 250 MB.
🔹 Docker networks
Services discover each other by name. My Java app reaches Nexus at `http://nexus:8081`. No IP configs needed.
🔹 Volumes save lives
Lost my entire Nexus repository once when I restarted a container. Volumes = data that survives.

📚 Learning Journey:
Week 1: Breaking everything. "Why does my container exit immediately?" "Where's my database data?" "How do containers communicate?"
Week 2: Everything clicks. Multi-stage builds, networks, volumes—it all makes sense now.

🛠️ Tech Stack:
🐳 Docker & Docker Compose
☕ Java (Maven)
🟢 Node.js
🐘 PostgreSQL
📦 Nexus Repository
🔧 Nginx
☁️ DigitalOcean

🎓 Skills Gained:
- Writing efficient Dockerfiles
- Orchestrating multi-container apps
- Managing persistent data
- Container networking
- Cloud deployment (DigitalOcean)
- Debugging containerized apps

📖 Project Includes:
✓ Documented Dockerfiles (with WHY, not just WHAT)
✓ Docker Compose setup
✓ Volume & networking examples
✓ DigitalOcean deployment guide
✓ Mistakes I made + fixes
✓ Security basics

💭 Real Talk: This is a learning project, not production-ready. But it gave me hands-on experience with the Docker concepts that matter in DevOps. Learning by building beats following tutorials every time.

🎯 Next Steps:
- Kubernetes orchestration
- CI/CD with Jenkins
- Terraform for IaC
- Monitoring setup

For anyone learning DevOps: build something, break it, fix it, repeat. That's how concepts stick.

Check it out: https://lnkd.in/dKuRrRrm

Fellow learners: what project made Docker click for you? 👇

#DevOps #Docker #LearningInPublic #Containerization #CloudEngineering #CareerTransition
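The 800 MB → 250 MB drop from multi-stage builds follows a standard pattern: build in a full JDK + Maven image, then copy only the jar into a slim JRE-only runtime image. A rough sketch for a Maven project (image tags, paths, and the dependency-caching step are illustrative, not taken from the author's project):

```dockerfile
# Stage 1: build with the full JDK + Maven toolchain
FROM maven:3.9-eclipse-temurin-17 AS build
WORKDIR /app
COPY pom.xml .
# Download dependencies as their own cached layer
RUN mvn -B dependency:go-offline
COPY src ./src
RUN mvn -B package -DskipTests

# Stage 2: run on a slim JRE-only image; Maven and the JDK never ship
FROM eclipse-temurin:17-jre-alpine
WORKDIR /app
COPY --from=build /app/target/*.jar app.jar
CMD ["java", "-jar", "app.jar"]
```

Only the final stage becomes the shipped image, which is why the build toolchain's size disappears from production.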
🐳 Docker Commands Explained: run, ps, and stop (with Nginx)

When you start learning Docker, there are three commands that form the foundation of almost everything you'll do:
• docker run — start a container
• docker ps — see running containers
• docker stop — stop a container safely

If you understand these three, you already understand the basic lifecycle of a container.

In this beginner-friendly guide, we walk through these commands step by step using a real example with Nginx, so you can actually see a container running in your browser.

Inside the tutorial you'll learn:
• What happens when you run docker run nginx
• Why containers sometimes block your terminal
• How the -d, -p, and --name flags work
• How to check running containers with docker ps
• The right way to stop containers using docker stop

📖 Read the full tutorial: https://lnkd.in/gdNX7UHX

💡 After learning the basics, the best way to truly understand Docker is by practicing with real containers. You can register on Docker HOL and explore hands-on Docker labs designed for students, beginners, and developers.
✔ Practical Docker exercises
✔ Real command-line workflows
✔ Step-by-step container labs

🔗 Start learning Docker: https://dockerhol.com

#Docker #DevOps #LearnDocker #DockerCommands #Containerization #DockerTutorial
🚀 From Learning to Running My First Docker Container Today! 🐳

Today I took another step in my DevOps learning journey by exploring Docker, one of the most widely used tools in modern application deployment.

One common challenge developers face is: 👉 "It works on my machine, but not on the server." Today I understood how Docker solves this problem by using containers.

💡 What is Docker?
Docker is a containerization platform that packages an application along with all its dependencies into a lightweight container, ensuring the application runs consistently across different environments.

📚 Key Concepts I Learned Today
🔹 Docker Image – Blueprint used to create containers
🔹 Docker Container – Running instance of an image
🔹 Dockerfile – Script used to build Docker images
🔹 Docker Hub – Registry to store and share images
🔹 Port Mapping – Connecting the host machine to container services

⚙️ Hands-on Commands I Practiced
docker --version
docker pull nginx
docker images
docker run -d -p 8080:80 nginx
docker ps
docker ps -a
docker logs <container_id>
docker stop <container_id>
docker rm <container_id>

🔗 Practical Experiment I Did
I successfully ran an Nginx container and connected it to my host machine using port mapping. After running the container, I accessed it in my browser at: 👉 http://localhost:8080

Seeing the container run successfully and accessing it from the browser was a great hands-on learning experience.

#Docker #DevOps #Containerization #LearningInPublic #CloudComputing #TechJourney #FutureDevOpsEngineer
Day 14 – Running Your First Docker Container

Day 14 of my 30-day DevOps learning journey. Today I focused on running my first Docker container and understanding how applications run inside containers.

What is a Docker Container?
A Docker container is a lightweight, portable environment that includes:
• Application code
• Runtime
• System libraries
• Dependencies
This ensures the application runs the same on any system.

Steps to Run Your First Container

1. Pull an image from Docker Hub (the public registry where Docker images are stored):
docker pull nginx

2. Run the container:
docker run -d -p 80:80 nginx
Explanation:
-d → Run the container in the background
-p 80:80 → Map host port 80 to container port 80
Now the application runs inside a container.

Check running containers:
docker ps
This command lists all running containers.

Stop a container:
docker stop <container-id>

Why Containers Matter in DevOps
• Faster deployments
• Consistent environments
• Easy scaling
• Works perfectly with CI/CD pipelines
Containers make it easier to move applications from development → testing → production without compatibility issues.

Tomorrow: Docker Images & Dockerfile – how containers are built.

Do follow me for more content on DevOps. Please check out my GitHub repo, where I have created some basic projects, and share your suggestions: https://lnkd.in/gXTYxXXm

A special thanks to Shubham Londhe & Abhishek Veeramalla for the guidance and the tutorials.

#DevOps #Docker #Containers #CICD #CloudComputing #AWS #Jenkins #Kubernetes #Linux #DevOpsEngineer #TechLearning
🚀 DevOps Learning Series | Docker Fundamentals with Shubham Londhe

Docker Client vs Engine vs Daemon (or: who actually does the work in Docker? 🐳)

Using Docker sometimes feels like magic. You type one command… and suddenly containers appear like instant noodles. But behind the curtain, three characters run the show.

🧑💻 Docker Client – The Talkative One
This is what you interact with. When you run commands like:
docker run nginx
docker build .
docker ps
you're talking to the Docker Client. Think of it as the waiter taking your order.

⚙️ Docker Engine – The Brain
The Docker Engine is the core system that makes containers possible. It manages:
• Images
• Containers
• Networks
• Volumes
If Docker were a restaurant, the Engine = the kitchen system that coordinates everything.

🤖 Docker Daemon (dockerd) – The Worker
The Docker Daemon is the background service actually doing the work. It listens for requests from the client and then:
• Builds images
• Runs containers
• Pulls images from registries
• Manages the container lifecycle
Basically: Client says → "Run container". Daemon says → "Alright, let me cook." 🍳

🔁 How it all works together
1️⃣ You type a command → docker run nginx
2️⃣ The Docker Client sends the request to the Docker Daemon
3️⃣ The Docker Daemon uses the Docker Engine to create the container
4️⃣ The container starts running 🐳
Simple flow: Client → Daemon → Engine → Container

💡 TL;DR
Docker Client → You type commands
Docker Daemon → Executes the work
Docker Engine → Core system managing containers

You give the order. Docker quietly builds a tiny universe. And suddenly your laptop is running 15 containers and a dream.

#DevOps #Git #VersionControl #SoftwareEngineering #LearningInPublic #DeveloperTools #TechTips #Docker #TrainWithShubham
🐳 Docker exec: How to Get Inside a Running Container

When you have a container running, one of the most common questions is: how do I actually get inside it? That's exactly what the docker exec command is for.

With docker exec, you can run commands inside a running container or open an interactive shell to inspect files, debug issues, or explore the environment. In this beginner-friendly tutorial, we break the command down step by step so it's easy to understand and practice.

Inside the guide you'll learn:
• What docker exec actually does
• The difference between docker run and docker exec
• Why -it is used for interactive shells
• How to open a shell inside a container using bash or sh
• Running quick commands inside containers without entering a shell

📖 Read the full tutorial: https://lnkd.in/gKAQgNzu

If you're learning Docker, the best way to understand commands like this is by trying them yourself. You can register on Docker HOL and explore hands-on Docker labs designed for students, beginners, and developers.
✔ Practice real Docker commands
✔ Interactive learning labs
✔ Step-by-step container exercises

🔗 Start learning Docker: https://dockerhol.com

#Docker #DockerExec #DevOps #LearnDocker #Containerization #DockerTutorial
Day 12/30 – Docker Learning Series: Docker exec and Interactive Containers

Today I explored how to interact with running containers, an essential skill for debugging and managing applications in Docker. Running a container is not always enough. In real-world scenarios, we often need to go inside a container to inspect files, check processes, or troubleshoot issues.

---

What is docker exec?
The docker exec command is used to run commands inside a running container.
Basic syntax:
docker exec <container_id> <command>

---

Open an Interactive Terminal Inside a Container
docker exec -it <container_id> /bin/bash
Explanation:
-i → Interactive mode (keeps STDIN open)
-t → Allocates a terminal
/bin/bash → Opens a shell inside the container
If bash is not available (as in Alpine images), use:
docker exec -it <container_id> /bin/sh

---

Example
Run an Nginx container:
docker run -d --name mynginx nginx
Enter the container:
docker exec -it mynginx /bin/bash
Now you are inside the container and can run Linux commands.

---

Run One-Time Commands Inside a Container
docker exec mynginx ls /usr/share/nginx/html
This runs a command without opening a full terminal.

---

What are Interactive Containers?
Interactive containers allow you to interact directly with the container's shell.
Example:
docker run -it ubuntu /bin/bash
This starts a container and immediately opens a terminal.

---

Exit from a Container
Type:
exit
This closes the shell session. (Note: exiting a docker exec shell leaves the container running; exiting the shell of a container started with docker run -it stops it, because the shell is its main process.)

---

Key Takeaways
• docker exec gives you access to running containers
• Useful for debugging and inspecting applications
• Interactive mode helps simulate real server environments
• An essential skill for troubleshooting in DevOps
Being able to enter and inspect containers is critical when working with production systems.

---

Next: Dockerfile Introduction and Writing Your First Dockerfile

#Docker #DevOps #Containerization #CloudComputing #CICD #Infrastructure #SRE #LearningInPublic #TechLearning #NetworkToDevOps
Day 15 – Docker Images & Dockerfile Basics

Day 15 of my 30-day DevOps learning journey. Today I learned about Docker images and the Dockerfile, which are used to build containers in Docker.

What is a Docker Image?
A Docker image is a read-only template that contains everything needed to run an application:
• Application code
• Runtime environment
• Libraries and dependencies
• System tools
When we run an image, it creates a Docker container.

What is a Dockerfile?
A Dockerfile is a text file that contains the instructions used to build a Docker image. It defines:
• The base image
• Application files
• Dependencies
• The command to run the application

Example Dockerfile
FROM node:18
WORKDIR /app
COPY . .
RUN npm install
CMD ["node", "app.js"]

Explanation:
FROM → Base image
WORKDIR → Working directory inside the container
COPY → Copy project files into the container
RUN → Install dependencies
CMD → Run the application

Build a Docker image:
docker build -t myapp .

Run the container:
docker run -d -p 3000:3000 myapp

Docker images and Dockerfiles make applications portable, consistent, and easy to deploy, which is why they are widely used in DevOps and CI/CD pipelines.

Tomorrow: Docker Volumes & Data Persistence

Do follow me for more content on DevOps. Please see my GitHub account: https://lnkd.in/gXTYxXXm

#DevOps #Docker #Containers #CICD #CloudComputing #AWS #Jenkins #Kubernetes #Linux #DevOpsEngineer #TechLearning
Shubham Londhe TrainWithShubham Abhishek Veeramalla
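One common refinement worth knowing on top of a Dockerfile like the one in this post: copying the package manifests before the rest of the source lets Docker cache the npm install layer, so dependencies only reinstall when package.json changes. This is a widely used layer-caching pattern, not something from the original post:

```dockerfile
FROM node:18
WORKDIR /app
# Copy manifests first so the npm install layer is cached across code-only changes
COPY package*.json ./
RUN npm install
# Now copy the application source; edits here don't invalidate the install layer above
COPY . .
EXPOSE 3000
CMD ["node", "app.js"]
```

With COPY . . placed before RUN npm install, any source edit busts the cache and reinstalls everything; reordering avoids that on most rebuilds.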
🚀 Hands-on with Jenkins CI Pipelines this week!

Spent some time building and debugging pipelines while learning how CI systems actually work behind the scenes. Two small projects, but a lot of practical learning.

🔹 Project 1: My First Jenkins Pipeline
Started with a simple pipeline to understand the basics of how Jenkins executes tasks.
What I explored:
✅ Jenkins pipeline structure (pipeline → stages → steps)
✅ Running shell commands through Jenkins
✅ Debugging builds using Console Output
Even a basic pipeline helps you understand how CI servers automate workflows.

🔹 Project 2: Multi-Stage, Multi-Agent Pipeline
The next step was implementing a more realistic CI pipeline using Docker agents.
Pipeline stages:
⚙️ Back-end stage → runs inside a Maven Docker container
🎨 Front-end stage → runs inside a Node Docker container
Pipeline architecture:
GitHub Repository → Jenkins → Docker Container (Maven) → Docker Container (Node)
Each stage spins up its own container environment, executes commands, and shuts down automatically. This ensures:
🔹 Isolated builds
🔹 No dependency conflicts
🔹 Reproducible environments

💡 Biggest takeaway: most of the learning came from debugging failures: missing tools, plugin issues, path problems, Docker permissions, etc. That's where the real DevOps understanding develops.

🛠 Tech Stack Used
⚡ Jenkins
⚡ Git
⚡ Docker
⚡ AWS EC2 (Free Tier)
⚡ Linux

#DevOps #Jenkins #Docker #CICD #CloudComputing #AWS #LearningInPublic
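A declarative Jenkinsfile for the two-stage, per-stage-Docker-agent setup described might be sketched like this. The image tags and build commands are assumptions, not the author's actual pipeline:

```groovy
pipeline {
    // No global agent: each stage declares its own Docker container
    agent none
    stages {
        stage('Back-end') {
            agent { docker { image 'maven:3.9-eclipse-temurin-17' } }
            steps {
                // Runs inside the Maven container, which is removed afterwards
                sh 'mvn -B -DskipTests package'
            }
        }
        stage('Front-end') {
            agent { docker { image 'node:18-alpine' } }
            steps {
                // Runs inside a fresh Node container
                sh 'npm ci && npm run build'
            }
        }
    }
}
```

`agent none` at the top plus a `docker { image ... }` agent per stage is what gives each stage its own isolated, disposable environment (requires the Docker Pipeline plugin on the Jenkins controller).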