I pushed my code, and my Docker image was live on Docker Hub, automatically. No terminal. No manual docker build. No docker push. Just a git push, and GitHub Actions did the rest. That moment hit different. 🚀

This is what Day 45 of #90DaysOfDevOps looked like for me. I built a full CI/CD pipeline for the Flask Task Manager app. But I didn't just make it "work": I made it work smart.

⚙️ Triggers only on push to main
Feature branches? PRs? They build but never push. Only clean, reviewed code makes it to Docker Hub.

🏷️ Two tags on every push
Every image gets tagged as :latest AND :sha-xxxxxxx, so you always know exactly which commit is running in production.

🔐 Secrets, not hardcoded credentials
Docker Hub username and token are stored as GitHub secrets. Nothing sensitive ever touches the codebase.

✅ End-to-end, no manual steps
git push → checkout → docker build → docker push → done.

This is exactly how real teams ship software.

GitHub: https://lnkd.in/gNSri6tZ

#90DaysOfDevOps #DevOpsKaJosh #TrainWithShubham #Docker #GitHubActions #CICD #DevOps #LearningInPublic
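The shape of such a workflow, as a rough sketch (the image name `youruser/flask-task-manager` and the secret names are placeholders; the real pipeline may differ in details):

```yaml
name: build-and-push

on:
  push:
    branches: [main]   # only main triggers a push to Docker Hub
  pull_request:        # PRs and feature branches build, but never push

jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Build image with both tags
        run: |
          # Derive the short-SHA tag, e.g. sha-a1b2c3d
          SHORT_SHA="sha-$(echo "${GITHUB_SHA}" | cut -c1-7)"
          docker build -t youruser/flask-task-manager:latest \
                       -t "youruser/flask-task-manager:${SHORT_SHA}" .

      - name: Push (main only)
        if: github.event_name == 'push' && github.ref == 'refs/heads/main'
        run: |
          echo "${{ secrets.DOCKERHUB_TOKEN }}" | \
            docker login -u "${{ secrets.DOCKERHUB_USERNAME }}" --password-stdin
          docker push --all-tags youruser/flask-task-manager
```

The `if:` guard on the push step is what lets every branch build while only reviewed code on main reaches the registry.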
Automated CI/CD Pipeline with GitHub Actions and Docker
More Relevant Posts
-
Most Go developers are using GitHub Actions wrong, and it's costing them speed, reliability, and clarity. I wrote this to break that pattern.

The problem isn't YAML. It's the way we think about CI/CD.

In this article, I walk through:
– Why most pipelines become messy and fragile
– The mindset shift from "scripts in YAML" → "clean orchestration"
– How to structure Go workflows for speed, simplicity, and production reliability
– What senior engineers do differently when designing CI

Good CI isn't about making builds pass. It's about making systems trustworthy.

If your pipeline feels slow, confusing, or brittle, this is for you.

Read here: https://lnkd.in/dtYNh43T

#golang #githubactions #devops #backend #softwareengineering #cicd
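One common reading of "scripts in YAML" → "clean orchestration" is to keep the workflow a thin shell over commands you can also run locally. A sketch of that idea, assuming the logic lives in a Makefile (the targets are illustrative, not from the article):

```yaml
# Thin workflow: each step is one locally-runnable command.
name: ci
on: [push, pull_request]

jobs:
  ci:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-go@v5
        with:
          go-version: '1.22'
      # All real logic lives in versioned make targets, so `make test`
      # behaves identically in CI and on a laptop.
      - run: make lint
      - run: make test
      - run: make build
```

When the YAML only orchestrates, debugging a red build starts on your own machine instead of in the Actions UI.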
-
Why I choose Docker Hub for deployment instead of GitHub 🚀

When it's time to ship code, every developer faces a choice. Do I pull the source code from GitHub? Do I deploy a Docker image from Docker Hub? For my project, I chose the Docker workflow. Here's why I rely on Docker Desktop and Docker Hub.

The Technical Difference
* GitHub is for storing source code. It has the instructions for your app.
* Docker Hub is for storing container images. It has the environment, like the operating system, libraries, and code, ready to run.

The Breakdown: Docker vs. GitHub for Deployment

Why I Use Docker & Docker Hub:
* I like consistency. By using Docker Desktop to build my image, I know it has every specific library version my app needs.
* This eliminates the problem of "it works on my machine". The image that runs on my laptop is the same one that runs on the server.

However, there is a limitation. Images are larger than code files, so the initial push to Docker Hub takes a bit of bandwidth.

Why I Don't Use GitHub for the Final Deploy:
* GitHub is great for collaboration and version control. It's where my code lives and grows.
* Deploying straight from GitHub means the server has to build the app, including installing dependencies. If a single external library update fails during that build, the whole deployment crashes.

My Deployment Strategy
I use Docker Desktop as my engine. I package everything into a "frozen" container. Pushing this to Docker Hub ensures that:
* Deployment is instant. The server just runs the image. No installing, no compiling.
* Rollbacks are easy. If something goes wrong, I pull an earlier image tag from Docker Hub.
* My infrastructure stays clean. I don't need to install Python, Node, or Java on my server. Just Docker.

I use GitHub to manage my code. I use Docker to manage my runtime. It's the difference between sending someone a recipe and sending them the meal.

Are you Team Docker or Team GitHub?
#Docker #DockerDesktop #DevOps #SoftwareEngineering #DataScience #GitHub #Containerization #TechTalk
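A "frozen" container in the sense above is just an image whose dependencies are resolved at build time, not on the server. A minimal sketch, assuming a Python app (the base image, `requirements.txt`, and `app.py` are illustrative placeholders):

```dockerfile
# Everything resolved at build time: OS, runtime, pinned libraries, code.
FROM python:3.12-slim
WORKDIR /app

# Install pinned dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
CMD ["python", "app.py"]
```

Once built and pushed with a version tag, rollback is just running the previous tag, e.g. `docker run youruser/app:1.3.0` instead of `:1.4.0`; the server itself never installs or compiles anything.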
-
Happy to share my first CI/CD pipeline built with GitHub Actions.

The idea is simple: the pipeline automatically runs tests and publishes a Docker image to Docker Hub every time I push a version tag like v1.0.0.

Here's what's inside:
* Tests run automatically on Node 18 and 20 in parallel
* Dependencies are cached so the pipeline runs faster
* Old pipeline runs get cancelled if a new one starts
* The Docker image gets pushed with 2 tags: the version and latest
* Logs are saved automatically if anything fails
* All credentials stored as GitHub secrets; nothing hardcoded

Small project, but it covers a bunch of real concepts: matrix strategy, caching, custom composite actions, and secrets management.

Repo: https://lnkd.in/dTZYWGkr
Full details and workflow breakdown in the README: https://lnkd.in/dSawvYXc

#GitHub #GitHubActions #DevOps #Docker #CICD #Learning
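The tag trigger, matrix, caching, and run-cancellation pieces fit together in a few lines of workflow YAML. A hedged sketch of the test job (not the actual repo's workflow; names are illustrative):

```yaml
name: release
on:
  push:
    tags: ['v*']   # only version tags like v1.0.0 trigger the pipeline

# Cancel an in-flight run when a newer one starts for the same ref
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node: [18, 20]   # both versions run in parallel
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: ${{ matrix.node }}
          cache: npm     # restores ~/.npm between runs to speed up installs
      - run: npm ci
      - run: npm test
```

A separate job that `needs: test` would then build and push the image with the version and `latest` tags.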
-
devpath-idp update — Phase 6.

For those following along: I've been building an internal developer platform from the ground up and sharing the progress here. Phase 5 was software templates. Phase 6 is where things got interesting.

One of the trickier debugging lessons I've had so far: a workflow completing successfully doesn't mean it actually did what you think it did.

Here's what happened. Phase 6 is about the Backstage scaffolder, the part where a developer fills out a form and it automatically creates a GitHub repo, sets up the structure, and registers the service in the catalog. Self-service provisioning. That's the goal.

I ran the template. The scaffolder showed steps completing. No obvious errors in the UI. It looked like it worked.

Then I checked GitHub. The repo was empty.

Something in the workflow had failed silently. The scaffolder didn't crash; it just didn't finish what it started. The GitHub push step timed out somewhere in the middle, and the UI wasn't loud about it.

What made this tricky was the noise around it. When I restarted the backend during earlier sessions, the browser was throwing auth errors and stale token warnings. Those looked serious. They weren't; they were just the browser catching up after a restart. The real failure was quieter and further downstream.

That's the thing about debugging distributed workflows: the loudest errors are often not the important ones. The important one here was silent — a repo that existed but had nothing in it.

I'm still working through the fix. But the shift in understanding matters: I stopped asking "why is there an error?" and started asking "did this actually complete end to end?" Those are very different questions. And in platform engineering, the second one is usually the right one to ask first.

Progress so far:
✅ Phase 1 — Base platform setup
✅ Phase 2 — GitOps foundation
✅ Phase 3 — Backstage portal setup
✅ Phase 4 — Catalog basics
✅ Phase 5 — Software templates
🔧 Phase 6 — Self-service provisioning (in progress)

More on this when it's resolved. 🔧

#Backstage #PlatformEngineering #DevOps #InternalDeveloperPlatform #GitHub #Debugging #CloudEngineering #devpath
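One cheap way to turn "did this actually complete end to end?" into a check rather than a question: an empty repo is a repo with zero refs, which `git ls-remote` exposes directly. A small illustrative helper (the function name and repo URL are hypothetical, not part of Backstage):

```shell
# check_repo_nonempty: decide whether a freshly scaffolded repo actually
# received a push, based on the output of `git ls-remote <url>`.
# In real use you would capture that output first, e.g.:
#   refs="$(git ls-remote https://github.com/ORG/REPO.git)"
check_repo_nonempty() {
  refs="$1"
  if [ -z "$refs" ]; then
    echo "EMPTY"   # repo exists but has no refs: the push never completed
  else
    echo "OK"
  fi
}

check_repo_nonempty ""                          # prints EMPTY
check_repo_nonempty "abc123 refs/heads/main"    # prints OK
```

Wiring a check like this into the scaffolder template (or a post-run smoke test) catches exactly the silent failure described above.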
-
Have you ever confidently said, "But it works on my machine!" only to watch your code crash on your coworker's laptop? 😅 We've all been there. Conflicting software versions and missing dependencies can turn a great deployment into a total nightmare.

That's exactly why the tech world shifted to Docker and containerization. 🚢 Instead of configuring every laptop and server manually, Docker lets you pack your code, libraries, and settings into one standard, portable "container." If it runs on your machine, it runs everywhere!

To understand Docker, you just need to know its 4 main parts:
1️⃣ Docker Client: The CLI where you type your commands.
2️⃣ Docker Daemon: The background worker that actually builds and runs your containers.
3️⃣ Docker Engine: The core software suite combining the Client, Daemon, and API.
4️⃣ Docker Registry: Think of this as the "GitHub" for Docker. It's where you store and share your images (like Docker Hub)!

Want the full story and a simple breakdown of how all this fits together? Check out my latest blog post here: https://lnkd.in/dS5XXsj6 🔗

How often do you use Docker in your current workflow? Let me know below! 👇

#Docker #Containerization #DevOps #SoftwareEngineering #Coding #TechExplained #WebDevelopment #DeveloperLife
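To make the four parts concrete, here is which component handles each step of an everyday command sequence (an illustrative walkthrough, not output from a real session):

```shell
# Client: the CLI parses this and sends a request to the daemon's API.
# Daemon: sees the image is missing locally, so it contacts the registry.
# Registry (Docker Hub): serves the image layers back to the daemon.
docker pull nginx

# Client → Daemon again; the daemon creates and starts the container
# from the image it now has locally. No registry involved this time.
docker run -d -p 8080:80 nginx

# The daemon reports its running containers back to the client.
docker ps
```

The client never touches images or containers itself; every command is a round trip through the daemon, which is why the daemon must be running for any of these to work.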
-
🚀 Built an End-to-End CI/CD Pipeline using Jenkins & Docker

I recently completed a project where I designed and implemented a fully automated CI/CD pipeline from scratch.

🔧 Tech Stack:
- Git & GitHub
- Jenkins
- Docker
- Node.js
- ngrok (for webhook tunneling)
- Postman (API testing)

⚙️ What the pipeline does, every time I push code to GitHub:
➡️ Jenkins is triggered automatically via webhook
➡️ Latest code is pulled
➡️ Docker image is built
➡️ Old container is stopped
➡️ New container is deployed

🌐 Result: The application updates automatically on localhost without any manual deployment.

💡 Key Learnings:
- Practical understanding of CI/CD pipelines
- Docker containerization & deployment
- Jenkins automation workflows
- Debugging real-world DevOps issues

🔗 GitHub Repository: https://lnkd.in/gJdnMAYT

This project helped me understand how real-world DevOps pipelines work in production environments. Would love feedback and suggestions!

#DevOps #CICD #Docker #Jenkins #GitHub #Automation #CloudComp
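The pull → build → stop → deploy steps map naturally onto a declarative Jenkinsfile. A rough sketch (repo URL, image name, and port are placeholders; the actual pipeline may be structured differently):

```groovy
pipeline {
  agent any
  stages {
    stage('Checkout') {
      steps { git 'https://github.com/user/node-app.git' }
    }
    stage('Build image') {
      steps { sh 'docker build -t my-node-app:latest .' }
    }
    stage('Redeploy') {
      steps {
        // Remove the old container if it exists; `|| true` keeps the
        // stage green on the very first deploy when nothing is running.
        sh 'docker rm -f my-node-app || true'
        sh 'docker run -d --name my-node-app -p 3000:3000 my-node-app:latest'
      }
    }
  }
}
```

With a GitHub webhook pointed at Jenkins (via ngrok when Jenkins runs on localhost), every push walks through these stages without manual intervention.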
-
The first time I pushed to GitHub after implementing GitOps in my home lab, I just sat there watching my terminal. I was not sure it was going to work. Then it did. The change applied itself. No kubectl apply. No manual anything. Flux just picked it up and made it happen.

I had read about GitOps. I had watched videos about GitOps. But seeing your cluster update itself from a Git push is a completely different feeling.

Here is what happened under the hood:
- I pushed an updated manifest to GitHub.
- Flux detected the change within seconds.
- Flux compared what was in Git to what was running in the cluster.
- A new pod spun up with the updated config. The old one terminated cleanly.

I did not touch the cluster once.

That was the moment I understood why teams and enterprises adopt GitOps as an industry standard. The cluster can only ever be in a state that exists in Git. No undocumented changes. No mystery configs. Every change has a commit. Every commit has a history. Every history has a reason.

Once you work this way it is very hard to go back to doing things manually.

Have you had a moment where a tool completely changed how you think about your workflow? 👇

Follow me, I am documenting everything I build and learn in my home lab.

#GitOps #Kubernetes #DevOps #FluxCD #CloudNative
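The "under the hood" loop is configured with two small Flux objects: one telling Flux which repo to watch, one telling it which path to reconcile. A minimal sketch (repo URL, names, and paths are placeholders for whatever the home lab actually uses):

```yaml
# Flux polls this Git repository...
apiVersion: source.toolkit.fluxcd.io/v1
kind: GitRepository
metadata:
  name: homelab
  namespace: flux-system
spec:
  interval: 1m              # check for new commits every minute
  url: https://github.com/user/homelab
  ref:
    branch: main
---
# ...and continuously reconciles these manifests against the cluster.
apiVersion: kustomize.toolkit.fluxcd.io/v1
kind: Kustomization
metadata:
  name: homelab
  namespace: flux-system
spec:
  interval: 10m
  sourceRef:
    kind: GitRepository
    name: homelab
  path: ./clusters/home
  prune: true               # delete cluster objects removed from Git
```

`prune: true` is what enforces "the cluster can only ever be in a state that exists in Git": anything deleted from the repo gets deleted from the cluster too.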
-
🐳 Top Docker Commands Every Developer Should Know

If you're working with Docker, mastering a few core commands can make your workflow faster, cleaner, and more efficient. Here are some essential Docker commands every developer should know:

🔹 1. Check Docker version: docker --version
🔹 2. Pull an image from Docker Hub: docker pull nginx
🔹 3. List images: docker images
🔹 4. Run a container: docker run -d -p 3000:3000 node-app
🔹 5. List running containers: docker ps
🔹 6. List all containers (including stopped): docker ps -a
🔹 7. Stop a container: docker stop <container_id>
🔹 8. Remove a container: docker rm <container_id>
🔹 9. Remove an image: docker rmi <image_id>
🔹 10. View logs: docker logs <container_id>
🔹 11. Execute a command inside a container: docker exec -it <container_id> bash
🔹 12. Build an image: docker build -t my-app .
🔹 13. Docker Compose up: docker-compose up -d
🔹 14. Docker Compose down: docker-compose down

💡 Pro Tip: You don't need to memorize everything, but knowing these commands covers 80% of real-world Docker use cases. Mastering the Docker CLI is a big step toward becoming a DevOps-ready developer 🚀

#Docker #DevOps #Containerization #WebDevelopment #CloudComputing #CICD #SoftwareEngineering #BackendDevelopment #TechSkills #Programming
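The Compose commands at the end (13 and 14) assume a `docker-compose.yml` describing the whole stack. A minimal illustrative example (service names, images, and ports are placeholders):

```yaml
# docker-compose.yml: one file describing the app and its database, so
# `docker-compose up -d` / `docker-compose down` replace several separate
# docker run / stop / rm commands.
services:
  web:
    build: .            # builds from the Dockerfile in this directory
    ports:
      - "3000:3000"
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example   # demo value only; use secrets in practice
```

With this file, `docker-compose up -d` builds the web image, pulls Postgres, and starts both on a shared network; `docker-compose down` tears it all back down.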
-
I just built my first CI/CD pipeline from scratch. It broke 10+ times before it worked. Here's what happened.

As part of my DevOps bootcamp (TechWorld with Nana), I set up a complete Jenkins pipeline for a Node.js application. The goal: every code change should be automatically tested, built into a Docker image, pushed to Docker Hub, and versioned, with no manual steps.

The setup:
• Ubuntu VM running on VirtualBox
• Jenkins running as a Docker container with the host's Docker socket mounted
• Jenkinsfile defining 5 pipeline stages: version bump, test, build, push, commit
• GitLab for source code, Docker Hub for images

What went wrong (and what I learned):

First, GitLab authentication broke the pipeline. My password had special characters that mangled the git URL. The fix: use a Personal Access Token instead. Lesson: never embed passwords with special characters in CI/CD URLs. Tokens are safer and cleaner.

Then, Jenkins couldn't build Docker images: "Permission denied" on the Docker socket. Even though the socket was mounted, Jenkins runs as a non-root user. The fix: chmod 666 on the socket inside the container. Lesson: mounting a socket isn't enough; the permissions have to match.

The result: a fully automated pipeline that increments the app version, runs tests (the pipeline stops if tests fail), builds a Docker image with the new version tag, pushes it to Docker Hub, and commits the version bump back to GitLab. One push triggers all of this. That's CI/CD.

Biggest takeaway: CI/CD isn't just about writing a Jenkinsfile. It's about understanding how Jenkins, Docker, Git, and your application all connect. You're the bridge between all these tools.

Module 8 progressing. Still building, still breaking things, still learning.

#DevOps #Jenkins #CICD #Docker #Pipeline #GitLab #Automation #LearningInPublic #CareerChange #TechWorldWithNana
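The socket-permission fix above looks roughly like this as commands (container and volume names are illustrative; the exact invocation in the course may differ):

```shell
# Jenkins in Docker, with the host's Docker socket mounted (lab setup)
docker run -d --name jenkins \
  -p 8080:8080 \
  -v jenkins_home:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  jenkins/jenkins:lts

# Mounting alone is not enough: the non-root jenkins user inside the
# container can't write to the socket. Quick lab fix, run as root:
docker exec -u 0 jenkins chmod 666 /var/run/docker.sock
```

Worth noting: chmod 666 opens the socket to every user in the container, which is fine for a learning VM but loose for anything shared; adding the jenkins user to a group that owns the socket is the tighter variant of the same fix.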
-
🚀 Getting Started with Docker (Beginner Friendly)

Ever faced the classic problem: 👉 "It works on my machine 😅"? That's where Docker comes in! 🐳

🔹 Docker allows you to package your application with all its dependencies
🔹 Run it anywhere – no environment issues
🔹 Lightweight alternative to virtual machines

💡 Basic Commands Every Beginner Should Know:
✔ docker --version
✔ docker pull nginx
✔ docker run -d -p 8080:80 nginx
✔ docker ps
✔ docker stop <container_id>

📦 In simple words: Docker = your app + dependencies + environment → packed in one container.

As a developer, learning Docker is a must-have skill in 2026 💻 I've just started exploring it and it already feels powerful 🔥

👉 Are you using Docker in your projects?

#Docker #DevOps #BackendDevelopment #JavaDeveloper #TechLearning #100DaysOfCode #EngineeringStudent
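"Your app + dependencies + environment in one container" starts with a Dockerfile. A minimal sketch for a Java app (base image, jar path, and port are illustrative placeholders, not from the post):

```dockerfile
# Everything the app needs -- JRE, dependencies, code -- in one image.
FROM eclipse-temurin:21-jre
WORKDIR /app

# app.jar is a placeholder for your built artifact (e.g. from `mvn package`)
COPY target/app.jar app.jar

EXPOSE 8080
CMD ["java", "-jar", "app.jar"]
```

Build it with `docker build -t my-app .` and run it with `docker run -d -p 8080:8080 my-app`; the same image then runs identically on any machine with Docker installed.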