🚀 Day 7/100 – Shell Scripting Basics

Tired of running the same commands again and again? 😅
👉 That’s exactly why Shell Scripting exists.

🔍 What is Shell Scripting?
Shell scripting is writing a series of Linux commands in a file that can be executed automatically.
👉 Instead of doing tasks manually, you automate them with scripts ⚡

⚙️ Basic Example
#!/bin/bash
echo "Starting deployment..."
git pull origin main
docker build -t my-app .
docker run -d -p 80:80 my-app
echo "Deployment complete 🚀"
👉 Save → Run → Done ✅

💡 Why Shell Scripting is Important in DevOps
✅ Automate repetitive tasks
✅ Reduce human errors
✅ Speed up deployments
✅ Glue different tools together

🛠️ Must-Know Concepts
🔹 Variables
name="DevOps"
echo "Hello $name"
🔹 Conditionals
if [ -f "file.txt" ]; then
  echo "File exists"
fi
🔹 Loops
for i in 1 2 3
do
  echo "Run $i"
done

⚠️ Common Mistakes
❌ Missing execution permission 👉 chmod +x script.sh
❌ Wrong shebang 👉 Always use #!/bin/bash
❌ Not handling errors 👉 Can break automation silently

📌 Real-World Use Case
Deploying an app:
- Pull latest code
- Build Docker image
- Run container
👉 All in one script

📌 Key Takeaway
👉 Shell scripting = automation superpower for DevOps
If you can script it… you don’t have to repeat it 🚀

💬 What’s the most useful script you’ve written so far?

#DevOps #ShellScripting #Linux #Automation #100DaysOfDevOps #LearningInPublic
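One way to address the last common mistake (unhandled errors) is to make the script fail fast. A minimal sketch that extends the post's own example; my-app and port 80 are just the post's placeholders, not a real project:

#!/bin/bash
set -euo pipefail                                          # stop on any error, unset variable, or failed pipe stage
trap 'echo "Deployment failed at line $LINENO" >&2' ERR    # report where things broke

echo "Starting deployment..."
git pull origin main
docker build -t my-app .
docker run -d -p 80:80 my-app
echo "Deployment complete 🚀"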
⚡ Day 16 – Shell Scripting Fundamentals in DevOps

Today’s focus was on making my scripts more interactive and reliable. I worked through:
🔹 User input with read — capturing numbers, strings, and service names directly from the terminal.
🔹 Conditional logic (if-else) — comparing numbers, checking strings, and handling multiple cases with elif.
🔹 File checks (-f) — writing a script that asks for a filename and confirms if it exists.
🔹 Service checks (systemctl) — building a script that asks for a service name and reports if it’s active or not.
🔹 Troubleshooting syntax errors — learning why spaces around [ ] and using the right operators (= for strings, -eq for numbers) are critical in Bash.

💡 Key takeaway: Shell scripting is unforgiving about syntax, but that’s what makes it powerful. Every space, operator, and quote matters — and mastering these details builds the foundation for automation in DevOps.

#90DaysOfDevOps #Day15 #DevOpsKaJosh #TrainWithShubham #Linux #ShellScripting #LearnInPublic
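A minimal sketch of the interactive checks described above (the prompts and messages are my own illustration, not the original Day 16 scripts):

#!/bin/bash

read -rp "Enter a filename: " filename
if [ -f "$filename" ]; then              # note the required spaces around [ ]
    echo "$filename exists"
else
    echo "$filename does not exist"
fi

read -rp "Enter a service name: " svc
if systemctl is-active --quiet "$svc"; then
    echo "$svc is active"
else
    echo "$svc is not active"
fi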
Bash Scripting for DevOps — Part 11/?

Till now, I was passing values using environment variables. That worked. But I realized something.

Sometimes, I don’t want to set variables separately. I just want to pass values directly when running the script.

That’s where arguments come in.

In Bash, we can pass values like this:
./deploy.sh staging

Inside the script, we can access it using:
echo "Deploying to $1 environment"

Here, $1 means the first argument. So if I run:
./deploy.sh prod

It becomes:
Deploying to prod environment

This makes scripts much more flexible. Instead of editing the script or setting variables, I can just pass what I need at runtime.

This is used a lot in real DevOps workflows:
• passing environment names
• passing versions or tags
• controlling script behavior dynamically

Small change. But now the script feels more like a real tool, not just a fixed set of commands.

#DevOps #BashScripting #Linux #Automation #DevOpsJourney #LearningInPublic
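As a small extension of the same idea (my sketch, not part of the original post), a guard for the missing-argument case keeps the script usable as a real tool; deploy.sh and the environment value are placeholders:

#!/bin/bash
# deploy.sh, usage: ./deploy.sh <environment>

ENVIRONMENT="${1:-}"                       # first positional argument, empty if not given

if [ -z "$ENVIRONMENT" ]; then
    echo "Usage: $0 <environment>" >&2     # fail fast instead of deploying to nowhere
    exit 1
fi

echo "Deploying to $ENVIRONMENT environment"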
𝐇𝐞𝐲 𝐞𝐯𝐞𝐫𝐲𝐨𝐧𝐞! 👋

After thoroughly covering the Linux essentials over the last few months, I’ve been diving deep into the world of Bash Scripting. I recently revisited a Git repository I started a while back. While it initially only housed basic commands, I’ve shifted my focus to building functional automation scripts.

𝐓𝐡𝐢𝐬 𝐰𝐞𝐞𝐤’𝐬 𝐡𝐢𝐠𝐡𝐥𝐢𝐠𝐡𝐭: 𝐀 𝐃𝐢𝐬𝐤 𝐂𝐡𝐞𝐜𝐤𝐮𝐩 𝐀𝐮𝐭𝐨𝐦𝐚𝐭𝐢𝐨𝐧 𝐒𝐜𝐫𝐢𝐩𝐭. It monitors disk space and alerts when thresholds are met. While writing it, I realized that running a command is only half the battle—the real magic lies in Command Chaining and Output Manipulation.

💡 𝙆𝙚𝙮 𝙏𝙖𝙠𝙚𝙖𝙬𝙖𝙮
If you want to master Bash, you have to master text manipulation. Tools like sed, grep, awk, and cut are essential. They allow you to take raw system data and mold it into exactly what your script needs to make a decision.

📢 𝐒𝐭𝐚𝐫𝐭𝐢𝐧𝐠 𝐚 𝐍𝐞𝐰 𝐒𝐞𝐫𝐢𝐞𝐬 𝐨𝐧 𝐃𝐞𝐯.𝐭𝐨!
I’m excited to announce that I’m starting a series on Dev.to! I’ll be sharing:
* Detailed insights into the scripts I’m writing.
* The logic behind the automation.
* Resources I’m using to learn.
* Regular updates to my GitHub repository.

If you’re also on your Bash journey or looking to dive into DevOps, let's connect! Check out my first few updates and let’s automate the world together. 🚀

Dev.to : https://lnkd.in/gKvsaKUZ
Github: https://lnkd.in/dhfKuw9s

#Linux #BashScripting #DevOps #Automation #LearningJourney #DevTo #OpenSource #TechCommunity
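A minimal sketch of this kind of disk checkup, assuming a simple percentage threshold on the root filesystem (the threshold, filesystem, and plain echo alert are my placeholders, not the original script):

#!/bin/bash
THRESHOLD=80                                              # alert when usage reaches this percent

# df output -> second line -> usage column -> strip the % sign
usage=$(df -P / | awk 'NR==2 {print $5}' | cut -d'%' -f1)

if [ "$usage" -ge "$THRESHOLD" ]; then
    echo "WARNING: / is at ${usage}% (threshold ${THRESHOLD}%)"
else
    echo "OK: / is at ${usage}%"
fi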
🚀 End-to-End Deployment Automation with Bash & Docker

I recently worked on a hands-on DevOps challenge where I automated the deployment of a Django Notes application using a Bash script — and it was a great learning experience!

🎯 Objective: Build a complete deployment pipeline using simple Bash scripting.

🛠️ What I implemented:
🔹 Cloned the application repository
🔹 Installed all required dependencies
🔹 Built a Docker image for the application
🔹 Pushed the image to Docker Hub
🔹 Deployed the container on a VM
🔹 Verified the deployment by checking container status

💡 Key Takeaways:
✔️ Learned how to structure Bash scripts for real-world automation
✔️ Improved understanding of Docker workflows (build, tag, push, run)
✔️ Gained practical exposure to deployment on a virtual machine
✔️ Understood how small scripts can simplify repetitive DevOps tasks

Thanks Shubham Londhe for the repo "django-notes-app". I was able to deploy this application on my ubuntu-server VM. It was a good learning experience. 😊

#DevOps #BashScripting #Docker #Automation #CloudComputing #Linux
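A rough sketch of that clone, build, push, run, verify flow (the repository URL, image tag, and port are placeholders I chose, not the exact values from the challenge):

#!/bin/bash
set -euo pipefail

REPO_URL="https://github.com/example/django-notes-app.git"   # placeholder URL
IMAGE="your-dockerhub-user/notes-app:latest"                 # placeholder image tag

git clone "$REPO_URL" notes-app
cd notes-app

docker build -t "$IMAGE" .            # build from the repo's Dockerfile
docker push "$IMAGE"                  # assumes you are already logged in to Docker Hub
docker run -d --name notes-app -p 8000:8000 "$IMAGE"

docker ps --filter "name=notes-app"   # verify the container is up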
💻 Exploring Shell Scripting: Small Commands, Big Impact

Another step forward in my DevOps journey 🚀

Shell scripting is more than just writing commands — it’s about:
✔️ Automating repetitive tasks
✔️ Improving efficiency
✔️ Building scalable workflows

🔑 Key areas I worked on:
• Bash scripting & execution
• Variables and arguments
• Control structures (if, for, while)
• Automating daily tasks

💡 Why it matters? Because automation is the backbone of DevOps — saving time, reducing errors, and ensuring consistency.

“The best way to predict the future is to automate it.”

#ShellScripting #DevOps #Automation #Linux #ContinuousLearn
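Since while is the one control structure not shown elsewhere on this page, here is a tiny illustrative retry loop (entirely my example; the health-check URL is a placeholder):

#!/bin/bash
# Retry a health check up to 5 times, waiting 2 seconds between attempts.
attempt=1
while [ "$attempt" -le 5 ]; do
    if curl -fsS http://localhost:8080/health > /dev/null; then
        echo "Service is healthy"
        break
    fi
    echo "Attempt $attempt failed, retrying..."
    attempt=$((attempt + 1))
    sleep 2
done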
🚀 Day 19 of #90DaysOfDevOps journey with Shubham Londhe

Today, I worked on a practical DevOps-style project focused on automation and system maintenance using Bash scripting. Instead of manually managing logs and backups, I built a system that handles everything automatically.

🔧 What I built:

📁 Log Rotation Script
- Compresses logs older than 7 days
- Deletes archives older than 30 days

💾 Backup Script
- Creates timestamped backups
- Verifies backup success using size output
- Maintains a 14-day retention policy

⏱ Crontab Automation
- Log rotation runs daily
- Backups run weekly
- Health checks run every 5 minutes

🧩 Maintenance Wrapper Script
- Combines all tasks into one workflow
- Logs everything for easier debugging

📚 Key Learnings:
- Importance of validation to avoid script failures
- Using "find -mtime" for automated cleanup
- Redirecting logs ("2>&1") for better troubleshooting
- Understanding the power of cron jobs in real-world automation

This project gave me a deeper understanding of how real systems handle logs, backups, and reliability without manual effort. Step by step, I’m becoming more confident in Linux, Bash, and DevOps fundamentals 💪

#90DaysOfDevOps #DevOpsKaJosh #Linux #BashScripting #Automation #Crontab #LearningJourney #TrainWithShubham
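A rough sketch of the rotation and scheduling pattern described above (the log directory, script paths, and schedules are illustrative, not the original project's):

#!/bin/bash
LOG_DIR="/var/log/myapp"                                  # placeholder log directory

# Compress plain logs older than 7 days, then delete archives older than 30 days
find "$LOG_DIR" -name "*.log" -mtime +7 -exec gzip {} \;
find "$LOG_DIR" -name "*.gz" -mtime +30 -delete

# Example crontab entries (edit with: crontab -e), each redirecting stdout and stderr:
# 0 2 * * *    /opt/scripts/log_rotate.sh  >> /var/log/maintenance.log 2>&1   # daily
# 0 3 * * 0    /opt/scripts/backup.sh      >> /var/log/maintenance.log 2>&1   # weekly
# */5 * * * *  /opt/scripts/healthcheck.sh >> /var/log/maintenance.log 2>&1   # every 5 min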
Day 20 of learning and practicing DevOps 🔁

Worked on a scripting project — building a log analyzer and report generator.

Worked on:
• Reading and validating log file input
• Counting errors (ERROR, Failed) using grep
• Extracting critical events with line numbers
• Finding top 5 recurring errors using awk, sort, uniq
• Generating a structured report file
• Archiving processed logs automatically

Important part: Instead of manually reading logs, I built a script that analyzes everything and gives a summary in seconds.

Learning today --> logs tell the story; the value is in turning raw logs into useful insights.

Here are my notes: https://lnkd.in/ga8xUT6U 📍

#DevOps #Linux #ShellScripting #Automation #LogAnalysis #LearningInPublic #90DaysOfDevOps #TrainWithShubham
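A small sketch of the counting and top-5 steps (the log format, in particular that the first two fields are a timestamp, is my assumption rather than the original script's):

#!/bin/bash
LOGFILE="${1:?Usage: $0 <logfile>}"                    # validate that an argument was given
[ -f "$LOGFILE" ] || { echo "No such file: $LOGFILE" >&2; exit 1; }

# Count error lines
echo "Total errors: $(grep -cE 'ERROR|Failed' "$LOGFILE")"

# Critical events with their line numbers
grep -n "CRITICAL" "$LOGFILE"

# Top 5 recurring error messages (assumes fields 1 and 2 are date and time)
grep "ERROR" "$LOGFILE" | awk '{ $1=""; $2=""; print }' | sort | uniq -c | sort -nr | head -5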
I've been building BashX: a Bash framework that brings real structure to shell scripting.

If you've ever worked on a project where shell scripts grew into an unmaintainable mess, this was built for you.

What BashX does:
- Convention-based loader: drop a .xsh file, get an @-prefixed function automatically - no registration, no boilerplate
- Lazy loading of 120+ built-in utilities with zero startup cost
- Auto-generated CLI help from ## doc comments in source files
- A built-in testing framework (@@assert.*) - no external dependencies
- Lifecycle event system: ready → start → error → finish
- One-command project scaffolding
- Cross-platform: Linux, macOS, Android Shell

No compile step. No runtime dependencies. Just Bash, structured.

# Bootstrap a new project in one command:
./bashx _bashx init project v1.0.0 my-app

# Your action is live immediately:
./my-app deploy --env production

Documentation, tests, help output — everything lives alongside the code, in the code.

I built this because shell scripting deserves the same engineering standards we apply to everything else.

👉 Try it out: https://lnkd.in/erBsd5xF

If you write Bash for DevOps, automation, or CLI tooling — I'd love your feedback.

---

Drop a ⭐ if you find it useful, or open an issue if something breaks. Both are equally welcome!
Ansible Conditionals: Using "when" Statements to Control Task Execution 🎯

Today's task introduced one of Ansible's most important control flow features, the when conditional statement. The goal was to copy different files to different app servers from a single playbook that runs against all hosts simultaneously, with each task only executing on its intended server and being skipped on the others.

This is where Ansible starts to feel less like a simple automation tool and more like a proper programming language. The ability to gather facts about remote systems and then make decisions based on those facts is what allows a single playbook to behave differently across an entire fleet of servers without duplicating code.

The key variable today was ansible_nodename, a fact that Ansible automatically collects from each remote host during the fact-gathering phase. By comparing this value in a when condition, each task knew exactly which server it was meant to run on and quietly skipped the rest. Seeing those skipping messages in the output was actually satisfying because it confirmed the conditionals were working exactly as intended.

What I reinforced today:
- Using when conditionals to control which tasks run on which hosts in a multi-server playbook
- The difference between ansible_nodename and inventory_hostname, and when to use each
- How Ansible's fact-gathering phase powers conditional logic: no facts means no conditionals
- Using ansible -m setup -a "filter=ansible_nodename" to verify the exact value Ansible sees before writing conditions
- Understanding that skipping in the play output is not an error; it is confirmation that conditionals are working correctly
- Always running --check mode first when working with file copy tasks that have specific permission and ownership requirements

💡 Writing one playbook that intelligently handles multiple servers differently is far more scalable than writing separate playbooks for each server. Conditionals are what make that possible.

Still learning. Still building. Still pushing forward! 🚀🎯

#DevOps #Ansible #Automation #Linux #AnsibleConditionals #ConfigurationManagement #InfrastructureAsCode #ContinuousLearning #TechJourney #Day93 #100DaysOfDevOps
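For reference, the verification workflow mentioned above looks roughly like this from the control node (the all host pattern and playbook name are placeholders of mine):

# Check the exact nodename fact each host reports before writing when: conditions
ansible all -m setup -a "filter=ansible_nodename"

# Dry-run first to see which tasks would run and which would be skipped
ansible-playbook playbook.yml --check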
🚀 Building a Mini CI Pipeline Using Bash (From Scratch)

Over the past few days, I’ve been deepening my understanding of Linux by moving beyond isolated commands and focusing on practical automation. What started as a simple exercise—parsing log files with grep—evolved into building a structured, pipeline-like workflow using Bash.

Here’s what I implemented:
🔹 Input validation to ensure robustness (handling missing/invalid directories)
🔹 Dynamic log scanning across multiple .log files
🔹 Error and warning aggregation using grep and wc
🔹 Identification of recurring issues using: cut → sort → uniq → sort -nr
🔹 Stage-based execution to simulate pipeline behavior: [STAGE 1] → [STAGE 4]
🔹 Status classification (OK / WARNING / CRITICAL) based on thresholds
🔹 Exit codes (0, 1, 2) to represent machine-level decisions
🔹 Dual output handling using tee (terminal + report file)

💡 A key insight from this exercise:
There’s a fundamental distinction between:
• Human-readable output → “STATUS: WARNING”
• Machine-readable signals → exit 1
This is precisely how CI/CD systems determine whether to proceed, warn, or halt execution.

⚠️ One subtle but important lesson:
Using tee -a without resetting the file led to duplicated reports — a small oversight, but a valuable reminder of how state management impacts automation reliability.

What this project reinforced for me is that:
DevOps is not about memorizing tools. It’s about designing workflows, enforcing logic, and enabling systems to make decisions autonomously.

Next, I’ll be extending this into a basic alerting system, moving closer to real-world monitoring scenarios.

If you’re on a similar path, I’d strongly recommend: Don’t just learn commands — engineer processes with them.

#DevOps #Linux #Bash #Automation #CICD #Scripting #LearningInPublic
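A condensed sketch of the status and exit-code pattern described above (directory handling, thresholds, and the report path are my own placeholders; note that tee is used without -a, so the report file is reset on every run):

#!/bin/bash
LOG_DIR="${1:?Usage: $0 <log-directory>}"
[ -d "$LOG_DIR" ] || { echo "Invalid directory: $LOG_DIR" >&2; exit 2; }

REPORT="report.txt"

echo "[STAGE 1] Scanning logs..."
errors=$(grep -h "ERROR" "$LOG_DIR"/*.log 2>/dev/null | wc -l)
warnings=$(grep -h "WARNING" "$LOG_DIR"/*.log 2>/dev/null | wc -l)

echo "[STAGE 2] Classifying status..."
if [ "$errors" -gt 10 ]; then
    status="CRITICAL"; code=2
elif [ "$warnings" -gt 20 ]; then
    status="WARNING"; code=1
else
    status="OK"; code=0
fi

# Human-readable report to the terminal AND the file (no -a, so no duplicated reports)
{
    echo "Errors:   $errors"
    echo "Warnings: $warnings"
    echo "STATUS:   $status"
} | tee "$REPORT"

exit "$code"    # machine-readable signal for whatever runs this script next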