Day 19 of #90DaysOfDevOps 💻🔥

Today I built my first real-world automation scripts using Shell Scripting.

✔ Created a log rotation script (cleanup + compression)
✔ Built a server backup script using .tar.gz
✔ Automated tasks using crontab scheduling

💡 Biggest learning: Automation isn’t just writing scripts — it’s about making systems run without manual effort.

⚡ Real-world DevOps use: These concepts are used in log management, server backups, and scheduled maintenance in production systems.

From learning → to building → to automating 🚀

#DevOps #Linux #ShellScripting #Automation #Crontab #90DaysOfDevOps
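A minimal sketch of the log-rotation idea, not the post's actual script: the directory and the 7/30-day windows are example values, and it assumes `find` with `-mtime` plus `gzip`.

```shell
#!/bin/bash
# Log rotation sketch: compress old logs, then delete even older archives.
# LOG_DIR and the day counts are illustrative placeholders.
set -euo pipefail

LOG_DIR="${1:-/tmp/demo_logs}"   # directory holding *.log files
mkdir -p "$LOG_DIR"

# Compress plain logs older than 7 days
find "$LOG_DIR" -name '*.log' -mtime +7 -exec gzip -f {} \;

# Remove compressed archives older than 30 days
find "$LOG_DIR" -name '*.log.gz' -mtime +30 -delete

echo "rotation done: $LOG_DIR"
```

Running the compression and the deletion as two separate `find` passes keeps each retention rule easy to change on its own.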
Automation Scripts with Shell Scripting and Crontab
More Relevant Posts
Day 19 of learning and practicing DevOps 🔁

Today was all about building actual automation scripts.

Worked on:
• Log rotation script to compress and clean old logs
• Backup script to create timestamped archives
• Scheduling jobs using crontab
• Combining everything into a maintenance script

Important part: Understanding why log rotation and backups matter — without them, logs can fill up disk space and break systems.

Learning today --> automation + scheduling. Instead of manually managing logs and backups, scripts + cron can handle everything in the background. This one is actually used in production environments.

Here are my notes: https://lnkd.in/gwQUKK8b 📍

#DevOps #Linux #ShellScripting #Automation #Crontab #LearningInPublic #90DaysOfDevOps #TrainWithShubham
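The timestamped-archive step could look roughly like this (illustrative paths, not the post's script; it assumes `tar` and `date` with the usual options):

```shell
#!/bin/bash
# Timestamped backup sketch. SRC/DEST defaults are demo placeholders.
set -euo pipefail

SRC="${1:-/tmp/demo_src}"        # directory to back up
DEST="${2:-/tmp/demo_backups}"   # where archives land
mkdir -p "$SRC" "$DEST"

STAMP=$(date +%Y%m%d_%H%M%S)
ARCHIVE="$DEST/backup_$STAMP.tar.gz"

# -C makes paths inside the archive relative to the parent of SRC
tar -czf "$ARCHIVE" -C "$(dirname "$SRC")" "$(basename "$SRC")"

# Basic verification: the archive exists and is non-empty
[ -s "$ARCHIVE" ] && echo "backup ok: $ARCHIVE"
```

Embedding the timestamp in the filename is what makes old archives safe to prune later with a simple `find -mtime` rule.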
🚀 Day 19 of #90DaysOfDevOps journey with Shubham Londhe

Today, I worked on a practical DevOps-style project focused on automation and system maintenance using Bash scripting. Instead of manually managing logs and backups, I built a system that handles everything automatically.

🔧 What I built:

📁 Log Rotation Script
- Compresses logs older than 7 days
- Deletes archives older than 30 days

💾 Backup Script
- Creates timestamped backups
- Verifies backup success using size output
- Maintains a 14-day retention policy

⏱ Crontab Automation
- Log rotation runs daily
- Backups run weekly
- Health checks run every 5 minutes

🧩 Maintenance Wrapper Script
- Combines all tasks into one workflow
- Logs everything for easier debugging

📚 Key Learnings:
- Importance of validation to avoid script failures
- Using "find -mtime" for automated cleanup
- Redirecting logs ("2>&1") for better troubleshooting
- Understanding the power of cron jobs in real-world automation

This project gave me a deeper understanding of how real systems handle logs, backups, and reliability without manual effort. Step by step, I’m becoming more confident in Linux, Bash, and DevOps fundamentals 💪

#90DaysOfDevOps #DevOpsKaJosh #Linux #BashScripting #Automation #Crontab #LearningJourney #TrainWithShubham
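The schedule described above could look like this in a crontab (a sketch: the `/opt/scripts/...` paths and log files are hypothetical; edit your own with `crontab -e`). Note the `2>&1` redirection the post mentions, which sends stderr to the same log as stdout.

```crontab
# Log rotation: daily at 01:00
0 1 * * * /opt/scripts/log_rotate.sh >> /var/log/maintenance.log 2>&1

# Backup: weekly, Sunday at 02:30
30 2 * * 0 /opt/scripts/backup.sh >> /var/log/maintenance.log 2>&1

# Health check: every 5 minutes
*/5 * * * * /opt/scripts/health_check.sh >> /var/log/health.log 2>&1
```

The five fields are minute, hour, day-of-month, month, and day-of-week; `*/5` in the minute field means "every 5 minutes".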
Day 18 of #90DaysOfDevOps 💻🔥

Today I leveled up my Shell Scripting skills by learning how to write cleaner and reusable scripts.

✔ Created and used functions
✔ Worked with return values & local variables
✔ Learned strict mode (set -euo pipefail) for safer scripts
✔ Built intermediate scripts for real scenarios

💡 Biggest learning: Using strict mode helps catch errors early and makes scripts production-ready.

⚡ Real-world DevOps use: Functions + strict scripting are used in automation scripts, CI/CD pipelines, and system monitoring tools.

Slowly moving from basic commands to real automation 🚀

#DevOps #Linux #ShellScripting #Automation #90DaysOfDevOps
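A small sketch combining those ideas: strict mode plus a function with local variables whose exit status acts as its return value. The disk-usage check and the 95% threshold are invented for the demo, not taken from the post.

```shell
#!/bin/bash
# Strict mode: exit on errors (-e), on unset variables (-u),
# and on failures anywhere in a pipeline (pipefail).
set -euo pipefail

# A function with local variables; its exit status is its "return value".
disk_usage_ok() {
  local threshold="$1"
  local used
  used=$(df -P / | awk 'NR==2 { gsub("%", ""); print $5 }')
  [ "$used" -lt "$threshold" ]
}

if disk_usage_ok 95; then
  echo "disk usage OK"
else
  echo "disk usage high"
fi
```

Because the function's last command is a test, callers can use it directly in `if` conditions, which is the idiomatic way to "return" a boolean from Bash.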
Bash Scripting for DevOps — Part 11/?

Till now, I was passing values using environment variables. That worked. But I realized something. Sometimes, I don’t want to set variables separately. I just want to pass values directly when running the script.

That’s where arguments come in. In Bash, we can pass values like this:

./deploy.sh staging

Inside the script, we can access it using:

echo "Deploying to $1 environment"

Here, $1 means the first argument. So if I run:

./deploy.sh prod

It becomes:

Deploying to prod environment

This makes scripts much more flexible. Instead of editing the script or setting variables, I can just pass what I need at runtime.

This is used a lot in real DevOps workflows:
• passing environment names
• passing versions or tags
• controlling script behavior dynamically

Small change. But now the script feels more like a real tool, not just a fixed set of commands.

#DevOps #BashScripting #Linux #Automation #DevOpsJourney #LearningInPublic
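The same idea, wrapped in a function so it is easy to demo in one file (in a real script, `$1` is the script's own first argument; the optional version parameter is my addition, not from the post):

```shell
#!/bin/bash
set -euo pipefail

# ${1:?...} aborts with a usage message if the argument is missing;
# ${2:-latest} supplies a default when the second argument is omitted.
deploy() {
  local environment="${1:?usage: deploy <environment> [version]}"
  local version="${2:-latest}"
  echo "Deploying version $version to $environment environment"
}

deploy staging        # Deploying version latest to staging environment
deploy prod v1.2.3    # Deploying version v1.2.3 to prod environment
```

The `:?` expansion gives the fail-fast behavior you want at runtime: forgetting the environment argument stops the script with a clear message instead of deploying to an empty string.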
Day 20 of learning and practicing DevOps 🔁

Worked on a scripting project — building a log analyzer and report generator.

Worked on:
• Reading and validating log file input
• Counting errors (ERROR, Failed) using grep
• Extracting critical events with line numbers
• Finding top 5 recurring errors using awk, sort, uniq
• Generating a structured report file
• Archiving processed logs automatically

Important part: Instead of manually reading logs, I built a script that analyzes everything and gives a summary in seconds.

Learning today --> logs tell the story; it’s about turning raw logs into useful insights.

Here are my notes: https://lnkd.in/ga8xUT6U 📍

#DevOps #Linux #ShellScripting #Automation #LogAnalysis #LearningInPublic #90DaysOfDevOps #TrainWithShubham
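A condensed sketch of that grep/awk/sort/uniq pipeline; the sample log lines are invented for the demo, and this is not the post's actual script.

```shell
#!/bin/bash
# Mini log analyzer: count errors, list them with line numbers,
# and rank the most frequent messages.
set -euo pipefail

LOG=$(mktemp)
printf '%s\n' \
  "2024-01-01 ERROR disk full" \
  "2024-01-01 INFO  started" \
  "2024-01-02 ERROR disk full" \
  "2024-01-02 ERROR timeout" > "$LOG"

# Count lines matching ERROR or Failed (case-insensitive).
# grep -c exits 1 on zero matches, so guard it under strict mode.
error_count=$(grep -ciE 'error|failed' "$LOG" || true)
echo "Total errors: $error_count"

# Critical events with their line numbers
grep -n 'ERROR' "$LOG"

# Top 5 recurring error messages (drop the date field with awk)
grep 'ERROR' "$LOG" | awk '{ $1 = ""; print }' | sort | uniq -c | sort -rn | head -5
```

`uniq -c` only collapses adjacent duplicates, which is why the first `sort` has to come before it.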
💻 Exploring Shell Scripting: Small Commands, Big Impact

Another step forward in my DevOps journey 🚀

Shell scripting is more than just writing commands — it’s about:
✔️ Automating repetitive tasks
✔️ Improving efficiency
✔️ Building scalable workflows

🔑 Key areas I worked on:
• Bash scripting & execution
• Variables and arguments
• Control structures (if, for, while)
• Automating daily tasks

💡 Why does it matter? Because automation is the backbone of DevOps — saving time, reducing errors, and ensuring consistency.

“The best way to predict the future is to automate it.”

#ShellScripting #DevOps #Automation #Linux #ContinuousLearn
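A tiny if/for sketch along those lines (the file names and temp directory are made up for the demo):

```shell
#!/bin/bash
# Control structures: loop over paths and branch on whether each exists.
set -euo pipefail

tmpdir=$(mktemp -d)
touch "$tmpdir/a.conf"    # a.conf exists; b.conf deliberately not created

results=""
for f in "$tmpdir/a.conf" "$tmpdir/b.conf"; do
  if [ -f "$f" ]; then
    results+="exists "
  else
    results+="missing "
  fi
done
echo "$results"
```

The same loop-plus-branch shape scales from two files to hundreds of hosts or services, which is what makes it a daily-driver pattern in automation.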
#Day10/90 of my DevOps Journey: Mastering Linux File Permissions!

Today was all about the power of the command line — from creating and manipulating files with touch, echo, and cat to managing data flow with redirection operators (> vs >>).

I dived deep into File Permissions, learning how to transform a simple text file into an executable script using chmod +x and the crucial "Shebang" (#!/bin/bash).

Understanding the difference between overwriting and appending data is a small step for a dev, but a giant leap for script automation and system safety!

https://lnkd.in/eZihHNBU
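The > vs >> behavior and the shebang-plus-chmod step can be sketched like this, using only throwaway temp paths:

```shell
#!/bin/bash
set -euo pipefail

tmp=$(mktemp -d)

echo "first"  >  "$tmp/notes.txt"   # > overwrites: file holds "first"
echo "second" >> "$tmp/notes.txt"   # >> appends: file now has two lines
echo "reset"  >  "$tmp/notes.txt"   # > again: back to a single line

# Text file -> executable script: shebang line, then chmod +x
printf '#!/bin/bash\necho hello\n' > "$tmp/hello.sh"
chmod +x "$tmp/hello.sh"
"$tmp/hello.sh"   # prints: hello
```

The shebang tells the kernel which interpreter runs the file; `chmod +x` only grants permission to execute it.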
Day 17 of #90DaysOfDevOps 💻🔥

Today I dived deeper into Shell Scripting and learned how to make scripts more powerful and practical.

✔ Practiced for & while loops
✔ Worked with command-line arguments ($1, $#, $@)
✔ Built small scripts to automate tasks
✔ Learned basic error handling

💡 Biggest learning: Using arguments makes scripts reusable and dynamic — not just static commands.

⚡ Real-world DevOps use: Shell scripts are used daily for automation, deployments, monitoring, and backups.

Consistency is building confidence 🚀

#DevOps #Linux #ShellScripting #Automation #90DaysOfDevOps
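The three positional-parameter forms mentioned above in one place, demoed through a function (a script sees its own arguments the same way):

```shell
#!/bin/bash
set -euo pipefail

# $1 = first argument, $# = argument count, "$@" = all arguments
show_args() {
  echo "first: $1"
  echo "count: $#"
  for arg in "$@"; do
    echo "got: $arg"
  done
}

show_args alpha beta gamma
```

Quoting `"$@"` matters: it keeps arguments containing spaces intact, while an unquoted `$@` would split them into separate words.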
🚀 My 5-Day Journey Learning Shell Scripting | From Basics to Loops

Over the past few days, I’ve been building my foundation in Linux shell scripting as part of my journey toward DevOps. Here’s what I covered:

🔹 Day 1: Variables – storing and reusing data
🔹 Day 2: User Input – making scripts interactive
🔹 Day 3: If Conditions – decision-making in scripts
🔹 Day 4: For Loops – automating repetitive tasks
🔹 Day 5: While Loops – continuous execution and monitoring

💡 Key Takeaway: Shell scripting is not just about writing code — it’s about automating real-world tasks and making systems efficient.

🎯 Next step: Applying these concepts to build real-world automation scripts for system monitoring and DevOps tasks.

🔗 GitHub Repository: Check out my scripts and projects here: https://lnkd.in/ggNUmtAF

Consistency is key 🔑 — learning something new every day!

#ShellScripting #Linux #DevOps #Automation #LearningJourney #CloudComputing
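A while-loop sketch in the monitoring spirit of Day 5: read input line by line and count failures (the sample check results are invented):

```shell
#!/bin/bash
set -euo pipefail

checks=$(mktemp)
printf 'ok\nok\nFAIL\nok\n' > "$checks"

failures=0
while IFS= read -r line; do          # one iteration per input line
  if [ "$line" = "FAIL" ]; then
    failures=$((failures + 1))
  fi
done < "$checks"

echo "failures: $failures"
```

`IFS= read -r` is the standard safe form: it preserves leading whitespace and keeps backslashes literal.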
Over the past few days, I’ve moved beyond basic Linux commands and started building practical, system-level automation. Here’s a snapshot of what I’ve been working on:

• Designed log analysis scripts to parse real .log files and extract meaningful insights
• Implemented severity classification (OK / WARNING / CRITICAL) based on thresholds
• Learned to use exit codes as machine-readable signals — a core concept in CI/CD pipelines
• Built structured reporting systems with timestamps and summarized outputs
• Developed multi-stage scripts simulating real-world pipeline flows
• Transitioned from static scripts to dynamic ones using arguments ($1, $@)
• Implemented multi-service monitoring using system-level tools like systemctl
• Gained clarity on how Linux manages services via systemd and how to interact with them programmatically

What stands out the most is the shift in thinking: from writing commands → to designing systems that can analyze, decide, and signal outcomes automatically.

This is no longer just scripting — it’s the foundation of real DevOps workflows.

#Linux #DevOps #Automation #SystemAdministration #LearningInPublic
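The exit-code idea above can be sketched as a severity classifier; the thresholds are invented for illustration, and the point is that a CI pipeline reads the exit status, not the printed label.

```shell
#!/bin/bash
set -euo pipefail

# Map an error count to a label and, more importantly, an exit status:
# 0 = OK, 1 = WARNING, 2 = CRITICAL. Thresholds are example values.
classify() {
  local errors="$1"
  if [ "$errors" -eq 0 ]; then
    echo "OK"; return 0
  elif [ "$errors" -lt 10 ]; then
    echo "WARNING"; return 1
  else
    echo "CRITICAL"; return 2
  fi
}

classify 0                        # OK, exit status 0
classify 3 || echo "status: $?"   # WARNING, then status: 1
```

Distinct non-zero statuses let a caller (or a CI step) branch on severity without parsing any text output.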