Day 19 of learning and practicing DevOps 🔁 Today was all about building actual automation scripts.

Worked on:
• Log rotation script to compress and clean old logs
• Backup script to create timestamped archives
• Scheduling jobs using crontab
• Combining everything into a maintenance script

Important part: understanding why log rotation and backups matter — without them, logs can fill up disk space and break systems.

Today's learning → automation + scheduling. Instead of manually managing logs and backups, scripts + cron can handle everything in the background. This is actually used in production environments.

Here are my notes: https://lnkd.in/gwQUKK8b 📍

#DevOps #Linux #ShellScripting #Automation #Crontab #LearningInPublic #90DaysOfDevOps #TrainWithShubham
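The rotation logic described above can be sketched roughly as follows. This is a minimal illustration, not the exact script from the post: the directory argument and the 7/30-day retention windows are placeholders to adjust to your own policy.

```shell
#!/bin/bash
# Minimal log-rotation sketch (paths and retention windows are placeholders).

rotate_logs() {
  local log_dir="$1"

  # Compress plain .log files not modified in the last 7 days
  find "$log_dir" -name '*.log' -mtime +7 -exec gzip -f {} \;

  # Delete compressed archives older than 30 days
  find "$log_dir" -name '*.log.gz' -mtime +30 -delete
}
```

Running this daily from cron keeps recent logs readable, older logs compressed, and very old archives gone.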
Log Rotation and Backup Automation with Crontab
Day 19 of #90DaysOfDevOps 💻🔥 Today I built my first real-world automation scripts using Shell Scripting.

✔ Created a log rotation script (cleanup + compression)
✔ Built a server backup script using .tar.gz
✔ Automated tasks using crontab scheduling

💡 Biggest learning: Automation isn't just writing scripts — it's about making systems run without manual effort.

⚡ Real-world DevOps use: These concepts are used in log management, server backups, and scheduled maintenance in production systems.

From learning → to building → to automating 🚀

#DevOps #Linux #ShellScripting #Automation #Crontab #90DaysOfDevOps
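A timestamped .tar.gz backup like the one mentioned above might look like this sketch. The function names and the size check via `du` are illustrative assumptions, not the post's actual script:

```shell
#!/bin/bash
# Timestamped backup sketch: archive a source directory into
# backup_YYYYMMDD_HHMMSS.tar.gz inside a destination directory.

make_backup() {
  local src_dir="$1" backup_dir="$2"
  local stamp archive
  stamp=$(date +%Y%m%d_%H%M%S)
  archive="$backup_dir/backup_$stamp.tar.gz"

  mkdir -p "$backup_dir"
  tar -czf "$archive" -C "$src_dir" . || return 1

  # Print the archive path and size as a quick sanity check
  du -h "$archive"
}
```

A retention policy can then be layered on with the same `find -mtime ... -delete` pattern used for log rotation.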
#Day10/90 of my DevOps Journey: Mastering Linux File Permissions! Today was all about the power of the command line—from creating and manipulating files with touch, echo, and cat to managing data flow with redirection operators (> vs >>). I dived deep into File Permissions, learning how to transform a simple text file into an executable script using chmod +x and the crucial "Shebang" (#!/bin/bash). Understanding the difference between overwriting and appending data is a small step for a dev, but a giant leap for script automation and system safety! https://lnkd.in/eZihHNBU
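The two ideas above, overwrite vs. append and making a file executable, fit into one small demo (the temp directory and file names are just for illustration):

```shell
#!/bin/bash
# Demo of > (overwrite) vs >> (append), and turning a text file
# into an executable script with a shebang and chmod +x.

workdir=$(mktemp -d)

echo "first"  >  "$workdir/notes.txt"   # > creates/overwrites the file
echo "second" >> "$workdir/notes.txt"   # >> appends a second line
echo "third"  >  "$workdir/notes.txt"   # > wipes it back to one line

# The shebang line tells the kernel which interpreter runs the script
printf '#!/bin/bash\necho hello\n' > "$workdir/hello.sh"
chmod +x "$workdir/hello.sh"            # now it can be run directly
"$workdir/hello.sh"                     # prints: hello
```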
Day 17 of #90DaysOfDevOps 💻🔥 Today I dived deeper into Shell Scripting and learned how to make scripts more powerful and practical.

✔ Practiced for & while loops
✔ Worked with command-line arguments ($1, $#, $@)
✔ Built small scripts to automate tasks
✔ Learned basic error handling

💡 Biggest learning: Using arguments makes scripts reusable and dynamic — not just static commands.

⚡ Real-world DevOps use: Shell scripts are used daily for automation, deployments, monitoring, and backups.

Consistency is building confidence 🚀

#DevOps #Linux #ShellScripting #Automation #90DaysOfDevOps
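A small sketch combining the pieces listed above, a for loop, the argument variables ($#, "$@"), and basic error handling. The function name and usage message are illustrative:

```shell
#!/bin/bash
# Sum all numeric arguments; fail with a usage message when none given.

sum_args() {
  if [ "$#" -eq 0 ]; then
    echo "usage: sum_args <numbers...>" >&2
    return 1                     # non-zero exit code signals the error
  fi
  local total=0 n
  for n in "$@"; do              # "$@" expands to every argument, safely quoted
    total=$((total + n))
  done
  echo "$total"
}
```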
🚀 Day 19 of #90DaysOfDevOps journey with Shubham Londhe

Today, I worked on a practical DevOps-style project focused on automation and system maintenance using Bash scripting. Instead of manually managing logs and backups, I built a system that handles everything automatically.

🔧 What I built:

📁 Log Rotation Script
- Compresses logs older than 7 days
- Deletes archives older than 30 days

💾 Backup Script
- Creates timestamped backups
- Verifies backup success using size output
- Maintains a 14-day retention policy

⏱ Crontab Automation
- Log rotation runs daily
- Backups run weekly
- Health checks run every 5 minutes

🧩 Maintenance Wrapper Script
- Combines all tasks into one workflow
- Logs everything for easier debugging

📚 Key Learnings:
- Importance of validation to avoid script failures
- Using "find -mtime" for automated cleanup
- Redirecting logs ("2>&1") for better troubleshooting
- Understanding the power of cron jobs in real-world automation

This project gave me a deeper understanding of how real systems handle logs, backups, and reliability without manual effort. Step by step, I'm becoming more confident in Linux, Bash, and DevOps fundamentals 💪

#90DaysOfDevOps #DevOpsKaJosh #Linux #BashScripting #Automation #Crontab #LearningJourney #TrainWithShubham
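The schedules above translate naturally into crontab entries. The script paths here are hypothetical, and the executable part below just demonstrates what the "2>&1" redirection actually does:

```shell
# Illustrative crontab entries (script paths are hypothetical):
#
#   0 2 * * *    /opt/scripts/log_rotate.sh >> /var/log/rotate.log 2>&1   # daily at 02:00
#   0 3 * * 0    /opt/scripts/backup.sh     >> /var/log/backup.log 2>&1   # weekly, Sunday 03:00
#   */5 * * * *  /opt/scripts/health.sh     >> /var/log/health.log 2>&1   # every 5 minutes

# "2>&1" redirects stderr to wherever stdout is going, so error
# messages land in the same log file instead of disappearing:
out=$( { echo ok; ls /no/such/path; } 2>&1 )
```

Without the `2>&1`, a cron job's error output never reaches the log file, which makes failures very hard to debug after the fact.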
Bash Scripting for DevOps — Part 10/? Till now, my scripts worked. But everything inside them was hardcoded. So every time I wanted to run it for a different environment, I had to go and change the script. That didn't feel right.

In real DevOps workflows, we don't change the script. We change the environment.

For example:

ENV=staging ./deploy.sh

Inside the script:

echo "Deploying to $ENV environment"

Now the same script works for dev, staging, and prod without changing the code. Just by changing the input.

Small change in approach, but this is what makes scripts flexible and reusable. And this is used everywhere — CI/CD, Docker, Kubernetes.

#DevOps #BashScripting #Linux #Automation #DevOpsJourney #LearningInPublic
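A small sketch of that environment-variable approach, with a default value added as a common safety net (the `deploy` function name and the "dev" default are my assumptions, not from the post):

```shell
#!/bin/bash
# The script body never changes; only the ENV value supplied
# at invocation does, e.g.  ENV=staging ./deploy.sh

deploy() {
  local env="${ENV:-dev}"   # fall back to "dev" when ENV is unset
  echo "Deploying to $env environment"
}
```

The `${ENV:-dev}` expansion is what keeps the script usable even when the caller forgets to set the variable.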
Day 20 of learning and practicing DevOps 🔁 Worked on a scripting project — building a log analyzer and report generator.

Worked on:
• Reading and validating log file input
• Counting errors (ERROR, Failed) using grep
• Extracting critical events with line numbers
• Finding the top 5 recurring errors using awk, sort, uniq
• Generating a structured report file
• Archiving processed logs automatically

Important part: instead of manually reading logs, I built a script that analyzes everything and gives a summary in seconds.

Today's learning → logs tell the story; the real skill is turning raw logs into useful insights.

Here are my notes: https://lnkd.in/ga8xUT6U 📍

#DevOps #Linux #ShellScripting #Automation #LogAnalysis #LearningInPublic #90DaysOfDevOps #TrainWithShubham
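The counting and "top 5" steps above boil down to two short pipelines. This is a rough sketch; the ERROR/Failed patterns are examples, not a fixed log standard:

```shell
#!/bin/bash
# Log-analyzer sketch: count error lines and rank recurring messages.

count_errors() {
  # Count every line containing ERROR or Failed
  grep -cE 'ERROR|Failed' "$1"
}

top_errors() {
  # Most frequent ERROR lines, highest count first, top 5
  grep 'ERROR' "$1" | sort | uniq -c | sort -rn | head -5
}
```

The `sort | uniq -c | sort -rn` idiom is the classic way to rank repeated lines; awk can then pull out specific fields from the ranked output if needed.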
DevOps Zero to Job-Ready – Day 13/180 | Functions & Arguments in Bash As scripts grow, repeating the same logic becomes hard to manage. Functions help organize code and reuse logic in one place. Arguments (`$1`, `$2`) make scripts flexible and reusable. Instead of repeating commands, define once and call when needed. More structured DevOps notes and scenarios available on www.engidock.com Next: Real script — backup automation #DevOps #Linux #Automation #EngiDock
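"Define once, call when needed" looks like this in practice. Inside a function, `$1` and `$2` refer to the function's own arguments; the function names below are illustrative:

```shell
#!/bin/bash
# Two tiny reusable functions instead of repeated inline commands.

greet() {
  echo "Hello, $1!"
}

timestamped_name() {
  # $1 = base name, $2 = extension; builds e.g. backup_20240101.tar.gz
  echo "${1}_$(date +%Y%m%d).${2}"
}
```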
Over the past few days, I've moved beyond basic Linux commands and started building practical, system-level automation. Here's a snapshot of what I've been working on:

• Designed log analysis scripts to parse real .log files and extract meaningful insights
• Implemented severity classification (OK / WARNING / CRITICAL) based on thresholds
• Learned to use exit codes as machine-readable signals — a core concept in CI/CD pipelines
• Built structured reporting systems with timestamps and summarized outputs
• Developed multi-stage scripts simulating real-world pipeline flows
• Transitioned from static scripts to dynamic ones using arguments ($1, $@)
• Implemented multi-service monitoring using system-level tools like systemctl
• Gained clarity on how Linux manages services via systemd and how to interact with them programmatically

What stands out the most is the shift in thinking: from writing commands → to designing systems that can analyze, decide, and signal outcomes automatically.

This is no longer just scripting — it's the foundation of real DevOps workflows.

#Linux #DevOps #Automation #SystemAdministration #LearningInPublic
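The severity-classification-plus-exit-code idea above can be sketched like this. The 75/90 thresholds and the function name are illustrative assumptions; the key point is that a pipeline can branch on the numeric exit status while a human reads the text:

```shell
#!/bin/bash
# Classify a disk-usage percentage and signal severity via exit code.

classify_disk_usage() {
  local pct="$1"
  if   [ "$pct" -ge 90 ]; then echo "CRITICAL"; return 2
  elif [ "$pct" -ge 75 ]; then echo "WARNING";  return 1
  else                         echo "OK";       return 0
  fi
}
```

A CI/CD step would check `$?` after calling this: 0 passes, anything else fails or warns.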
Bash Scripting for DevOps — Part 11/? Till now, I was passing values using environment variables. That worked. But I realized something: sometimes I don't want to set variables separately. I just want to pass values directly when running the script. That's where arguments come in.

In Bash, we can pass values like this:

./deploy.sh staging

Inside the script, we can access it using:

echo "Deploying to $1 environment"

Here, $1 means the first argument. So if I run:

./deploy.sh prod

It becomes:

Deploying to prod environment

This makes scripts much more flexible. Instead of editing the script or setting variables, I can just pass what I need at runtime.

This is used a lot in real DevOps workflows:
• passing environment names
• passing versions or tags
• controlling script behavior dynamically

Small change. But now the script feels more like a real tool, not just a fixed set of commands.

#DevOps #BashScripting #Linux #Automation #DevOpsJourney #LearningInPublic
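A sketch that pairs the positional argument with validation, so an unknown environment fails fast. The dev/staging/prod list is illustrative, and wrapping the logic in a function is my addition:

```shell
#!/bin/bash
# Accept one environment name as an argument; reject anything unexpected.

deploy() {
  case "$1" in
    dev|staging|prod)
      echo "Deploying to $1 environment" ;;
    *)
      echo "usage: deploy <dev|staging|prod>" >&2
      return 1 ;;
  esac
}
```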
#Day_9 – Mastering Shell Scripting Basics (Linux & DevOps)

Today, I went deeper into Shell Scripting, and now I can build more practical and useful scripts.

💡 What I learned today (in very simple terms):

🔹 Conditional Logic (Advanced)
- if-elif-else – handle multiple conditions
- case – a cleaner way to handle options
- Makes scripts more dynamic

🔹 Arguments in Scripts
- $1, $2 – take input from the command line
- $# – number of arguments
- $@ – all arguments
- Helps create flexible scripts

🔹 Scheduling with Cron Jobs
- crontab -e – schedule tasks
- Run scripts automatically at a fixed time
- Very useful in automation

🔹 Logging & Debugging
- Store output in log files
- Use set -x for debugging
- Track errors easily

🔥 What I realized today:
- Shell scripting is a powerful automation tool
- Scheduling tasks saves a lot of manual effort
- Real DevOps work depends on automation + monitoring

Excited to move towards advanced DevOps tools next 🚀 Let's keep learning and growing 💪

#Linux #DevOps #ShellScripting #Day9 #LearningInPublic #ITSkills #CareerGrowth
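The logging-and-debugging point above can be captured in a tiny helper. The log path is a placeholder, and pairing it with `set -x` (which traces each command as it runs) is the usual debugging combination:

```shell
#!/bin/bash
# Timestamped logging helper; the default log path is a placeholder.

LOG_FILE="${LOG_FILE:-/tmp/maintenance.log}"

log() {
  # Write the message to stdout AND append it to the log file
  echo "$(date '+%Y-%m-%d %H:%M:%S') $*" | tee -a "$LOG_FILE"
}
```

Usage: `log "backup started"` produces a line like `2024-01-01 02:00:00 backup started` on screen and in the log file.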