Log Analysis with Grep, Awk, and Sort in Shell Scripting

Day 5 of Shell Scripting — Log Analysis Like a Real DevOps Engineer [Writing this at 4:03 PM IST]

Most people learn commands. Today I learned to think in pipelines. Day 5 was all about grep, awk, and sort — and how combining them turns raw log files into actual intelligence.

What I practiced on a real app.log file:

- grep — filtered errors, debugs, and warnings with -i (case-insensitive), -n (line numbers), -c (count), and -v (exclude). Found 7 ERROR entries and 4 DEBUG entries instantly.
- awk — extracted specific fields from those filtered lines: timestamp + service component in one clean output.
- sort — sorted that output by service name with -k2 to group all the database, api, and auth errors together.

The final command that clicked for me:

grep "ERROR" app.log | awk '{print $2, $4}' | sort -k2

Three tools. One pipeline. Instant clarity on which service is failing most. This is exactly what you need when you're on call at 2 AM and a production system is throwing errors. No GUI. Just you, the terminal, and your grep flags.

Day 5 done. Pipeline thinking unlocked. 🔥

#DevOps #Linux #ShellScripting #BashScripting #100DaysOfCode #DevOpsJourney #Korelium
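The whole workflow can be sketched end to end. Note the log format here (date, time, level, bracketed service, message) is an assumption for illustration — real app.log layouts vary, and the awk field numbers must match whatever format you actually have:

```shell
#!/bin/sh
# Build a small sample log. The format is an assumption:
#   date  time  LEVEL  [service]  message
cat > app.log <<'EOF'
2024-01-15 02:01:11 ERROR [database] Connection refused
2024-01-15 02:02:45 INFO  [api] Request served
2024-01-15 02:03:02 ERROR [auth] Token expired
2024-01-15 02:04:19 DEBUG [api] Cache miss
2024-01-15 02:05:33 ERROR [database] Query timeout
EOF

# grep: filter and count
grep -c "ERROR" app.log      # count matching lines -> 3
grep -in "error" app.log     # case-insensitive, with line numbers
grep -v "DEBUG" app.log      # everything except DEBUG lines

# The full pipeline: keep only ERROR lines, extract time ($2) and
# service ($4), then sort on the second output field so errors
# group by service.
grep "ERROR" app.log | awk '{print $2, $4}' | sort -k2
```

With this sample data the pipeline groups the two database errors together under [database], after the single [auth] one — exactly the "which service is failing most" view.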

