Working with files in Bash comes down to 3 essential skills:

📖 Reading Files
Use a while loop to process a file line by line (best for scripts), or cat for quick viewing or piping into other commands.

✍️ Writing Files
- > creates or overwrites a file
- >> appends to an existing file without deleting content

🔐 Checksums (File Integrity)
Commands like md5sum or sha256sum generate a unique "fingerprint" of a file. If two files have the same checksum, they're identical; if not, something changed.

👉 Why this matters: these basics power automation, data processing, and security checks in real-world scripts.

#Bash #Linux #DevOps #Programming #Automation
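The three skills above fit in one small sketch — write with `>` and `>>`, read with a while loop, fingerprint with sha256sum (the file here is a throwaway temp file, purely for illustration):

```shell
#!/bin/sh
# Minimal sketch tying the three basics together.
tmp=$(mktemp)

# Write: > overwrites, >> appends.
echo "first line"  >  "$tmp"
echo "second line" >> "$tmp"

# Read: process the file line by line.
lines=0
while IFS= read -r line; do
  echo "read: $line"
  lines=$((lines + 1))
done < "$tmp"

# Checksum: identical content always yields an identical fingerprint.
sum=$(sha256sum "$tmp" | awk '{print $1}')
echo "$lines lines, checksum $sum"

rm -f "$tmp"
```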
Bash File Handling: Read, Write, Checksums
More Relevant Posts
Writing Files in Bash ✍️

Reading files is useful 🧐 But writing them is where things get interesting.

In Bash, you don't "open and save" files. You redirect output into them.

echo "hello world" > output.txt

That single > means: overwrite the file.

If you want to keep existing data:

echo "new line" >> output.txt

That's the append operator.

This is one of those small concepts that quietly powers everything:
• logs
• automation scripts
• system reports

Once you get used to this, text files become your database.

#Bash #Linux #Terminal #DevOps #Programming CoderCo
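A quick sketch of the overwrite-then-append pattern, using a temp file as a stand-in for a real report or log:

```shell
#!/bin/sh
# > starts the file fresh; each >> adds a line without deleting anything.
report=$(mktemp)

echo "=== daily report ===" >  "$report"   # overwrite: start fresh
date +%Y-%m-%d              >> "$report"   # append
echo "disk check: OK"       >> "$report"   # append again

lines=$(wc -l < "$report")
echo "$lines lines in the report"
rm -f "$report"
```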
Day 1 of learning Bash. Built a system info reporter from scratch.

It pulls:
→ Hostname, OS, kernel version
→ Uptime
→ Disk usage (df -h)
→ RAM usage (free -m)
→ Top 5 CPU processes (ps aux --sort=-%cpu)

One script. One command. Readable output.

Four things that actually stuck:

① $? isn't optional: every command returns an exit code. Ignore it and you're scripting blind.

② String manipulation is built-in: ${var/old/new}, ${#array[@]}, ${var##*/}. No need to pipe everything through sed for simple ops.

③ 2>/dev/null isn't lazy, it's intentional. Suppress errors users don't need to see; log them where you do.

④ printf over echo: echo is for quick checks. printf gives you format control: columns, padding, consistent output.

Code is on GitHub. Rough, but working. Committing daily for 4 days. If you're learning Bash or Linux, drop a comment — happy to share notes.

#Bash #Linux #DevOps #ShellScripting #LearningInPublic
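A hedged sketch of a reporter like the one described (the field list is an assumption, and only portable commands are used here — free and ps flags vary across systems):

```shell
#!/bin/sh
# Sketch: labeled system facts with printf, plus an explicit exit-code check.
hn=$(uname -n)
kernel=$(uname -r)

# printf over echo: fixed-width labels keep the columns aligned.
printf '%-10s %s\n' "Host:"   "$hn"
printf '%-10s %s\n' "Kernel:" "$kernel"

# $? is the last command's exit code; check it instead of scripting blind.
df -h / >/dev/null 2>&1   # errors suppressed on purpose, status kept
status=$?
printf '%-10s %s\n' "df exit:" "$status"
```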
At some point, scripts stop being about commands… and start being about data.

One of the most useful patterns in Bash is reading files line by line:

while IFS= read -r line; do
  ...
done < file.txt

That single pattern lets you:
• process logs
• parse configs
• handle real input

You're no longer just running commands… you're working with data.

#Bash #Linux #Terminal #DevOps #Programming CoderCo
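The pattern, completed into a runnable loop over a throwaway config file (the file contents are made up for illustration):

```shell
#!/bin/sh
# IFS= keeps leading whitespace intact; -r keeps backslashes literal.
cfg=$(mktemp)
printf 'host=web1\nport=8080\n' > "$cfg"

count=0
while IFS= read -r line; do
  echo "got: $line"
  count=$((count + 1))
done < "$cfg"

echo "$count lines processed"
rm -f "$cfg"
```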
One glance, instant truth about a backup's footprint. Here's a one-liner that shows name, size, and mtime in one go 🔥

`stat -c '%n %s bytes %y' backup.tar.gz`

-c takes a format string: %n = file name, %s = size in bytes, %y = last modification time. (This is GNU stat; BSD/macOS stat uses -f with a different format syntax.)

Real use case: you're auditing a nightly backup; you need to confirm it's the right file, the size matches, and the timestamp is fresh. This prints a single line you can paste into notes or tickets ⚡

The terminal doesn't bluff: exact, fast, repeatable checks. Try it in your backup workflow and drop your output in the comments.

#linux #terminal #oneliner #filesystem #stat #devops #sysadmin #programming #softwareengineering #opensource #productivity #scripting #backup #datamanagement #buildinpublic
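Here's the same one-liner exercised against a stand-in file (a temp file playing the role of backup.tar.gz), which you could wrap into a backup-audit step. GNU stat syntax, as above:

```shell
#!/bin/sh
# A temp file stands in for the nightly backup.
f=$(mktemp)
echo "fake backup" > "$f"

# One line: name, size, and mtime.
line=$(stat -c '%n %s bytes %y' "$f")
echo "$line"

# The same %s field is handy on its own for size comparisons.
size=$(stat -c '%s' "$f")
rm -f "$f"
```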
Have you ever found yourself typing the same sequence of terminal commands day after day? If so, you need Shell Scripting.

A shell script is simply a text file containing a sequence of commands for a Unix/Linux shell (like Bash or Zsh) to execute. Instead of you running commands one by one, the shell reads the file and runs them automatically.

Think of it like writing a recipe: you list the ingredients (commands) and the steps (logic like loops or if/then statements), and the shell (the chef) follows the instructions precisely to produce the final outcome.

Why should you care?
🚀 Efficiency: it turns a 10-step manual process into a single command.
✅ Consistency: automation eliminates human error, ensuring tasks are performed exactly the same way every time.
🛠️ Power: it allows you to chain small, specialized tools together to solve complex system administration problems (backups, monitoring, deployments).

In short: shell scripting is about teaching your computer to do the boring stuff, so you don't have to.

#ShellScripting #Automation #Linux #Bash #DevOps #SystemAdministration #Programming
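A minimal "recipe" as a sketch: three manual steps become one command. The step names and directory are hypothetical, purely to show the shape of a script:

```shell
#!/bin/sh
set -e                      # stop on the first failed step

workdir=$(mktemp -d)        # throwaway workspace for the demo

echo "step 1: prepare"  && mkdir -p "$workdir/out"
echo "step 2: generate" && date > "$workdir/out/report.txt"
echo "step 3: verify"   && test -s "$workdir/out/report.txt"

done_msg="all steps completed"
echo "$done_msg"
rm -rf "$workdir"
```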
Master the Linux Command Line: The Power of Grep, Sed, and Awk. 🚀

If you work with Linux, you know that data manipulation is where the real magic happens. Understanding the "Big Three" can turn hours of manual work into seconds of automation:

Grep: your search engine for the terminal. Perfect for finding patterns and filtering through mountains of logs.
Sed (Stream Editor): the master of find-and-replace. Ideal for transforming text on the fly without even opening the file.
Awk: a full-blown programming language for data processing. If you need to extract columns or perform calculations, Awk is your best friend.

Stop clicking and start scripting. Which one do you use the most in your workflow? 💻

#Linux #DevOps #SystemAdministration #CodingTips #Grep #Sed #Awk
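One tiny example per tool, run against a throwaway log (the log content is made up for illustration):

```shell
#!/bin/sh
log=$(mktemp)
printf 'INFO start\nERROR disk full\nINFO done\n' > "$log"

errors=$(grep -c 'ERROR' "$log")          # grep: filter and count matches
fixed=$(sed 's/ERROR/WARN/' "$log")       # sed: find-and-replace the stream
first_fields=$(awk '{print $1}' "$log")   # awk: extract the first column

echo "errors: $errors"
echo "$fixed"
echo "$first_fields"
rm -f "$log"
```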
Day 29 of 30 Days of Learning Linux with Data Engineering Community

Today, I focused on Error Handling and Debugging in Bash, and I gained a clearer understanding of how important it is when writing reliable scripts. I also learned that without proper error handling, a Bash script can:
1. Continue executing after a failure, which may lead to data corruption
2. Overwrite critical files unintentionally
3. Fail silently without any visible indication

Key takeaway: good Bash scripting goes beyond writing commands. It is about:
1. Detecting failures early
2. Handling errors effectively
3. Preventing silent issues
4. Ensuring scripts are predictable, safe, and reliable

#DataEngineeringCommunity #Linux
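A sketch of guarding against the three failure modes listed above, using only a temp file (the "critical file" is simulated):

```shell
#!/bin/sh
tmp=$(mktemp)

# 1. Detect failures early: check the exit code instead of charging on.
if cp /no/such/file "$tmp" 2>/dev/null; then
  copy_ok=1
else
  copy_ok=0
  echo "copy failed, stopping before anything gets corrupted" >&2
fi

# 2. Don't overwrite critical files unintentionally: guard before writing.
if [ -s "$tmp" ]; then
  echo "target not empty, refusing to overwrite" >&2
else
  echo "important data" > "$tmp"
fi

# 3. Don't fail silently: always surface a status.
echo "copy_ok=$copy_ok"
rm -f "$tmp"
```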
Need a fast map of users to UIDs on a host? This one-liner unlocks it ⚡

Command: `awk -F: '{print $1, $3}' /etc/passwd`

- -F: sets the field separator to a colon, splitting each line into fields
- {print $1, $3} prints field 1 (username) and field 3 (UID), separated by a space
- /etc/passwd is the system user database; field 1 is the username, field 3 the UID

Real use case: you're auditing a fleet of Linux boxes. You run the command once and get a clean list of usernames and UIDs. It helps spot nonstandard UIDs or accounts without proper shells. To save the output, append `> users.txt`; for actual CSV, print a comma explicitly with `awk -F: '{print $1 "," $3}'`, since the default output separator is a space.

Why it matters: the terminal is a fast, repeatable identity map for ops. Scriptable, observable, and hard to beat. 🐧

Run it right now. Tell me what you find.

#linux #terminal #bash #commandline #devops #sysadmin #opensource #productivity #programming #softwareengineering #linuxadmin #oneliner #textprocessing #audit
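The audit angle from the post, made concrete: flag any UID-0 account besides root. This assumes a standard /etc/passwd layout (username in field 1, UID in field 3):

```shell
#!/bin/sh
# Any non-root account with UID 0 is worth a second look.
suspicious=$(awk -F: '$3 == 0 && $1 != "root" {print $1}' /etc/passwd)

if [ -z "$suspicious" ]; then
  echo "only root has UID 0"
else
  echo "extra UID-0 accounts: $suspicious"
fi

# CSV variant of the one-liner (explicit comma, not the default space):
awk -F: '{print $1 "," $3}' /etc/passwd | head -n 3

total=$(awk -F: 'END {print NR}' /etc/passwd)
echo "$total accounts scanned"
```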
Day 30/41 — I learned how to repeat tasks automatically using loops in shell scripting.

Yesterday I added decision-making using if-else. Today I explored how to run commands multiple times using loops.

👉 1. for loop
Used to repeat a task for a set of values.

for i in 1 2 3
do
  echo $i
done

Output: 1 2 3 (one number per line)

Loop through files:

for file in *.txt
do
  echo $file
done

Lists all .txt files.

👉 2. while loop
Runs as long as a condition is true.

count=1
while [ $count -le 3 ]
do
  echo $count
  count=$((count+1))
done

💡 What I noticed: instead of running the same command again and again, loops can automate repetitive tasks easily.

💡 Example use cases:
• Process multiple files
• Run scripts in batches
• Automate repetitive operations

This makes scripting much more powerful. Tomorrow I'll explore functions in shell scripting.

If you use Linux — do you use for loops or while loops more often?
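The "process multiple files" use case from the post, made concrete with a throwaway directory (names are illustrative):

```shell
#!/bin/sh
# Create three sample files, then loop over them like a batch job.
dir=$(mktemp -d)
for name in a b c; do
  echo "data-$name" > "$dir/$name.txt"
done

processed=0
for file in "$dir"/*.txt; do
  echo "processing $file"
  processed=$((processed + 1))
done

echo "processed $processed files"
rm -rf "$dir"
```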
Same AI. Different environment. Different developer.

On Unix: grep / find / pipes
On Windows: PowerShell / object pipelines

You're not just learning "coding." You're absorbing whatever environment the AI operates in.
The checksum part misses the real point: you're not just checking whether files match, you're catching corruption and tampering in transit. sha256 matters because md5 collisions are trivial to produce now; if you're validating integrity in production, weak hashing is a liability.
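The in-transit check the comment describes, sketched: record a sha256 before transfer, verify after (a temp file stands in for the transferred payload):

```shell
#!/bin/sh
f=$(mktemp)
echo "payload" > "$f"

# Sender records the fingerprint alongside the file.
sha256sum "$f" > "$f.sha256"

# Receiver verifies: -c re-hashes and compares against the recorded sum.
if sha256sum -c "$f.sha256" >/dev/null 2>&1; then
  verdict="intact"
else
  verdict="corrupted or tampered"
fi
echo "$verdict"
rm -f "$f" "$f.sha256"
```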