20/30 learnings: 🚀 Embarking on the Python Automation Journey

Starting with Python for Data Automation? You're not alone in finding the roadmap unclear. Many skip over the foundational aspects, but building a strong Python foundation is crucial for effective automation.

🔑 Why Python Fundamentals Matter
Before diving into automation, make sure you're comfortable with:
- Basic syntax and data structures
- Functions and control flow
- Modules and libraries

This foundational knowledge empowers you to:
- Write clean, efficient scripts
- Debug effectively
- Build scalable automation solutions

📘 A resource to guide you (GitHub link): https://lnkd.in/gGBHc_J6

#30dayslearnings #pythonbasics
Vijaya Surya Narayan’s Post
More Relevant Posts
💻 A new Python script has been added to the repository: https://lnkd.in/dJ5FrAdi

What it does: scans all projects and their work item types, and checks whether each type is actually used in any project, to help with the cleanup process.

Benefit: get rid of work item types that exist only in the default type scheme rather than in a separate scheme.

#AtlassianChampion #Jira #Python
I’ve created and uploaded a Python repository on GitHub, perfect for beginners who want to practice Python, review core concepts, or simply understand the syntax in an easy and organized way. The repository includes well-structured examples and simple scripts that can help anyone starting their Python journey or refreshing their knowledge.

📂 GitHub Repository: https://lnkd.in/ecHVT7F2

Whether you’re learning, revising, or exploring Python, this repo can be a great starting point! Feel free to fork, explore, and contribute. 💻

#Python #Programming #GitHub #Coding #Developers #PythonForBeginners #LearnToCode
Day 4 was a masterclass in efficiency! I condensed three days of focused learning into one morning block, and the results are significant.

1. DSA Pattern Confirmation
The morning block focused on confirming the Two-Pointer technique by implementing solutions for three key problems:
- #26 (Remove Duplicates): the standard overwrite pointer.
- #27 (Remove Element): the efficient swap-to-end technique.
- #80 (Remove Duplicates II): the advanced look-back pointer, confirming the ability to handle more complex constraints.
This work locks in the core O(n) time / O(1) auxiliary space rule for array manipulation, establishing a solid foundation for future algorithmic challenges.

2. Project Pivot: Architecture and Git Flow
- I defined, implemented, and verified the foundational User class in Python. This essential data model is the first structural component for the future Flask backend.
- Crucially, I created my first professional Git feature branch (feature/oop-model). All new project architecture code is now separated from the main branch, ensuring a clean, professional, and stable development history.

The project foundation is now secure, and the next step is integrating this model into the application's structure.

#DSA #Python #OOP #GitFlow #SoftwareEngineering #ProjectDevelopment
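For reference, the overwrite-pointer pattern from problem #26 can be sketched like this (an illustrative solution, not necessarily the exact code from the post):

```python
def remove_duplicates(nums):
    """LeetCode #26 via the overwrite pointer: one pointer reads, one writes.

    Returns k, the count of unique elements; nums[:k] holds them in order.
    O(n) time, O(1) auxiliary space.
    """
    if not nums:
        return 0
    write = 1  # next index to overwrite with a new unique value
    for read in range(1, len(nums)):
        if nums[read] != nums[write - 1]:  # new value not yet written
            nums[write] = nums[read]
            write += 1
    return write

nums = [1, 1, 2, 2, 3]
k = remove_duplicates(nums)
print(k, nums[:k])  # → 3 [1, 2, 3]
```

The same read/write split generalizes to #27 and #80; only the condition that gates the write changes.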
A few days ago, I was watching videos by Hitesh Choudhary Sir (Chai Aur Code) and ThePrimeTime about a possible GitHub ban, mainly because of fake PRs being made on popular repositories like Express.js during Hacktoberfest. That really got me thinking...

👉 What if one day my GitHub account gets banned or something goes wrong? Years of projects, code, and experiments, all gone forever. 😶

So I created a Python package called github-backup-supabase that:
🔹 Automatically backs up all your GitHub repositories locally
🔹 Optionally uploads them to Supabase Storage for safekeeping

You can install it with:
pip install github-backup-supabase
and run your first GitHub backup.

📦 Official PyPI page: https://lnkd.in/dTzWxkss

I also uploaded a YouTube video explaining its development and how to use it; check it out if you’re curious about how it works behind the scenes! https://lnkd.in/dfEjxxTs

If you install the package, drop a comment below 👇; I would love to hear your thoughts or any suggestions for improvements!
Monitoring an Ubuntu VM with Custom Metrics (Python + Prometheus + Grafana + GitHub Automation)
#prometheus #grafana #Observability #monitoring #githubAutomation

Complete GitHub code -> https://lnkd.in/duAv_2RN

Building observability doesn’t always require heavy tooling; sometimes a few lines of Python can give you deep visibility into your system!

In my latest project, I built a custom Ubuntu VM monitoring solution using:
- Python + Prometheus client -> to generate and expose OS-level metrics
- Prometheus -> to scrape and store the time-series data
- Grafana -> to visualize performance trends with real-time dashboards
- GitHub Actions (CI/CD) -> to automate deployment on every code push

Custom metrics collected:
- system_cpu_usage_percent -> CPU usage %
- system_memory_usage_percent -> memory usage %
- system_disk_usage_percent -> disk utilization %
- system_uptime_seconds -> system uptime in seconds
- system_network_bytes_sent -> network bytes sent
- service_port_up{port="22"} -> 1 if the SSH port is open
- process_up{process="nginx"} -> 1 if the process is running
- url_up{url="https://google.com"} -> 1 if the site is reachable

Tech workflow:
1️⃣ The Python exporter exposes a /metrics endpoint using prometheus_client
2️⃣ Prometheus scrapes the metrics periodically
3️⃣ Grafana dashboards visualize trends for CPU, memory, network, uptime, and service health
4️⃣ A GitHub Actions pipeline automates deployment: push → test → deploy exporter

Benefits:
✅ Full control over what you monitor (OS, processes, URLs, ports, etc.)
✅ Lightweight: no extra agent or overhead
✅ Fully automated with GitHub CI/CD
✅ Production-ready visualizations in Grafana
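The project above uses prometheus_client; as a rough, dependency-free sketch of the same idea, here is a tiny exporter that serves a Prometheus text-format /metrics endpoint with only the standard library. The metric names mirror the post; the port-check helper and port 8000 are illustrative choices, and the real exporter would add CPU/memory gauges (e.g. via psutil):

```python
import shutil
import socket
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

START = time.time()  # stand-in for real system uptime

def port_up(port, host="127.0.0.1"):
    """Return 1 if a TCP connection to host:port succeeds, else 0."""
    try:
        with socket.create_connection((host, port), timeout=1):
            return 1
    except OSError:
        return 0

def render_metrics():
    """Build a Prometheus text-format payload by hand."""
    disk = shutil.disk_usage("/")
    lines = [
        f"system_disk_usage_percent {disk.used / disk.total * 100:.1f}",
        f"system_uptime_seconds {time.time() - START:.0f}",
        f'service_port_up{{port="22"}} {port_up(22)}',
    ]
    return "\n".join(lines) + "\n"

class MetricsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/metrics":
            body = render_metrics().encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/plain; version=0.0.4")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

# Uncomment to serve, then point a Prometheus scrape job at :8000/metrics:
# HTTPServer(("", 8000), MetricsHandler).serve_forever()
```

In practice prometheus_client's Gauge and start_http_server replace all of the hand-rolled formatting and HTTP code, which is why it is the better choice for the real project.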
🚀 Today I learned something every developer should know!

While pushing my project to GitHub, I accidentally uploaded my entire Python virtual environment (venv), almost 50,000+ files 😅. This caused warnings, massive uploads, and slow commits.

After debugging for hours, I finally understood:
🔹 You should never push your venv folder to Git
🔹 Only your source code + requirements.txt should go in the repo
🔹 Add venv/ to .gitignore to avoid tracking unnecessary environment files

The moment I removed venv from Git and added it to .gitignore, everything became super smooth:
✨ Cleaner repository
✨ Faster commits & pushes
✨ Project is easier to clone and set up for others

💡 And the best part? Anyone can recreate the environment later using:
python -m venv venv
source venv/bin/activate   (on Windows: venv\Scripts\activate)
pip install -r requirements.txt

🔻 Key takeaway
Sometimes you don’t learn from tutorials; you learn from mistakes. And that’s perfectly fine. Every bug, warning, and error helps you grow as a developer.

📌 If you're starting with Python, Git, or VS Code, remember: push code, not the venv.

#Python #Git #GitHub #DevLife #LearningInPublic #VSCode #100DaysOfCode #DeveloperJourney #SoftwareEngineering #Debugging #VirtualEnvironment
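If you have already committed venv/, one way to untrack it without deleting it from disk is `git rm -r --cached`. Below is a self-contained throwaway demo of that fix (it builds a fake repo in a temp dir; in your own repo you only need the three numbered commands):

```shell
tmp=$(mktemp -d)
cd "$tmp"
git init -q
mkdir venv && echo "home = /usr" > venv/pyvenv.cfg   # fake venv for the demo
git add . && git -c user.name=demo -c user.email=demo@example.com commit -qm "oops: venv committed"

echo "venv/" >> .gitignore        # (1) ignore the environment going forward
git rm -r -q --cached venv/       # (2) untrack it, but keep the folder on disk
git add .gitignore
git -c user.name=demo -c user.email=demo@example.com commit -qm "push code, not the venv"   # (3)

git ls-files   # now lists only .gitignore; venv/ is no longer tracked
```

After this, a fresh clone never sees the environment, and collaborators recreate it from requirements.txt.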
Raw Python, Real Growth: Going Back to Basics 🐍💻

The past few days were all about foundations. No flashy projects, no frameworks; just raw Python, line by line, logic by logic:
• Write clean syntax & handle inputs/outputs
• Swap variables & compare values efficiently
• Perform arithmetic operations & real-world conversions
• Use conditional statements to make programs think
• Solve logic exercises, from leap years to vowels & number types

Every time I revisit the basics, I discover something new. Strong foundations don’t just make better code; they make better problem solvers. Growth in coding is never linear; sometimes, the best way forward is to strengthen what’s underneath! ✨

Here’s how I refined my Python basics 🚀
👉 Check it out here: https://lnkd.in/gt4_QN2N

KSR Datavizon
#python #pythondeveloper #programming #LearningJourney #backtobasics #ProgrammingBasics
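A few of the drills listed above, written out as minimal illustrative solutions (these are generic textbook versions, not necessarily the repo's exact code):

```python
def is_leap_year(year):
    """Leap year: divisible by 4, except centuries unless divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def count_vowels(text):
    """Count vowels case-insensitively."""
    return sum(1 for ch in text.lower() if ch in "aeiou")

def swap(a, b):
    """Python swaps without a temp variable via tuple unpacking."""
    a, b = b, a
    return a, b

print(is_leap_year(2024))         # → True
print(is_leap_year(1900))         # → False (century, not divisible by 400)
print(count_vowels("Automation")) # → 6
print(swap(1, 2))                 # → (2, 1)
```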
Exploring GitHub Copilot in the CLI

Today I explored GitHub Copilot for the command line (CLI), and it’s quite interesting to see how AI is being integrated beyond the IDE!

When I tried:
gh copilot suggest "write a prime number program in Java below 10"
Copilot responded with a PowerShell/Bash script instead of Java code. That’s when I realized something important: the CLI version of Copilot is mainly built to assist with terminal and GitHub commands, not full programming code like the VS Code extension does.

A few key takeaways:
- gh copilot suggest helps you write or refine shell and GitHub commands
- gh copilot explain clarifies what a command does

Have you tried GitHub Copilot in the CLI yet?

#GitHubCopilot #CLI #AI #Automation #Developers #VSCode #OpenAI #CodingProductivity
The 2025 GitHub Octoverse numbers are in, with mind-blowing stats! 🤯
🚀 180M+ developers now build on GitHub
⚡ A new developer joins GitHub every second
🥇 TypeScript is the new top language, overtaking JavaScript & Python!
🤖 80% of new developers use GitHub Copilot in their first week

AI isn't just a tool anymore; agents are here. The entire developer landscape is changing.
I built a small tracer from scratch in Python. It automatically creates spans for function calls (like requests.get) using:

Monkey patching: intercepting existing library functions and wrapping them with tracing logic, without modifying the original code.

Context variables (contextvars): these are like thread-local storage but designed for async and concurrent programs. They let each coroutine or thread safely keep its own tracing context, so even with parallel requests, the tracer knows which span belongs to which flow.

What’s a span? A span represents one unit of work, for example a function execution or an API call. Spans can be nested (parent-child), forming a tree that shows the entire flow of a request across different components.

Each span is logged to a file (trace_log.jsonl), capturing parent-child relationships and timing details. Later, this data can be visualized to see how requests flow through a system, similar to what full-fledged tracing tools like OpenTelemetry do.

GitHub: https://lnkd.in/gcR7XCwc
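The two techniques described above can be sketched in a few lines; this is a minimal illustration of the approach, not the repository's actual code (names like Span and traced are invented here, and an in-memory list stands in for trace_log.jsonl):

```python
import contextvars
import functools
import json
import time
import uuid

# The current span lives in a ContextVar, so concurrent flows don't clash.
current_span = contextvars.ContextVar("current_span", default=None)
LOG = []  # stand-in for the trace_log.jsonl file

class Span:
    """One unit of work; nests via the parent recorded at creation time."""
    def __init__(self, name):
        self.name = name
        self.span_id = uuid.uuid4().hex[:8]
        parent = current_span.get()
        self.parent_id = parent.span_id if parent else None
        self.start = time.time()

    def __enter__(self):
        self._token = current_span.set(self)  # become the current span
        return self

    def __exit__(self, *exc):
        current_span.reset(self._token)       # restore the parent
        LOG.append(json.dumps({
            "name": self.name,
            "span_id": self.span_id,
            "parent_id": self.parent_id,
            "duration_ms": (time.time() - self.start) * 1000,
        }))

def traced(fn):
    """Wrap fn so every call opens a span; this is the monkey-patch wrapper."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        with Span(fn.__name__):
            return fn(*args, **kwargs)
    return wrapper

# Monkey patching would replace a library attribute, e.g.:
#   requests.get = traced(requests.get)
def fetch():
    time.sleep(0.01)  # pretend network call

fetch = traced(fetch)

with Span("handle_request"):
    fetch()
# LOG now holds two JSON lines: the "fetch" span pointing at its
# "handle_request" parent, and the root span with parent_id null.
```

The child span exits (and logs) before its parent, which is exactly the ordering real tracers emit; a visualizer reconstructs the tree from parent_id links.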