AWS Serverless Health Check System (Terraform + Python)

I recently built a cloud-based health monitoring system using AWS Lambda, API Gateway, S3, and Terraform to strengthen my hands-on cloud and DevOps skills.

What I did:
- Developed a Python Lambda function to monitor application health endpoints
- Provisioned infrastructure using Terraform (IAM, API Gateway, S3)
- Designed modular, reusable infrastructure as code
- Implemented secure, least-privilege IAM policies
- Solved dependency challenges using Terraform references instead of hardcoding

Check out the full project and code here: https://lnkd.in/eYxgUzy6

#AWS #CloudComputing #DevOps #Terraform #InfrastructureAsCode #Serverless #AWSLambda #APIGateway #S3 #Python #CloudEngineering #SoftwareEngineering #TechProjects #GitHub #OpenToWork
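A health-check Lambda of the kind described above could look something like this minimal sketch. The target URL, function names, and response shape here are illustrative assumptions, not the project's actual code:

```python
import json
import urllib.request
import urllib.error

def classify_status(status_code):
    """Map an HTTP status code to a simple health verdict."""
    return "healthy" if 200 <= status_code < 300 else "unhealthy"

def check_endpoint(url, timeout=5):
    """Request the endpoint and return (status_code, verdict)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status, classify_status(resp.status)
    except urllib.error.URLError:
        return None, "unreachable"

def lambda_handler(event, context):
    # In a Terraform-provisioned setup, the URL would typically come from an
    # environment variable set on the function; "url" in the event is a stand-in.
    url = event.get("url", "https://example.com/health")
    status, verdict = check_endpoint(url)
    return {
        "statusCode": 200,
        "body": json.dumps({"target": url, "status": status, "verdict": verdict}),
    }
```

API Gateway would invoke `lambda_handler` with a proxy event; the separation into a pure `classify_status` keeps the health rule unit-testable without network access.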
More Relevant Posts
🚀 Python on AWS – Scalable Backend Systems

Built and deployed backend systems using Python (FastAPI/Django) on AWS, focusing on scalable and high-performance architectures.

☁️ AWS (EC2, Lambda, ECS, EKS, S3, RDS, DynamoDB)
⚙️ REST APIs & Microservices
🔄 Docker, CI/CD (Jenkins, GitHub Actions)
📊 Redis caching & performance optimization
🔐 IAM, security best practices & encryption
🗄️ Database design (PostgreSQL, NoSQL)

Always exploring better ways to build cloud-native, distributed systems.

#Python #AWS #CloudComputing #Microservices #BackendDevelopment #DevOps #SystemDesign
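The "Redis caching" item above usually means the cache-aside pattern: check the cache first, and only hit the database on a miss. A minimal sketch of the idea, with a plain dict standing in for a Redis client (the class and key names are illustrative, not from the original post):

```python
import time

class CacheAside:
    """Minimal cache-aside helper; a dict with expiry times stands in for Redis."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expires_at)

    def get_or_load(self, key, loader):
        """Return a cached value, or call loader(), cache the result, and return it."""
        entry = self._store.get(key)
        if entry is not None and entry[1] > time.time():
            return entry[0]          # cache hit, still fresh
        value = loader()             # cache miss: load from the source of truth
        self._store[key] = (value, time.time() + self.ttl)
        return value

# Usage: wrap an expensive lookup (e.g. a database query)
cache = CacheAside(ttl_seconds=30)
user = cache.get_or_load("user:42", lambda: {"id": 42, "name": "Ada"})
```

With real Redis, `get_or_load` would map to `GET`/`SETEX` calls, and the TTL keeps stale entries from living forever.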
🚀 Built a complete CI/CD pipeline for a distributed microservices app on Azure DevOps!

The app: a voting platform built with Python, Node.js, .NET, Redis, and PostgreSQL: 5 services running across Docker containers.

What I built:
✅ Migrated the repo from GitHub to Azure DevOps
✅ Created 3 independent pipelines: vote-service, result-service, worker-service
✅ Added path-based triggers so each pipeline only fires on changes to its own service
✅ Configured a self-hosted Azure agent running on an Azure VM
✅ Built all 3 Docker images and pushed them to Azure Container Registry (ACR)

Challenges solved along the way:
🔧 Fixed agent pool misconfigurations (vmImage vs name, a classic!)
🔧 Resolved Docker socket permission issues on the self-hosted agent
🔧 Fixed BuildKit/legacy builder incompatibility in a multi-stage .NET Dockerfile

Next step: deploy to AKS using the images now sitting in ACR 🎯

#Azure #AzureDevOps #DevOps #Docker #CICD #ACR #AKS #CloudEngineering #MicrosoftAzure
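A path-based trigger plus a self-hosted pool for one of the services might look roughly like this `azure-pipelines.yml` sketch. The folder layout, pool name, and ACR service connection name are assumptions about the repo, not taken from the original post:

```yaml
# Pipeline for vote-service only (paths and names are illustrative)
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - vote/*          # only changes under vote/ fire this pipeline

pool:
  name: SelfHostedPool   # self-hosted agents use `name:`, not `vmImage:`

steps:
  - task: Docker@2
    inputs:
      containerRegistry: 'acr-connection'   # ACR service connection (assumed name)
      repository: 'vote-service'
      command: 'buildAndPush'
      Dockerfile: 'vote/Dockerfile'
      tags: '$(Build.BuildId)'
```

The `vmImage` vs `name` mix-up mentioned above comes from exactly this spot: `vmImage:` selects a Microsoft-hosted image, while `name:` selects your own agent pool.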
🚀 How do you create and deploy a simple AWS Lambda function using Python?

Day 35 / 100 of #100DaysOfCloud ✅

Today I worked on building a serverless function using AWS Lambda, focusing on execution roles and response handling.

🔹 Task Overview
The goal was to create a Lambda function that returns a custom message with a proper status code using the Python runtime.

🔹 Steps Performed
✅ Created a Lambda function named devops-lambda
✅ Selected the Python runtime
✅ Created and attached an IAM role, lambda_execution_role
✅ Wrote function code to return the response: body "Welcome to KKE AWS Labs!", status code 200
✅ Deployed the function using the AWS Console
✅ Tested the function to verify correct output

🔹 Result
Successfully deployed a serverless Lambda function that returns the expected response with status code 200, confirming proper configuration and execution.

💡 Why this matters
AWS Lambda enables event-driven, serverless computing, reducing infrastructure management while allowing scalable and efficient application execution.

Continuing to strengthen my hands-on experience with AWS serverless services, IAM roles, and cloud automation.

#AWS #DevOps #Lambda #Serverless #CloudComputing #Python #100DaysOfCloud
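A handler matching the described task (fixed body, status code 200) can be as small as this. This is a typical shape for such a function, not necessarily the exact lab code:

```python
import json

def lambda_handler(event, context):
    """Return a fixed message with HTTP status 200, as in the lab task."""
    return {
        "statusCode": 200,
        "body": json.dumps("Welcome to KKE AWS Labs!"),
    }

# Invoking locally (Lambda passes event and context; neither is used here)
response = lambda_handler({}, None)
```

When fronted by API Gateway, the `statusCode` field becomes the HTTP status of the response and `body` becomes the response payload.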
Cloud Tech Tip #24: AWS CDK, Writing Cloud Infrastructure Like a Developer

Terraform is great. CloudFormation works. But what if you could define your entire AWS infrastructure in Python, TypeScript, or Java? That's exactly what AWS CDK lets you do.

What is AWS CDK?
The AWS Cloud Development Kit is an open-source framework that lets you define cloud infrastructure using real programming languages, then synthesizes it into CloudFormation under the hood. No more 800-line YAML files. Just clean, readable, testable code.

How cloud engineers use it:
→ Reusable constructs: package infrastructure patterns into reusable classes. Build an EKS construct once, use it across every environment.
→ Environment parity: deploy the exact same stack to dev, staging, and prod with environment-specific config passed in as parameters.
→ Type safety: your IDE catches misconfigurations before they ever reach AWS.
→ Testing: write unit tests against your infrastructure code just like application code.
→ CI/CD integration: plug CDK synth and deploy directly into your GitHub Actions pipeline.

CDK vs Terraform:
→ CDK is ideal if your team is already writing Python or TypeScript.
→ Terraform is better for multi-cloud environments and existing HCL workflows.
→ Both are valid; the best tool is the one your team will actually maintain.

If you're already writing Python or TypeScript day to day, CDK is worth exploring seriously.

#AWS #CDK #InfrastructureAsCode #CloudEngineering #DevOps #Terraform #CloudTips
🚀 3-Tier Deployment of a Python Flask Application on AWS

Excited to share my latest hands-on project, where I deployed a Python Flask application using a 3-tier architecture on AWS.

🔹 Architecture Overview:
• Presentation Layer – for traffic distribution
• Application Layer – running Flask with Gunicorn & Nginx
• Data Layer – for secure database management

🔹 Key Highlights:
✔️ Designed a scalable and secure VPC architecture
✔️ Implemented Auto Scaling for high availability
✔️ Configured a Load Balancer for efficient traffic routing
✔️ Deployed the Flask app with a production-ready setup
✔️ Ensured database security using private subnets

🔹 Tools & Technologies:
Python | Flask | AWS | Nginx | Gunicorn | Linux | Networking

This project helped me strengthen my understanding of cloud architecture, DevOps practices, and scalable deployments. Looking forward to exploring more in Cloud & DevOps 🚀

#AWS #DevOps #CloudComputing #Python #Flask #3TierArchitecture #Learning #Projects
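In the application layer, Nginx typically sits in front of Gunicorn as a reverse proxy. A common server block for that looks roughly like this; the port and header choices are conventional defaults, not taken from the project:

```nginx
# Illustrative Nginx reverse proxy in front of Gunicorn (port is an assumption)
server {
    listen 80;
    server_name _;

    location / {
        proxy_pass http://127.0.0.1:8000;   # Gunicorn bound to localhost:8000
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Gunicorn handles the Python workers; Nginx terminates client connections, buffers slow clients, and forwards the original client IP in headers so the Flask app can log it.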
Programmatic Infrastructure: S3 + Boto3

Integrating the AWS Python SDK with my CLI to automate S3 deployments, using a Python script to create resources in the AWS cloud. As part of my AWS cloud practice, I set up my environment configuration through the CLI. Today I automated S3 bucket creation using Python.

My GitHub link to the AWS CLI + Python SDK repo: https://lnkd.in/d53mp-JN

Workflow:
1. Configure the AWS CLI with IAM credentials.
2. Use Boto3 to initialize an S3 client.
3. Handle regional endpoints and constraints programmatically.

"Building the cloud" is exciting; I can't wait to integrate this with agentic AI applications.

It was interesting to learn that S3 was the first service AWS ever launched, and that it launched in us-east-1. That history is why us-east-1 is the default region: when you create a bucket in any other region, you have to pass a LocationConstraint.

#AWSArchitecture #PythonDeveloper #InfrastructureAsCode #AWSCLI #Tech #AWS #Python
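The LocationConstraint quirk can be handled in one small helper. This is a sketch of the pattern, not the repo's actual script; the bucket name is an example, and `boto3` is imported inside `create_bucket` so the pure helper stays usable without the SDK installed:

```python
def bucket_create_kwargs(bucket_name, region):
    """Build CreateBucket arguments; us-east-1 must NOT send a LocationConstraint."""
    kwargs = {"Bucket": bucket_name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

def create_bucket(bucket_name, region):
    import boto3  # deferred so bucket_create_kwargs works without boto3 present
    s3 = boto3.client("s3", region_name=region)
    return s3.create_bucket(**bucket_create_kwargs(bucket_name, region))

# Usage (requires AWS credentials configured via the CLI):
# create_bucket("my-example-bucket", "eu-west-1")
```

Keeping the argument-building separate from the API call also makes the region logic trivially unit-testable.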
🚀 6 AWS Deployment Strategies every DevOps Engineer must know! After working hands-on with AWS, I have put together a visual guide covering the major deployment strategies. #AWS #DevOps #CloudComputing #AWSCertified #CloudEngineer #DevOpsEngineer #SoftwareDeployment #Python
Infra Validation

I have been working on an Azure container validation infrastructure using Terraform. The idea came from real enterprise DevOps challenges I have faced. The concept is simple: you hand me any container, Python, Java, .NET, it doesn't matter, and this infrastructure will tell you whether it works in a real cloud environment. Not just "works on my machine."

What it does:
1. Spins up a secure private network on Azure with a Bastion jump server.
2. Runs a private VM with Docker behind it, with no public IP exposed.
3. Pushes images to Azure Container Registry.
4. A validate.sh script SSHes in, runs the container, checks the HTTP response, and stores the result in Blob Storage.
5. All authentication is done via Managed Identity: no passwords, no keys sitting around.

A Load Balancer and autoscaling are built in. I destroyed and rebuilt it multiple times to make sure Terraform brings it up clean every time.

#Azure #Terraform #Docker #DevOps #CloudEngineering
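The validate.sh flow described above could be sketched roughly as follows. The hostnames, image name, port, and pass/fail rule are assumptions for illustration, and the Blob Storage upload step is omitted:

```shell
#!/usr/bin/env bash
# Sketch of a validate.sh flow (names, hosts, and port are assumptions).
set -euo pipefail

# Pass/fail rule: any 2xx response counts as healthy
check_http() {
    local status="$1"
    [[ "$status" =~ ^2[0-9][0-9]$ ]]
}

run_validation() {
    local image="$1" vm="$2"
    # SSH through the Bastion host to the private VM, run the container, probe it
    ssh -J bastion-admin@bastion "$vm" <<EOF
docker run -d --rm -p 8080:8080 "$image" >/dev/null
sleep 5
curl -s -o /dev/null -w '%{http_code}' http://localhost:8080/
EOF
}

# Only run the real validation when explicitly requested
if [[ "${RUN_VALIDATION:-0}" == "1" ]]; then
    status=$(run_validation "myregistry.azurecr.io/app:latest" "azureuser@private-vm")
    check_http "$status" && echo "PASS" || echo "FAIL"
fi
```

Keeping the HTTP-status rule in its own `check_http` function means the pass/fail logic can be tested without touching Azure at all.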
❓ Why Security Must Be Built Into Cloud-Native Systems from Day One

As systems move to AWS and Kubernetes, security becomes more complex, not less.

When I first started working in cloud environments, I thought security was mostly about IAM roles and network policies. But in real-world backend and data platforms, security touches everything:
- How services authenticate with each other
- How secrets are stored and rotated
- How containers are configured and scanned
- How logs and telemetry are protected
- How least-privilege access is enforced

In Kubernetes environments especially, small misconfigurations can have large impacts. For example:
- Overly broad IAM permissions
- Hardcoded secrets in environment variables
- Open security groups
- Missing role-based access control (RBAC)

The shift for me was realizing this: security is not a final review step. It's part of application design.

When building Python services running on Kubernetes in AWS, I now think about:
- IAM roles instead of static credentials
- Kubernetes secrets management strategies
- Network policies for service isolation
- Observability tools to detect abnormal behavior
- Infrastructure as code to avoid manual configuration drift

The goal isn't just to pass audits. It's to build systems that are secure by default. Cloud-native engineering gives us powerful tools, but it also requires discipline.

Next, I'll share insights on designing scalable backend APIs for Kubernetes environments.

#CloudSecurity #Kubernetes #AWS #BackendEngineering #CloudNative #DevOps #Python #InfrastructureAsCode #PlatformEngineering
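One concrete example of "network policies for service isolation": a Kubernetes NetworkPolicy that lets only API pods talk to a backend. The namespace, labels, and port below are invented for illustration:

```yaml
# Illustrative NetworkPolicy: only pods labeled app=api may reach the backend
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: backend-allow-api-only
  namespace: prod
spec:
  podSelector:
    matchLabels:
      app: backend
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector:
            matchLabels:
              app: api
      ports:
        - protocol: TCP
          port: 8000
```

Once a pod is selected by any NetworkPolicy, all ingress not explicitly allowed is denied, which is exactly the "secure by default" posture described above.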
I finally understood how a backend actually works, not just ran the code.

For the past few days, I was learning Docker, Flask, and Redis. Things kept breaking, and I had to restart multiple times. But finally it clicked.

I built a simple URL shortener:
👉 Takes a long URL
👉 Generates a short code
👉 Stores it
👉 Redirects back when opened

While fixing errors, I started understanding how each part connects.

⚙️ Tech I used:
- Flask (handles requests)
- Redis (stores key → URL)
- Docker (runs everything together)
- AWS EC2 (server)

Still learning, but this felt like real progress.

💻 GitHub: https://lnkd.in/ga9DvTT6

#DevOps #Docker #AWS #Python #Learning
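The core of a shortener like this fits in a few functions. This sketch uses a plain dict where the real app would use Redis, and the hash-based code scheme is one common choice, not necessarily what the linked repo does:

```python
import hashlib
import string

# A dict stands in for Redis here; the real app would use a Redis client.
store = {}
ALPHABET = string.ascii_letters + string.digits

def short_code(url, length=6):
    """Derive a short, deterministic code from the URL's hash."""
    digest = hashlib.sha256(url.encode()).digest()
    return "".join(ALPHABET[b % len(ALPHABET)] for b in digest[:length])

def shorten(url):
    """Store the mapping code -> URL and return the code."""
    code = short_code(url)
    store[code] = url   # in Redis terms: redis.set(code, url)
    return code

def resolve(code):
    """Look up the original URL (the redirect target); None if unknown."""
    return store.get(code)
```

In the Flask app, `shorten` backs the POST endpoint and `resolve` backs the redirect route, which returns a 301/302 to the stored URL.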