Cloud File Upload System with Python and Flask

Worked on a Cloud File Upload System using Python and Flask. The idea was simple: upload and access files securely through a web interface. I set it up inside a virtual machine and built basic file storage, upload, and retrieval features. The project gave me hands-on experience with backend development, working in a Linux environment, and understanding how file handling works in real applications. Still learning, but this was a good practical step. #Python #Flask #CloudComputing #BackendDevelopment
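Not the project's actual code, but a minimal sketch of the upload-and-retrieve pattern a Flask app like this typically uses (the route names and UPLOAD_DIR path are illustrative assumptions):

```python
# A minimal upload/download sketch, not the project's actual code.
import os
from flask import Flask, request, send_from_directory
from werkzeug.utils import secure_filename

UPLOAD_DIR = "/srv/uploads"  # assumed storage location
os.makedirs(UPLOAD_DIR, exist_ok=True)
app = Flask(__name__)

@app.route("/upload", methods=["POST"])
def upload():
    f = request.files["file"]
    name = secure_filename(f.filename)  # sanitize the user-supplied filename
    f.save(os.path.join(UPLOAD_DIR, name))
    return {"saved": name}, 201

@app.route("/files/<name>")
def download(name):
    # send_from_directory guards against path traversal
    return send_from_directory(UPLOAD_DIR, name)
```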
💡 Starting Python on Linux? Here's a Quick Cheat Sheet

If you're beginning your journey with Python in a Linux environment, these basic commands are essential 👇

🔹 Run a Python file: python3 file.py
🔹 Check the Python version: python3 --version
🔹 Make a script executable: chmod +x file.py
🔹 Run the script directly: ./file.py

These may look like small steps, but they form the foundation of working with Python in real-world environments, especially in Cloud and DevOps roles. I'm currently learning and transitioning into Cloud & DevOps, and sharing these small but important learnings along the way.

📌 Save this post if you're just getting started; it might come in handy.

#Python #Linux #DevOps #CloudComputing #PythonForBeginners #LinuxCommands #TechLearning #CareerTransition
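One detail worth adding: chmod +x alone isn't enough for ./file.py to work. The script also needs a shebang as its first line so the shell knows which interpreter to launch. A minimal example (hello.py is a made-up name):

```python
#!/usr/bin/env python3
# hello.py: the shebang line above lets the shell pick the right
# interpreter when you run ./hello.py after chmod +x hello.py
print("Hello from Python on Linux")
```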
🧠 Schedule Your Python Scripts to Run Automatically — No IT Team Needed

You've written the script. Now what? Do you run it manually every Monday? No. You schedule it.

🔹 On Windows (Task Scheduler): create a simple .bat file:
python C:/scripts/weekly_report.py
Schedule it once and it runs automatically every week.

🔹 On Mac/Linux (cron job):
0 8 * * 1 /usr/bin/python3 /scripts/weekly_report.py
This runs every Monday at 8 AM.

🔹 In the Cloud: use tools like
👉 GitHub Actions
👉 AWS Lambda
Even the free tier is often enough for basic automation.

💡 The real shift: automation isn't just writing the script. It's making the script run itself. That's where
👉 time savings compound
👉 manual work disappears
👉 processes become reliable

The best scripts are the ones you never have to run manually again.

👉 What's one script you could schedule today?

#ActuaryWhoCodes #PythonForActuaries #Automation #Productivity #DataAnalytics #Analytics
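For the cloud option, a minimal sketch of what an AWS Lambda version can look like, paired with an EventBridge schedule rule such as cron(0 8 ? * MON *) for Mondays at 8 AM UTC. Note that weekly_report.run_report is a hypothetical module and function standing in for your own script:

```python
# lambda_function.py: Lambda calls lambda_handler on each scheduled trigger.
from weekly_report import run_report  # hypothetical module from your project

def lambda_handler(event, context):
    run_report()  # generate and send the weekly report
    return {"status": "report generated"}
```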
Writing the script is step one — scheduling it is where the real value comes in. #PythonForActuaries #Automation #Productivity #DataAnalytics #Analytics
Hyperparameter Optimization for Machine Learning with Dragonfly #machinelearning #datascience #hyperparameteroptimization #dragonfly

Dragonfly is an open-source Python library for scalable Bayesian optimisation. Bayesian optimisation is used for optimising black-box functions whose evaluations are usually expensive. Beyond vanilla optimisation techniques, Dragonfly provides an array of tools to scale Bayesian optimisation up to expensive, large-scale problems. These include features especially suited for high-dimensional optimisation (optimising a large number of variables), parallel evaluations in synchronous or asynchronous settings (conducting multiple evaluations at once), multi-fidelity optimisation (using cheap approximations to speed up the process), and multi-objective optimisation (optimising multiple functions simultaneously). Dragonfly is compatible with Python 2 (>= 2.7) and Python 3 (>= 3.5) and has been tested on Linux, macOS, and Windows. https://lnkd.in/gPQUfsR9
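For a sense of the API, here's a minimal sketch based on Dragonfly's documented quickstart; the objective is a toy 1-D function, and the exact signature should be checked against the library's README:

```python
from dragonfly import minimise_function

# Minimise a toy 1-D objective over the domain [-10, 10] with a budget
# ("capital") of 10 function evaluations.
objective = lambda x: x[0] ** 4 - x[0] ** 2 + 0.1 * x[0]
min_val, min_pt, history = minimise_function(objective, [[-10, 10]], 10)
print(min_val, min_pt)
```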
FastMCP, my favorite package for building MCP servers in Python, now has built-in support for OpenTelemetry: https://lnkd.in/g_6j-bbM Gwyneth Peña-Siguenza verified it works in our sample that exports to either Aspire, Azure App Insights, or Pydantic Logfire: https://lnkd.in/gaJ3pET5 Try it out in your MCP servers - make sure you upgrade to FastMCP v3 first though!
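For anyone who hasn't tried it, a minimal FastMCP server sketch; the OpenTelemetry export configuration itself lives in the linked sample, so it isn't shown here, and the server name and tool are illustrative:

```python
from fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```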
🐍🐋 Python Docker Tutorials — Docker is a containerization tool used for spinning up isolated, reproducible application environments. This page lists all of our #Python Docker tutorials. https://lnkd.in/gY9nhPX
Today marked the point where a Flask app that runs became one that runs correctly: replacing the development server with Gunicorn, introducing worker processes, and moving runtime decisions into environment variables. What stood out wasn't the commands themselves, but how much clearer the boundaries became: what belongs in code, what belongs in configuration, and what belongs to the environment. I've realized that these transitions matter more than features at this stage; understanding them makes everything else easier to reason about later. Next session: tightening up how runtime configuration is persisted. #SoftwareDevelopment #Python #Flask #AWS #Linux #Gunicorn
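Not this project's code, but a minimal sketch of the split described above. PORT and WEB_CONCURRENCY are conventional names, not requirements (Gunicorn does read WEB_CONCURRENCY for its default worker count):

```python
# app.py: code defines behavior; the environment supplies runtime config.
import os
from flask import Flask

app = Flask(__name__)

@app.route("/healthz")
def healthz():
    return {"status": "ok"}

if __name__ == "__main__":
    # Development fallback only. In production, Gunicorn imports `app`:
    #   gunicorn --workers "${WEB_CONCURRENCY:-4}" --bind 0.0.0.0:8000 app:app
    app.run(port=int(os.environ.get("PORT", 8000)))
```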
I broke my laptop's Python environment 3 times in one month. Different projects needed different versions. One pip install would quietly break another project. Then I learned Docker, and everything changed.

Here's what Docker actually does (no jargon):
→ It wraps your app + its dependencies into a box called a container
→ That box runs the same on your laptop, your teammate's Mac, and a Linux server
→ You stop saying "it works on my machine" because it works everywhere

My first Dockerfile was 5 lines:
```
FROM python:3.11            # base image with Python preinstalled
WORKDIR /app                # working directory inside the container
COPY . .                    # copy the project into the image
RUN pip install -r requirements.txt   # install dependencies at build time
CMD ["python", "app.py"]    # default command when the container starts
```
That's it. No more environment disasters.

I'm a CS student learning DevOps in public; this was my week 1 win.

Have you had your environment broken by dependency conflicts? How did you fix it?

#Docker #DevOps #LearnInPublic #CS #BackendDev
Moving from Manual to Automated: My Python File Organizer

I've been spending my time lately diving deep into Python fundamentals, and I just finished a project that perfectly bridges the gap between basic syntax and functional automation: a File Organizer script. The goal was simple: take a cluttered directory and instantly sort files into category-specific folders (Images, Docs, Media, etc.) based on their extensions.

What I focused on in this build (a sketch follows below):
- Scalable data structures: instead of a simple key-value pair, I implemented the extension_map as a dictionary of lists, which makes the script easy to maintain and extend as I add more file types.
- Modern path handling: I used pathlib for object-oriented path manipulation. It's cleaner, more readable, and ensures the script works across Windows, macOS, and Linux.
- The DevOps mindset: beyond just writing the code, I focused on idempotency, using .mkdir(exist_ok=True) so the script can run repeatedly without errors, and on error handling for files that are currently in use.

This project was a great exercise in nested iteration and dictionary manipulation. It's a small step, but these are the building blocks for the larger automation and orchestration tasks I'm working toward in Cloud DevOps.

Next up: adding logging to track file movements, and perhaps setting this up as a scheduled task!

#Python #DevOps #Automation #CloudEngineering #CodingJourney #PythonCrashCourse #SoftwareDevelopment
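A minimal sketch of the pattern described above, not the original script; the categories and extensions in extension_map are illustrative:

```python
# Sort files in a directory into category folders by extension.
from pathlib import Path
import shutil

extension_map = {
    "Images": [".png", ".jpg", ".gif"],
    "Docs": [".pdf", ".docx", ".txt"],
    "Media": [".mp3", ".mp4"],
}

def organize(directory: str) -> None:
    root = Path(directory)
    for path in root.iterdir():
        if not path.is_file():
            continue
        for category, extensions in extension_map.items():
            if path.suffix.lower() in extensions:
                target = root / category
                target.mkdir(exist_ok=True)  # idempotent: safe to re-run
                try:
                    shutil.move(str(path), str(target / path.name))
                except PermissionError:
                    pass  # file currently in use; skip and continue
                break

organize(".")  # hypothetical invocation on the current directory
```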
The recently published Containerlab sFlow-RT Development Environment provides example JavaScript and Python scripts based on the Writing Applications guide. It runs on any ARM/x86 system with Docker installed and provides a leaf/spine topology of switches running the FRRouting and Host sFlow services used in NVIDIA Cumulus Linux, SONiC, VyOS, etc., giving a realistic source of sFlow telemetry. https://lnkd.in/gi-jziQY
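As a starting point for the Python side, a sketch of defining and reading a flow via sFlow-RT's REST API; the endpoint paths follow sFlow-RT's documentation as I understand it, and the host/port are assumptions about a local Containerlab setup (sFlow-RT defaults to port 8008):

```python
import requests

BASE = "http://localhost:8008"  # assumed sFlow-RT address

# Define a flow: top talkers by bytes, keyed on source/destination IP.
requests.put(
    f"{BASE}/flow/toptalkers/json",
    json={"keys": "ipsource,ipdestination", "value": "bytes"},
)

# Read back the current active flows across all agents.
flows = requests.get(f"{BASE}/activeflows/ALL/toptalkers/json").json()
print(flows)
```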