pyproject.toml subtlety in DevContainer

I pretty much start every Python project with a pyproject.toml. It keeps things simple:

- One place for configs
- Easy for VSCode to pick up
- Easy to reuse in CI/CD pipelines
- No environment variables to set up
- No manual setup for installs and dependencies

But I ran into a subtle issue while using it inside a DevContainer. Everything looked fine:

- Project structured correctly
- pyproject.toml in place
- Virtual environment set up

But imports started behaving inconsistently. What made it tricky was that nothing was obviously broken. So I went back to basics and checked where Python was actually importing from:

import my_package
print(my_package.__file__)

That’s when it clicked. The DevContainer was picking up a previously installed version of the project instead of my current code. The fix was installing the project in editable mode:

pip install -e .

After that, imports consistently pointed to the working directory instead of a stale installed version.

Big takeaway for me: even with a clean pyproject.toml, how your project is installed matters just as much as how it’s defined. Lately I’ve been focusing more on making my dev environment predictable, not just “working”.

Curious how others handle this: do you always install your project in editable mode during development?

#DevOps #Python #Docker #VSCode #SoftwareEngineering
More Relevant Posts
Your Django API is not slow because of your database. It's slow because of serialization.

A 17-field serializer on 1,000 records = 17,000 Python calls. Just to produce JSON.

We hit this in a rental platform in Stockholm: 80ms → 3 seconds at 2,000 users.

So we built ClaraX — Rust serialization for Django. One line. No rewrites. No Rust knowledge.

Results:
→ 475ms → 14ms (33x)
→ 506ms → 10ms (50x)

pip install clarax-django
python manage.py clarax_doctor

https://lnkd.in/dgmZgnK5
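The "17 fields × 1,000 records = 17,000 calls" arithmetic can be made concrete with a counter. This is a generic illustration of per-field serialization overhead, not ClaraX's (or Django REST Framework's) actual code:

```python
import json

CALLS = 0

def to_representation(value):
    """Stand-in for a per-field serializer method; counts invocations."""
    global CALLS
    CALLS += 1
    return value

def serialize(records):
    # One Python call per field per record: the overhead the post describes.
    return json.dumps([
        {name: to_representation(v) for name, v in rec.items()}
        for rec in records
    ])

records = [{f"field_{i}": i for i in range(17)} for _ in range(1000)]
serialize(records)
print(CALLS)  # 17000
```

Each of those 17,000 calls pays Python's function-call overhead, which is why moving the loop into native code can yield the order-of-magnitude speedups claimed above.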
PySpector v0.1.8 is out🚀

The PySpector Core Team and I worked really hard to deploy this version, so here's what changed:

- A new vulnerability leading to arbitrary code execution via plugin bypass was patched (and its #GHSA was published)
- Docs were updated and improved🫡
- We fixed a bug preventing the generation of HTML reports, as well as two other bugs preventing the --wizard and --supply-chain flags from working properly
- We expanded error messages during #AST file parsing and added a new #CLI flag to enable Python SyntaxWarnings during code scanning
- And last, we (finally) expanded support for Python up to the latest #Python3.14 (before v0.1.8, Python support stopped at #Python3.12)

Thanks to all the #contributors and the awesome SecurityCert community who made this possible🫶

Repo: https://lnkd.in/d7CppftJ
Clean Code & Dependency Management: Mastering Python Modules, Packages, and Venvs! 🐍

As my Python projects grow in complexity, I’ve realized that writing good code is only half the battle—organizing it properly and managing dependencies is the other half. Today, I took a deep dive into the infrastructure that makes Python development scalable and professional. Here’s the breakdown of my latest learning session:

🧩 Modules & Packages: Learned how to break down monolithic code into smaller, logical modules, and organized those modules into packages using __init__.py, making my code reusable across different projects. No more messy, thousand-line files!

📦 pip & Dependency Management: Mastered using pip to tap into the massive ecosystem of Python libraries, and learned the importance of requirements.txt to ensure my projects are easily reproducible by other developers.

🛡️ Virtual Environments (venv): This was a "Eureka" moment! I now understand how to create isolated environments for every project. No more "dependency hell" or version conflicts. My FastAPI projects can now live happily alongside my other scripts without interfering with each other.

Understanding these tools is shifting my mindset from "writing scripts" to "building software." It’s all about creating clean, maintainable, and portable applications.

#Python #SoftwareEngineering #CodingJourney #BackendDevelopment #CleanCode #Venv #PythonPackages #FastAPI #ContinuousLearning #TechCommunity
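The package idea above fits in a few lines: a package is just a directory with an __init__.py, and the __init__.py can re-export the public API so callers never reach into submodules. This sketch builds a tiny package in a temp directory so it is self-contained (mypkg/maths are illustrative names, not from the post):

```python
import importlib
import pathlib
import sys
import tempfile

# Smallest possible package layout: a directory containing __init__.py.
root = pathlib.Path(tempfile.mkdtemp())
pkg = root / "mypkg"
pkg.mkdir()
# __init__.py re-exports the public API: `from mypkg import add` works
# without callers knowing which submodule `add` lives in.
(pkg / "__init__.py").write_text("from .maths import add\n")
(pkg / "maths.py").write_text("def add(a, b):\n    return a + b\n")

sys.path.insert(0, str(root))          # make the temp dir importable
mypkg = importlib.import_module("mypkg")
print(mypkg.add(2, 3))  # 5
```

In a real project the directory would live in your repo (and be installed via pip rather than a sys.path hack), but the mechanics are identical.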
🚀 New Tutorial!!

Just published a step-by-step guide on integrating GitLab feature flags into a Python Flask app using the Unleash SDK, so you can control feature rollouts in real time without redeploying a single line of code.

Learn how to:
✅ Create and manage feature flags directly in GitLab
✅ Connect the Unleash Python SDK with zero personal access tokens
✅ Roll out features progressively, from QA teams to a % of users to everyone
✅ Toggle features instantly if something goes wrong in production

Check out my blog: https://lnkd.in/eXDc73nE

#DevOps #GitLab #FeatureFlags #Python #Flask #CI_CD #CloudEngineering
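The "% of users" rollout mentioned above is typically implemented as deterministic bucketing: hash the (flag, user) pair into a bucket from 0–99 and enable the flag when the bucket falls under the configured percentage. This is a sketch of the idea behind gradual-rollout strategies like Unleash's, not the SDK's exact algorithm:

```python
import hashlib

def rollout_enabled(flag: str, user_id: str, percentage: int) -> bool:
    """Deterministic percentage rollout. The same user always lands in the
    same bucket, so a user who saw the feature keeps seeing it as the
    percentage only ever increases."""
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).digest()
    bucket = int.from_bytes(digest[:4], "big") % 100
    return bucket < percentage
```

Because the bucketing is stable, ramping from 10% to 50% only *adds* users to the enabled set; nobody flaps between variants on refresh.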
Metaclasses in Python: The Hidden Power Behind Classes

Most developers know that objects are created from classes. But here’s something many don’t realize:

👉 Classes themselves are created by something called a metaclass.

🧠 What is a Metaclass?
A metaclass is simply:
➡️ A class that defines how other classes are created.

By default, Python uses type. Yes, the same type() you use to check data types!

Let’s break it down. When you write:

class MyClass:
    pass

Python actually does this behind the scenes:

MyClass = type('MyClass', (), {})

👉 That means type is the default metaclass. It constructs your class dynamically.

Why Use Metaclasses?
Metaclasses are powerful but should be used carefully. They are useful when you want to:
✅ Enforce coding standards across classes
✅ Automatically modify class attributes
✅ Register classes (plugin systems)
✅ Build frameworks (like Django's ORM does internally)

Final Thought
Metaclasses are advanced Python magic. They give you control over class creation itself — something most developers never touch. But once you understand them, you start thinking like a framework developer.

Have you ever used metaclasses in a real project? Or is this your first time exploring them?

#Python #AdvancedPython #BackendDevelopment #Django #SoftwareEngineering #LearnToCode
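The "register classes (plugin systems)" use case above fits in a short, runnable sketch. Because the metaclass's __new__ runs at class *definition* time, every subclass registers itself simply by being defined (the exporter names are illustrative):

```python
class PluginMeta(type):
    """Metaclass that records every concrete subclass in a registry."""
    registry: dict[str, type] = {}

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        if bases:  # skip the abstract base class itself
            PluginMeta.registry[name] = cls
        return cls

class Plugin(metaclass=PluginMeta):
    pass

# Merely defining a subclass registers it; no decorator or manual step.
class CsvExporter(Plugin):
    pass

class JsonExporter(Plugin):
    pass

print(sorted(PluginMeta.registry))  # ['CsvExporter', 'JsonExporter']
```

A framework can then iterate over PluginMeta.registry to discover all available plugins, which is essentially how model registration in ORMs works.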
🚀 Day 66 – Project Work | Important Python Concepts

Today I focused on strengthening core Python concepts that are crucial for building scalable projects. 💻🐍 Sometimes we jump into frameworks and tools, but strong fundamentals make everything easier.

🔹 Key Python concepts I worked on:
✔️ Functions & modular coding
✔️ Classes & Object-Oriented Programming (OOP)
✔️ Exception handling (try-except)
✔️ File handling (loading models & data)
✔️ Working with JSON data (API requests/responses)

🔹 How it helped my project:
👉 Made my FastAPI code cleaner & structured
👉 Improved error handling in the API
👉 Better data flow between model and backend
👉 Easier debugging and maintenance

🔹 Challenges:
⚡ Writing clean and reusable code
⚡ Handling unexpected errors properly
⚡ Structuring project files efficiently

🔹 What I learned:
💡 Strong basics = strong projects
💡 Clean code saves time later
💡 Python concepts are the backbone of ML + backend work

📌 Next step: refactor my project using these concepts and move closer to deployment 🚀

#Day66 #Python #ProjectWork #FastAPI #MachineLearning #Coding #LearningJourney
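Two of the bullets above (try-except and JSON handling) combine naturally in API code: parse an incoming payload and recover gracefully instead of crashing on malformed input. A minimal sketch (the function name is illustrative, not from the post):

```python
from __future__ import annotations

import json

def load_payload(text: str, default: dict | None = None) -> dict:
    """Parse a JSON API payload; fall back to a default on malformed input
    instead of letting the exception bubble up to the client."""
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        return {} if default is None else default

print(load_payload('{"status": "ok"}'))  # {'status': 'ok'}
print(load_payload("not json at all"))   # {}
```

Centralizing this kind of guard in one helper keeps route handlers clean, which is exactly the "clean code saves time later" point.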
The flasgo website is now live.

Want a fast, async, typed Python web framework with security built in from the start (it follows OWASP 2025), requires Python 3.14, keeps the attack surface as small as possible, and offers Django-like security primitives while being as easy to use as Flask? Then you have come to the right place.

#python #webdevelopment #web
Over the past few weeks, Carter Davis and I have been developing a tool that analyzes changes in Python code at the AST level. The project reads modified Python files, parses them with Tree-sitter, builds abstract syntax trees, and compares the old and new versions to detect structural similarities and differences in the code.

Rather than only showing line-by-line changes, the tool currently identifies when functions have been added, deleted, or moved, making it easier to understand the overall structure of what changed in the code. To make the results easier to understand, we built a human-readable summary layer that explains what changed in plain language, such as:

- Added function “test” on line 4
- Added function “yFunction” on line 11
- Deleted function “anotherFunction” from line 17
- Function “ILikeMath” moved from line 13 to line 15

Throughout this project I gained a much stronger understanding of Docker and how to package and run applications in a consistent environment. I also learned more about GitHub Actions, where we created an automated workflow that compares the current version of Python files against the previous commit whenever new code is pushed. I also gained experience working with ASTs, parsing libraries, and designing algorithms to compare code semantically rather than just textually.

We worked in an agile-style environment, holding daily scrums and weekly reviews to discuss progress, challenges, and next steps. This project gave me a much better understanding of software engineering workflows, code analysis, and collaborative development.

You can check out the project here: https://lnkd.in/gw5f_ipb
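The added/deleted/moved summary above can be sketched with the standard library. The project uses Tree-sitter; this swaps in Python's built-in ast module to show the same idea — collect each function's name and line number per version, then diff the two maps:

```python
import ast

def function_lines(source: str) -> dict[str, int]:
    """Map function names to the line each is defined on."""
    return {node.name: node.lineno
            for node in ast.walk(ast.parse(source))
            if isinstance(node, ast.FunctionDef)}

def summarize_changes(old_src: str, new_src: str) -> list[str]:
    old, new = function_lines(old_src), function_lines(new_src)
    report = []
    for name in sorted(new.keys() - old.keys()):
        report.append(f'Added function "{name}" on line {new[name]}')
    for name in sorted(old.keys() - new.keys()):
        report.append(f'Deleted function "{name}" from line {old[name]}')
    for name in sorted(old.keys() & new.keys()):
        if old[name] != new[name]:
            report.append(f'Function "{name}" moved from line {old[name]} '
                          f'to line {new[name]}')
    return report

old = "def keep():\n    pass\n\ndef gone():\n    pass\n"
new = "# comment\ndef keep():\n    pass\n"
print(summarize_changes(old, new))
```

Comparing by name rather than by text is what lets the tool report a "move" instead of an unrelated delete-plus-add, the semantic-versus-textual distinction the post describes.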
Type errors in Python only surface when the faulty code path actually executes at runtime. A function that receives the wrong argument type can pass an entire test suite — then fail in production on a condition nobody anticipated. mypy catches that class of error before any code runs.

But many articles stop at "add annotations and run mypy." The mechanics of how it actually works stay opaque. The article linked below (on PythonCodeCrack) goes further:

— The full analysis pipeline: AST parse → import resolution → type inference → contract checking, with no execution involved
— How gradual typing works in practice, including what the Any type actually does to mypy's analysis downstream
— A precise look at type narrowing and control flow analysis — with an interactive diagram showing how isinstance() resolves str | int into concrete types per branch
— The difference between # type: ignore and cast() — and why using the wrong one silently breaks your type guarantees for all code that follows
— What mypy 1.20 changed: the narrowing engine rewrite, fixed-format cache as the new default, and the experimental Ruff-based parser
— How pyright and ty differ from mypy architecturally — not just in speed benchmarks, but in evaluation strategy and what that means for unannotated legacy code

Written for developers who want to understand the tool, not just run it.

https://lnkd.in/e838Mdu5

#Python #SoftwareEngineering #TypeHints
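The isinstance() narrowing mentioned above looks like this in code: inside each branch, a checker treats the `str | int` parameter as the single concrete type that branch proves, so type-specific operations check cleanly without casts:

```python
from __future__ import annotations

def describe(value: str | int) -> str:
    # A checker's control-flow analysis narrows `str | int` per branch:
    if isinstance(value, str):
        return value.upper()      # here `value` is known to be str
    return str(value * 2)         # here `value` is known to be int

print(describe("hi"), describe(21))  # HI 42
```

Without the isinstance() guard, mypy would reject `value.upper()` because the `int` half of the union has no such method; narrowing is what makes unions ergonomic.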
💻 Learning Update: Python for DevOps 🚀

Finally understood how to build CLI tools using argparse 🔥 I was confused for a long time, but after practicing and debugging, it finally clicked.

Built a small CLI:

python app.py start nginx --replicas 4
python app.py stop nginx

Building CLI tools like this is how real DevOps tools are structured internally.

🔹 Difference I learned:
add_subparsers() → lets you choose between different commands (start, stop, scale)
add_parser() → defines each command and its arguments

Next: connecting the CLI with APIs 🚀

#Python #DevOps #CLI
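The two commands above can be wired up with exactly the add_subparsers()/add_parser() split described: one hub routes between commands, then each command declares its own arguments. A minimal sketch of that CLI:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="app.py")
    # add_subparsers(): the hub that routes between commands
    commands = parser.add_subparsers(dest="command", required=True)

    # add_parser(): one entry per command, each with its own arguments
    start = commands.add_parser("start", help="start a service")
    start.add_argument("service")
    start.add_argument("--replicas", type=int, default=1)

    stop = commands.add_parser("stop", help="stop a service")
    stop.add_argument("service")
    return parser

# Equivalent to: python app.py start nginx --replicas 4
args = build_parser().parse_args(["start", "nginx", "--replicas", "4"])
print(args.command, args.service, args.replicas)  # start nginx 4
```

Setting dest="command" means the dispatch logic is just a check of args.command, which is how tools like docker or kubectl route their subcommands internally.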