Hey there! 😄 If you’ve ever used Mayapy (Maya’s embedded Python) or Hython (Houdini’s embedded Python) to run batch operations on scenes, you know how frustratingly slow it can be to repeatedly initialize the interpreter. Whether you're writing unit tests, running automation, or building standalone tools that load multiple scenes, each startup adds unnecessary overhead.

To solve this pain point, I built something that might help you: a TCP-based hot-reload server that keeps a persistent Mayapy or Hython process running in the background. Instead of launching the interpreter from scratch for every script execution, this service allows you to:

✨ Keep the DCC process alive — no more repeated startup time
📡 Connect over TCP and send Python snippets dynamically
🐍 Execute code in the live interpreter context, inside the currently loaded scene
🔄 Keep state between calls (e.g., imported modules, loaded scenes, global variables)
🔌 Close or reuse the connection depending on your needs

Right now it supports both Mayapy and Hython, and the architecture makes it easy to extend to other embedded Python environments too. If you’re building automation tooling, test suites, or interactive Python-driven pipelines for Maya/Houdini, this might save you a lot of time.

Check it out: 👉 https://lnkd.in/gGcUy7Xu

Let me know what you think — feedback and contributions are welcome! 🙌

#Maya #Houdini #Python #Tooling #DCC #Development #OpenSource
Maya/Houdini Python Hot-Reload Server for Faster Automation
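The repo holds the actual implementation; as a rough mental model of the idea (the names `run_snippet` and `SESSION`, the port, and the raw "read until EOF" wire format below are my own assumptions, not the project's API), a persistent-interpreter TCP server can be sketched like this:

```python
import io
import socketserver
import contextlib

# Shared namespace: state (imports, loaded scenes, globals) persists
# between snippets, which is the whole point of keeping the process alive.
SESSION: dict = {}

def run_snippet(source: str, namespace: dict) -> str:
    """Exec a Python snippet in a persistent namespace; return captured stdout."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(source, namespace)
    return buf.getvalue()

class SnippetHandler(socketserver.StreamRequestHandler):
    def handle(self):
        source = self.rfile.read().decode("utf-8")
        try:
            out = run_snippet(source, SESSION)
        except Exception as exc:  # report errors instead of killing the server
            out = f"ERROR: {exc!r}"
        self.wfile.write(out.encode("utf-8"))

def serve(host: str = "127.0.0.1", port: int = 5050) -> None:
    # Launched with mayapy/hython, snippets run inside the live DCC session.
    with socketserver.ThreadingTCPServer((host, port), SnippetHandler) as srv:
        srv.serve_forever()
```

Because every snippet executes against the same `SESSION` dict, a variable or scene loaded by one call is still there for the next — that's the state-between-calls behavior described above.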
More Relevant Posts
-
Automated Meme Generator with Python

I recently built a Python-based Auto Meme Generator that automatically creates and saves memes using an online meme API. The script fetches meme templates, generates random captions, and creates a new meme every few minutes without any manual work.

🔧 Tech Stack
- Python
- Requests (API communication)
- Memegen API
- Pillow (PIL) for image handling
- Datetime & automation logic

⚙️ Key Features
✅ Automatically generates memes using different templates
✅ Random caption generation for fun programming-related memes
✅ Saves memes locally with timestamped filenames
✅ Optional image preview using Pillow
✅ Runs continuously and creates a new meme every 5 minutes
✅ Supports random mode and custom template mode

💡 Why I Built This
This project was a fun way to practice:
- Working with REST APIs
- Automating tasks with Python
- Creating small but creative automation tools

It’s a simple project, but it demonstrates how APIs + automation can be used to build entertaining tools with just a few lines of Python. Always fun when code writes memes about coding itself 😄

GitHub link: https://lnkd.in/g9DywPTq

#Python #Automation #API #Programming #PythonProjects #DeveloperTools #CodingFun #Memes #SoftwareDevelopment #BuildInPublic
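For a sense of the moving parts, here is a small sketch of the URL-building and timestamped-filename pieces (not the project's actual code; the text-escaping rules shown cover only the common memegen-style cases, and the helper names are mine):

```python
from datetime import datetime

def encode_text(text: str) -> str:
    """Encode caption text for a memegen.link-style URL path segment.
    Handles only the common escapes; the full API defines more."""
    encoded = (text.replace("_", "__")   # literal underscore
                   .replace("-", "--")   # literal hyphen
                   .replace(" ", "_"))   # space
    return encoded or "_"                # "_" is the blank placeholder

def build_meme_url(template: str, top: str, bottom: str) -> str:
    """Compose a meme image URL from a template slug and two captions."""
    return (f"https://api.memegen.link/images/{template}/"
            f"{encode_text(top)}/{encode_text(bottom)}.png")

def timestamped_name(prefix: str = "meme") -> str:
    """Filename like meme_20250101_120000.png so repeated saves never collide."""
    return f"{prefix}_{datetime.now():%Y%m%d_%H%M%S}.png"
```

The automation loop then just fetches `build_meme_url(...)` with Requests every few minutes and writes the bytes to `timestamped_name()`.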
-
IO Ninja and Python Can Jam Together. As you know, IO Ninja excels as a UI debugger for serial, network, USB, and all other forms of communication. It offers a slick, polished user interface, a beautiful and lightning-fast logging engine, a sophisticated hex packet editor with packet templates, regex-based data markup, and many other powerful features. Read more via Tibbo: https://lnkd.in/dxyzDyft
-
🚀 Unlocking Smarter Testing Workflows for Embedded Software!

Proud to share this insightful article by my colleague Romain Andrieux on how Scade One models can be tested using Python and modern testing frameworks like pytest. 👇

🔗 https://lnkd.in/eFS3NAKg

In this piece, they walk through how to leverage PyScadeOne, the Python bridge to Scade One, to integrate models into the Python ecosystem — enabling:
✔️ Exporting Scade One models as Python-callable functions
✔️ Writing and running automated tests with pytest
✔️ Using Python tools like NumPy, SciPy, and Jupyter Notebooks for deeper analysis
✔️ Bringing models into modern CI/CD pipelines

This approach truly bridges model-based design with flexible, scalable testing workflows. 👏 A great read for anyone working with model-based development and automated testing!

#modelbaseddevelopment #python #testing #pytest #embeddedsoftware #Ansys #ScadeOne
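To picture the pattern the article describes — without claiming anything about PyScadeOne's actual API — the `integrator` below is a hand-written stand-in for a model exported as a Python-callable function, exercised with plain pytest-style tests:

```python
# Stand-in for a Scade One model exported as a Python callable;
# in the real workflow this function would come from PyScadeOne.
def integrator(step: float, reset: bool, state: float) -> float:
    """Tiny discrete integrator: accumulates `step` unless reset."""
    return 0.0 if reset else state + step

# pytest discovers functions named test_*; run with `pytest this_file.py`.
def test_integrator_accumulates():
    state = 0.0
    for _ in range(4):
        state = integrator(0.5, reset=False, state=state)
    assert state == 2.0

def test_integrator_reset():
    assert integrator(0.5, reset=True, state=10.0) == 0.0
```

Once the model is callable like this, feeding it NumPy-generated stimulus or analyzing outputs in a Jupyter notebook is ordinary Python, which is exactly the appeal.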
-
3 Performance Mistakes Python Developers Make in Production

Your code works locally. It passes tests. It even gets deployed. But in production? It slows down. Here are 3 common mistakes I keep seeing:

1. Using a List Instead of a Set for Lookups

if x in my_list:

Lists search one by one → O(n). If lookup is frequent, use:

my_set = set(my_list)
if x in my_set:

Sets use hashing → O(1) average time. Small change. Massive impact at scale.

2. Ignoring Time Complexity

Nested loops feel harmless… until data grows 100x. Quadratic logic on small datasets becomes a production bottleneck. If you don’t know the Big-O of your solution, you’re coding blind.

3. Ignoring Memory Usage

- Creating unnecessary copies: new_list = old_list[:]
- Loading huge datasets fully into memory instead of streaming.
- Using lists where generators would work.

Performance isn’t just speed — it’s also memory efficiency.

Real Engineering Insight: Production performance problems rarely come from “bad Python.” They come from weak algorithmic thinking. Code that works is beginner level. Code that scales is professional level.

Which performance mistake did you learn the hard way?

#Python #Performance #SoftwareEngineering #DSA #Programming #Developers #CleanCode
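For reference, the first and third points side by side, as a toy sketch (the real gains only show up at much larger sizes):

```python
# Mistake 1: membership tests. Same answers, very different scaling:
# `in` on a list scans element by element (O(n));
# `in` on a set hashes straight to the bucket (O(1) average).
data = list(range(100_000))
lookup = set(data)          # one-time O(n) conversion, then cheap lookups

assert (99_999 in data) == (99_999 in lookup)  # identical result

# Mistake 3: memory. A generator yields items lazily instead of
# materializing the whole sequence, so peak memory stays flat.
def squares_up_to(n):
    for i in range(n):
        yield i * i

total = sum(squares_up_to(100_000))  # streams; never builds a 100k-item list
```

Swapping `list` for `set` (or a list comprehension for a generator) is usually a one-line diff, which is why these two fixes pay for themselves so quickly.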
-
Scaling Python testing isn’t just about adding more tests — it’s about keeping feedback loops tight as the codebase grows. In a large production environment with 2.5M+ lines of Python and 10,000+ tests, three metrics quickly become the pressure points: test execution time, reliability (flaky vs. consistent results), and coverage (signal quality). A practical workflow pairs pytest + coverage reporting with CI gating (PR checks + required thresholds), treating tests as both a quality net and living documentation.

When test runs started creeping past 30 minutes, the CI pipeline was optimized with a few high-leverage strategies:
1) Parallelism — on a single machine via pytest-xdist, and at bigger scale by splitting across multiple runners using pytest-split, including duration-based balancing so slow tests don’t bottleneck one runner.
2) Caching — cutting dependency install time (pip cache keyed by a requirements hash), plus faster installers like uv and prebuilt Docker images for heavy non-Python deps.
3) Skipping unnecessary compute — e.g., only running certain jobs when Python files change, running linters only on touched files, and measuring coverage only for changed paths on PRs (then running full coverage on main).
4) Modern runners — including autoscaled self-hosted runners on Kubernetes/EC2 to improve price/performance.

The results were tangible: the pipeline was brought down to <15 minutes, while coverage improved (e.g., moving from ~85% to ~95% over a year) without sacrificing PR safety. The operational reality also showed up: parallelization can surface flaky tests and shared-state conflicts, so retries, reporting to code owners, and quarantining blockers become part of keeping CI reliable as the suite grows.

#Python #Testing #Pytest #ContinuousIntegration #CI #TestCoverage #DeveloperExperience
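The duration-based balancing idea can be pictured with a small greedy sketch — this is not pytest-split's actual algorithm, just the intuition: sort tests slowest-first and always hand the next one to the runner with the least accumulated work:

```python
import heapq

def balance(durations: dict[str, float], runners: int) -> list[list[str]]:
    """Greedily spread tests across runners so total durations stay even."""
    heap = [(0.0, i) for i in range(runners)]   # (accumulated seconds, runner)
    heapq.heapify(heap)
    groups: list[list[str]] = [[] for _ in range(runners)]
    # Longest tests first, each onto the currently lightest runner.
    for name, secs in sorted(durations.items(), key=lambda kv: -kv[1]):
        load, i = heapq.heappop(heap)
        groups[i].append(name)
        heapq.heappush(heap, (load + secs, i))
    return groups
```

This is why recording real test durations matters: with timings in hand, one 30-second test can get a runner to itself while many fast tests share another, instead of a naive alphabetical split leaving one runner as the bottleneck.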
-
Built Zetten - a Rust-powered task runner for Python backends. I was tired of glue code (venvs, env vars, duplicated CI scripts), so I built a tool to make running and orchestrating tasks boring again. One cool surprise: standardized task definitions make AI coding much more reliable. No guessing - it just reads the config and runs correctly. I wrote more about the journey, tradeoffs, and lessons on Substack: https://lnkd.in/gitWHmfJ If you’re into Rust, Python, or developer tooling, I’d love your feedback. Repo: https://lnkd.in/gTrzzkn4 #Rust #Python #OpenSource #BuildInPublic #DevTools
-
🐍 Python Function Naming Rules — Write Professional Code ⚡

Function names should be clear, readable, and follow Python standards 👇

✅ Basic Rules
✔️ Must start with a letter or underscore _
✔️ Cannot start with a number ❌
✔️ Can contain letters, numbers, underscores
✔️ No spaces allowed
✔️ Case-sensitive (getData ≠ getdata)

✅ Valid Function Names

def greet_user(): pass
def calculate_total(): pass
def _private_function(): pass

❌ Invalid Function Names

def 1greet(): pass    # cannot start with a number
def greet-user(): pass    # hyphen not allowed
def greet user(): pass    # space not allowed

💡 Best Practice (PEP 8 Style)
✔️ Use lowercase_with_underscores (snake_case)
✔️ Use verbs — functions perform actions
✔️ Keep names meaningful

def get_user_data(): pass
def send_email(): pass
def calculate_salary(): pass

🔥 Pro Tip: Good function names explain what the function does — no comments needed 👍

🚀 Clean naming = Clean code = Professional programmer 💻

#Python #Coding #Programming #LearnToCode #Developer
-
Singleton Pattern in Python — Simple Concept, Powerful Impact

In production systems, controlling object creation isn’t just good design — it’s essential. One of the most practical creational patterns for this is the Singleton: ensuring a class has exactly one instance with a global access point.

But here’s the catch: in Python, implementing a Singleton correctly (thread-safe, maintainable, production-ready) is NOT as trivial as many examples suggest.

Where Singleton truly shines in real systems:
✅ Application configuration managers
✅ Database connection controllers
✅ Centralized logging systems
✅ Caching layers
✅ Feature flag services
✅ Metrics collectors

Production Tip: the most robust Python implementation uses a thread-safe metaclass, not naive global variables or basic __new__ hacks.

Even more Pythonic insight: modules themselves behave like singletons due to import caching — often the simplest and best solution.

But remember: Singleton introduces global state. Overuse can hurt testability and flexibility. Modern architectures often prefer dependency injection unless a true single instance is required.

Design patterns aren’t about following rules — they’re about making intentional trade-offs.

How do you manage shared resources in your Python applications — Singleton, DI, or something else?

Read more: https://lnkd.in/gkj7hxPj

#Python #SoftwareEngineering #DesignPatterns #Programming #PythonDeveloper #Coding #CleanCode #Architecture #BackendDevelopment #SystemDesign #Tech #Developers #ProgrammingLife #SoftwareDevelopment #ComputerScience #PythonProgramming #DevCommunity #TechLeadership #CodeQuality #Engineering
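The thread-safe metaclass the post alludes to usually looks something like this (a sketch of the standard double-checked-locking idiom, not the linked article's exact code):

```python
import threading

class SingletonMeta(type):
    """Metaclass that gives each class exactly one instance, thread-safely."""
    _instances: dict = {}
    _lock = threading.Lock()

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:          # fast path: no lock once created
            with cls._lock:
                if cls not in cls._instances:  # re-check inside the lock
                    cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]

class Config(metaclass=SingletonMeta):
    def __init__(self):
        self.settings = {}

# Every "new" Config is the same object, so shared state stays consistent.
assert Config() is Config()
```

The metaclass intercepts the `Config(...)` call itself, so `__init__` runs only once — unlike naive `__new__` hacks, where `__init__` re-runs on every instantiation.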
-
A deep dive for architects on configuring Python's logging system to map third-party module output (like httpx) into a custom application namespace hierarchy, ensuring centralized control.
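One way to get the effect described — without claiming it's the article's approach — is to re-parent the third-party logger under the application hierarchy. Note this mutates `Logger.parent`, an attribute the stdlib exposes but doesn't encourage touching, so treat it as a sketch:

```python
import logging

def adopt_logger(third_party: str, parent: str) -> logging.Logger:
    """Route a third-party module's log records through an application
    namespace, so the app's handlers and level rules govern them centrally.

    CAVEAT: reassigning Logger.parent leans on an implementation detail
    of CPython's logging hierarchy rather than a documented API.
    """
    tp = logging.getLogger(third_party)
    tp.parent = logging.getLogger(parent)
    tp.propagate = True   # let records flow up to the new parent's handlers
    return tp
```

After something like `adopt_logger("httpx", "myapp.vendor")`, handlers and levels configured on `myapp.vendor` (or `myapp`) would apply to httpx's records as well, which is the centralized control the article is after.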
-
Tkinter Tutorial: Building a Simple Interactive Temperature Converter Ever found yourself juggling Celsius and Fahrenheit, or Kelvin and Rankine? Converting temperatures can be a daily annoyance, especially when dealing with international standards or scientific calculations. Wouldn't it be great to have a quick, easy-to-use tool right at your fingertips to handle these conversions? This tutorial will guide you through building precisely that: a simple, interactive temperature converter using Tkinter, Python's built-in GUI library....
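GUI wiring aside, the conversion math at the core of such a tool is simple enough to sketch. The helper names and the "convert via Celsius as a hub" design below are my own, not necessarily the tutorial's:

```python
# Convert between scales via Celsius as the common intermediate.
TO_CELSIUS = {
    "C": lambda v: v,
    "F": lambda v: (v - 32) * 5 / 9,
    "K": lambda v: v - 273.15,
    "R": lambda v: (v - 491.67) * 5 / 9,   # Rankine
}
FROM_CELSIUS = {
    "C": lambda c: c,
    "F": lambda c: c * 9 / 5 + 32,
    "K": lambda c: c + 273.15,
    "R": lambda c: (c + 273.15) * 9 / 5,
}

def convert(value: float, src: str, dst: str) -> float:
    """Convert `value` from scale `src` to `dst` ("C", "F", "K", or "R")."""
    return FROM_CELSIUS[dst](TO_CELSIUS[src](value))
```

In the Tkinter app, a button callback would read the entry widget, call `convert(...)`, and write the result into a label — keeping the math separate from the UI also makes it trivially unit-testable.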
Gabriel Valderramos nice work! How do you use it yourself? Is it basically one mayapy process with one specific scene loaded, which you can work with by sending commands via a port? How is that different from the Maya UI, where you can open a port to receive commands and send them straight from an IDE? If I run mayapy and open a port with cmds, would it be the same? Just trying to see the advantage of your approach vs. mayapy running in a terminal with an open port to receive commands.