dsr-files 3.1.0 Released

Moving from Eager to Logical Validation in ML Pipelines 🚀

I'm excited to share that I've just released v3.1.0 of dsr-files, a library focused on high-performance file handling for ML auditing. This update was born out of a "logical epiphany" while refactoring my orchestrator: I realized that a configuration class shouldn't care whether a file physically exists, only whether the description of that file is logically sound.

What's new since 2.2.0:

🔹 Smart Type Validation: The FileType enum now performs logical extension checks (e.g., verifying that .jsonl fits FileType.JSON) without needing a filesystem connection.
🔹 Cloud-Native Pathing: Added native support for S3, GCS, and Azure via cloudpathlib. You can now pass raw URI strings directly to any handler.
🔹 Universal Parameter Filtering: Introduced safe_call=True to catch and log incompatible keyword arguments, preventing engine-level crashes in pyarrow or fastparquet.
🔹 Configuration-Driven Safety: Moved to a centralized YAML-based Parameter Registry. Using a custom UniqueKeyLoader, the library now enforces a "ground truth" for valid engine arguments.
🔹 High-Performance Caching: Implemented lru_cache for parameter retrieval, ensuring that even with strict configuration-driven filtering, I/O overhead remains sub-millisecond.

By decoupling logical state from physical environment checks, I've made my pipelines more portable and significantly more resilient.

Check it out on PyPI: pip install dsr-files

#Python #MachineLearning #SoftwareEngineering #MLOps #OpenSource #CloudComputing
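To make the "logical validation" idea concrete, here is a minimal sketch of how a FileType enum can check an extension without ever touching a filesystem. The member names and the validates method are my illustration, not the library's actual API: I'm assuming each enum member carries the extensions it accepts, and that PurePosixPath (which never performs I/O) is enough to extract a suffix.

```python
from enum import Enum
from pathlib import PurePosixPath

class FileType(Enum):
    # Hypothetical members: each maps a logical type to accepted extensions.
    JSON = (".json", ".jsonl")
    PARQUET = (".parquet",)
    CSV = (".csv",)

    def validates(self, path: str) -> bool:
        # PurePosixPath is a pure string manipulation class, so this
        # check is logical only -- no filesystem connection required.
        return PurePosixPath(path).suffix.lower() in self.value

print(FileType.JSON.validates("s3://bucket/audit/events.jsonl"))  # True
print(FileType.PARQUET.validates("events.jsonl"))                 # False
```

Because the check is purely lexical, the same config object validates identically on a laptop, in CI, or against a cloud URI that doesn't exist yet.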
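The safe_call behaviour can be sketched with the standard library's inspect module. The filter_kwargs helper below is a hypothetical stand-in for whatever dsr-files does internally; the point is the technique: compare the caller's kwargs against the target function's signature, drop and log anything incompatible, and never let the engine raise a TypeError mid-pipeline.

```python
import inspect
import logging

logger = logging.getLogger("dsr_files")

def filter_kwargs(func, kwargs, safe_call=True):
    """Drop keyword arguments `func` does not accept, logging each one,
    instead of letting the underlying engine crash on a TypeError."""
    if not safe_call:
        return dict(kwargs)
    params = inspect.signature(func).parameters
    # If the target accepts **kwargs, everything is compatible.
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return dict(kwargs)
    accepted = {k: v for k, v in kwargs.items() if k in params}
    for name in sorted(kwargs.keys() - accepted.keys()):
        logger.warning("Dropping incompatible keyword argument: %r", name)
    return accepted

# Toy engine function standing in for e.g. a pyarrow reader:
def read_table(path, columns=None, use_threads=True):
    ...

filtered = filter_kwargs(read_table, {"columns": ["id"], "row_filter": "x > 0"})
# keeps "columns"; drops and logs "row_filter"
```

A fastparquet-only option like a row filter then degrades gracefully when the pipeline is switched to pyarrow, instead of crashing the run.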
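The registry pieces can be sketched together. A UniqueKeyLoader-style YAML loader enforces ground truth by rejecting duplicate mapping keys at load time; I show that check on plain (key, value) pairs here, with an in-memory dict standing in for the parsed YAML file, so the example stays self-contained. The function names and registry contents are my assumptions. lru_cache then makes repeated lookups free after the first call.

```python
from functools import lru_cache

def mapping_with_unique_keys(pairs):
    """The duplicate-key check a UniqueKeyLoader performs while building
    YAML mappings, shown here on plain (key, value) pairs."""
    out = {}
    for key, value in pairs:
        if key in out:
            raise ValueError(f"Duplicate registry key: {key!r}")
        out[key] = value
    return out

# Stand-in for the parsed YAML Parameter Registry: engine -> allowed args.
_REGISTRY = mapping_with_unique_keys([
    ("pyarrow", ("columns", "use_threads", "memory_map")),
    ("fastparquet", ("columns", "filters")),
])

@lru_cache(maxsize=None)
def valid_params(engine: str) -> frozenset:
    """Cached lookup: after the first call per engine, no file parsing is
    repeated, so strict per-call filtering stays sub-millisecond."""
    return frozenset(_REGISTRY[engine])
```

A duplicate key in the YAML would otherwise silently shadow an earlier entry, which is exactly the kind of quiet misconfiguration a "ground truth" registry exists to prevent.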
