We've released an update to our Python library: it now supports realtime publishing, including message publishing via a stream of append operations, which is what you need to support streamed LLM responses with Ably's AI Transport. Read more on the Ably blog: https://lnkd.in/e59eWfVc
Ably Python Library Update with Realtime Publishing Support
Rust-based AI frameworks use 5x less memory than their Python equivalents, according to the 2026 AI Agent Benchmark. And the trend keeps accelerating.

The pattern

The most impactful Python tools in AI are already written in Rust under the hood:
👉🏽 Hugging Face Tokenizers: Rust core, Python bindings
👉🏽 Polars: Rust core, Python API
👉🏽 Ruff: Rust linter, 10-100x faster than Flake8
👉🏽 Pydantic's Monty: Rust interpreter for safe LLM code execution
👉🏽 uv: Rust package manager that has replaced pip for most of us

The playbook is the same every time: write the performance-critical parts in Rust, expose a Python API with PyO3. Users get Python ergonomics with Rust performance.

Why this matters for AI

AI agents run lots of tools, process lots of data, and keep lots of state. Memory matters. Latency matters. When you're spinning up hundreds of agent instances, a 5x memory saving is the difference between one server and five. xAI fully transitioned its AI infrastructure to Rust. That's a strong signal from a company running models at massive scale.

The opportunity

If you know both Python and Rust, you're in a rare position. Most AI engineers only know Python. Most Rust developers don't work in AI. The intersection is small and getting more valuable. You don't need to rewrite everything in Rust. Just the hot paths.

Do you use any Rust-backed Python tools?
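From the Python side, the "Rust core, Python surface" pattern usually looks like an ordinary import. A minimal sketch of that shape, where `fast_tokenize` is a hypothetical PyO3-built extension module (not a real package), with a pure-Python fallback when the compiled path is unavailable:

```python
# Hypothetical pattern: prefer a Rust-backed extension, fall back to pure Python.
# "fast_tokenize" is an assumed PyO3 module name used for illustration only.
try:
    from fast_tokenize import tokenize  # Rust hot path exposed via PyO3 bindings
except ImportError:
    def tokenize(text: str) -> list[str]:
        # Pure-Python fallback: lowercase, then split on whitespace.
        return text.lower().split()

print(tokenize("Rust core, Python API"))
```

Callers never see which implementation ran; the API is identical either way, which is exactly what makes "just rewrite the hot path" viable.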
🐍 Python in 2026: It's Not Just a Language Anymore, It's the Runtime of AI

The conversation has shifted. Python isn't just used for AI; it's the infrastructure on which AI operates. Here's what the modern Python + AI stack actually looks like:

🤖 Agentic Frameworks
Tools like LangChain, LlamaIndex, AutoGen, and CrewAI are all Python-first. Multi-agent orchestration, where LLMs plan, delegate, and execute tasks autonomously, is being built almost exclusively in Python.

🔧 Tool Use & Function Calling
Python makes it trivial to wrap any function as a tool for an LLM. Define a function, pass its schema, and your agent calls it. The Anthropic SDK, OpenAI SDK, and Gemini API all have Python as their primary interface.

🧠 RAG Pipelines
Retrieval-Augmented Generation stacks (FAISS, Chroma, Pinecone + LangChain/LlamaIndex) are Python through and through. Building a production RAG pipeline in any other language feels like swimming upstream.

⚡ Async-first Agents
Modern agents run async. Python's asyncio + httpx + streaming APIs make it possible to build responsive, real-time agent pipelines that stream tokens, handle tool calls, and manage memory, all concurrently.

📦 MCP (Model Context Protocol)
The emerging standard for connecting AI models to external tools and data sources? Python SDKs are leading adoption here too.

The engineer who understands Python and how LLMs reason is the most valuable person in the room right now. Not because Python is magic, but because the entire agentic AI ecosystem was built on top of it.

Camerin - Indian Institute Of Upskill Camerin Innovate PVT LTD
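The define-a-function, pass-its-schema flow described above can be sketched without any vendor SDK. The schema below follows the common JSON-Schema style the major providers accept; the tool function, registry, and dispatcher are illustrative stand-ins, not a specific vendor API:

```python
import json

def get_weather(city: str) -> str:
    """Return a canned weather report for a city (stub for illustration)."""
    return f"Sunny in {city}"

# Tool schema in the JSON-Schema style most LLM APIs accept.
weather_tool = {
    "name": "get_weather",
    "description": get_weather.__doc__,
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

# When the model emits a tool call, dispatch on name and parsed arguments.
registry = {"get_weather": get_weather}

def dispatch(tool_call: dict) -> str:
    fn = registry[tool_call["name"]]
    return fn(**json.loads(tool_call["arguments"]))

print(dispatch({"name": "get_weather", "arguments": '{"city": "Paris"}'}))
```

The schema is what you send to the model; the registry and dispatcher are the few lines of glue that turn the model's structured reply back into a real function call.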
🚀 New Release: NTQR Open Source Python Package

I'm excited to share the latest release of NTQR, a Python package for those working at the intersection of AI safety, scalable oversight, and formal verification. NTQR provides a formal framework for reasoning about systems where ground truth is unknown, an increasingly relevant constraint when supervising or composing advanced AI systems. If you're thinking about verifier reliability, adversarial reporting, or Gödel/Löb-style limits in oversight architectures, this package is built with you in mind.

🔍 What's new
- Improved classes for constructing sample statistics variables and their axioms.
- Executable Jupyter notebooks that demonstrate the logic and its algebra.
- Clearer abstractions for computing possible and consistent evaluation sets.

📦 Get started in minutes

pip install ntqr
cd <your-working-directory>
ntqr-docs
cd ntqr_notebooks
jupyter notebook

This will install the package and generate a local set of executable notebooks that:
- Introduce the algebra behind the counting logic
- Demonstrate key constructions
- Demonstrate no-knowledge alarms for misaligned classifiers

💡 Why this matters

As AI systems become more capable, oversight itself must scale, often through other AI systems. But this introduces a core problem: what happens when the systems we rely on for verification are not fully trustworthy, or we do not know the ground truth? When AI judges monitor other AIs, they are often acting as classifiers. Who judges the judges? NTQR helps you make them monitor themselves.

NTQR offers a way to:
- Treat unsupervised evaluation as a logical problem.
- Infer the group evaluations that are logically consistent with the observed agreement and disagreement counts between classifiers.
- Construct no-knowledge alarms for misaligned classifiers using only the counts of how they agree and disagree on a test.

If you're exploring alignment, verification, or theoretical limits of monitoring systems, I'd be very interested in your feedback.

📚 Docs: https://lnkd.in/eugreNDd

#AISafety #ScalableOversight #Alignment #FormalMethods #MachineLearning #Jupyter #Python
Ever wonder whether it is worth porting parts of your Python AI stack to Rust? I suppose that depends on whether you want the same workload to run 15× faster.

The screenshot below is from a real benchmark: same ontology/export workflow, same fixtures, same URI sets, zero field differences in spot checks, but the Rust-backed path ran around 15× faster than the Python implementation.

For me, this is where enterprise AI gets interesting. Python is still the right place to discover the workflow: experimentation, notebooks, orchestration, APIs, and fast iteration. But once an AI system becomes production infrastructure, be it retrieval, parsing, entity resolution, graph construction, ontology processing, validation, ranking, or query execution, the bottleneck often shifts from the model to the machinery around it.

That is where Rust shines. Rust gives you speed, memory safety without a garbage collector, predictable performance, strong compile-time guarantees, and safe concurrency. Those properties matter when you are processing millions of documents, building knowledge graphs, traversing relationships, validating model outputs, and maintaining provenance.

My view:
• Python is where you discover the workflow.
• Rust is where you industrialise the workload.

The answer is not to rewrite everything. It is to keep Python as the ergonomic interface and move the hot paths into Rust. My preferred pattern is Python for usability, Rust for the performance-critical GraphRAG substrate underneath.

In enterprise AI, the model is only one part of the system. The real differentiator is the harness around it. When your workflow includes LLM API calls, can you really afford to wait 15 times longer for a function to complete?
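Before porting anything, it helps to measure which Python functions actually dominate. A minimal sketch using only the standard library's cProfile; the `hot_path` function here is a stand-in for whatever step (parsing, graph construction, validation) you suspect is slow:

```python
import cProfile
import io
import pstats

def hot_path(n: int) -> int:
    # Stand-in for an expensive inner step you might consider porting to Rust.
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
result = hot_path(100_000)
profiler.disable()

# Rank by cumulative time; the top entries are your porting candidates.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())
print("result:", result)
```

If a single function owns most of the cumulative time, it is a good candidate for a Rust extension; if time is spread thinly across many calls, a rewrite of one hot path will not buy the headline speedup.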
SynapseKit: Lean Python Framework Challenges LLM Development Status Quo 🛰️

[TOOLS] SynapseKit offers a minimal, async-native Python framework for LLM apps.

Why it matters: The emergence of minimalist LLM frameworks like SynapseKit signals a maturation in the AI development ecosystem. Developers are increasingly prioritizing control, debuggability, and performance over abstraction, potentially shifting the landscape for production-grade AI applications.

🤔 Will the future of LLM development favor minimalist, high-control frameworks or comprehensive, feature-rich ecosystems?

#LLMFramework #PythonAI #AsyncNative #DeveloperTools #AIEngineering
📡 Follow DailyAIWire for autonomous AI news
🔗 https://lnkd.in/dGRssih6
Posit's AI ecosystem has grown a lot. That's exciting for R and Python developers, but it can also make the starting point less obvious. Which package should you begin with? What is the foundation layer? What should you use for chat in Shiny, querying data in plain English, or building workflows grounded in your own documents? Vedha Viyash wrote this post to make that easier. It walks through what each package in the stack does, how the pieces fit together, and which path makes the most sense depending on what you want to build. The guide should help you spend less time sorting through the ecosystem and more time building with it. 📚 Read it here: https://lnkd.in/d8D3ZfiD #RStats #Python #Posit #AI #DataScience #Shiny #Appsilon
Day-12 Python with AI: Smarter Loops, Better Results

Loops are one of the most fundamental concepts in Python, used to iterate over data and perform repetitive tasks efficiently. When combined with AI, loops become even more powerful, enabling automation, optimization, and intelligent decision-making.

Let's first look at a simple loop without AI:

numbers = [1, 2, 3, 4, 5]
squares = []
for num in numbers:
    squares.append(num ** 2)
print(squares)  # [1, 4, 9, 16, 25]

This works fine for basic operations. But what if we want smarter behavior, like predicting values or making decisions based on patterns? Now let's see how a simple trained model enhances a loop:

from sklearn.linear_model import LinearRegression
import numpy as np

# Training data
X = np.array([[1], [2], [3], [4], [5]])
y = np.array([2, 4, 6, 8, 10])

model = LinearRegression()
model.fit(X, y)

# Using a loop with AI predictions
new_data = [6, 7, 8]
predictions = []
for value in new_data:
    pred = model.predict(np.array([[value]]))
    predictions.append(pred[0])
print(predictions)  # approximately [12.0, 14.0, 16.0]

Benefits of using AI with Python loops:
1. Intelligent Automation: loops can adapt based on data instead of following fixed rules.
2. Time Efficiency: AI reduces manual logic writing by learning patterns automatically.
3. Scalability: handles large datasets with predictive capabilities inside loops.
4. Better Decision Making: loops can incorporate predictions instead of static computations.
5. Real-world Applications: recommendation systems, fraud detection, forecasting, and more.

Conclusion: Traditional loops execute fixed instructions. AI-powered loops learn from data and improve outcomes. Combining Python loops with AI opens the door to smarter and more efficient programming.

#Python #ArtificialIntelligence #MachineLearning #Coding #Programming #AI #Developers
Python is the native language of AI. And yet most Python developers are still not using it for AI work. They are writing scripts, automating tasks, building APIs. All good. But the gap between a Python developer and an AI engineer is smaller than most people think.

Here is what I mean. If you already know Python, you are one library away from building your first machine learning model. Scikit-learn. Done. You are two libraries away from building a chatbot. LangChain plus an LLM API. Done. You are three steps away from deploying it. Docker, a cloud platform, and a basic CI/CD pipeline.

Python has stayed the number one in-demand AI skill for two straight years now. The demand is not slowing down. The developers who will win the next five years are not the ones who know the most. They are the ones who stayed curious and kept building.

What was the first AI thing you ever built with Python? Drop it below.

#Python #AIEngineering #GenerativeAI #MachineLearning #LangChain #GenAI #PythonDeveloper #ArtificialIntelligence #MLOps #TechCareers