Python was the first programming language I learned, but for me it fell by the wayside years ago. I’m now re-learning it specifically because it seems to be a required skill in the new generation of “AI” companies. So - a genuine question for technical folks building AI companies: if your backend is just routing prompts to Anthropic or OpenAI, you're not doing ML. You're doing API calls. So why Python? If you're not training models, if you're not running local inference, if you have no NumPy pipelines or CUDA kernels… why on earth Python? Golang gives you compiled performance, tiny binaries, and dead-simple concurrency. Node/TypeScript unifies your entire engineering team under one language and toolchain. There are plenty of other options. Python made sense once upon a time, but now? Not so sure. If your company adds value while still being essentially an AI passthrough - is your stack choice really a technical decision?
Why Python for AI Companies if Not Training Models
More Relevant Posts
-
Let me see if I can fix this headline for the developers here: "Company that has yet to make a profit and projects significant losses for the foreseeable future buys critical Rust-driven Python toolmaker that most data scientists, data engineers, and software engineers now depend on." If you work in Python, you probably use some tools made by Astral (e.g. `uv`, `Ruff`). While the CEO reaffirms his commitment to their mission of providing high-quality, high-performance tools, I'm not convinced this is a good thing for the Python community. https://lnkd.in/gNVZgSF2
-
Python just lost its crown on GitHub. For the first time, TypeScript is officially the most-used programming language in the world. But the reason why is absolutely wild. It wasn't a human decision. It was an AI decision.

• AI loves rules: TypeScript has strict typing, which makes it far easier for AI tools like GPT-5.5 and Claude to write, debug, and refactor code with fewer mistakes.
• The death of "vibe coding": Python is still king for AI research, but for production software, developers are pivoting to whatever language the AI reads best.

We are officially designing our systems for machines to read, not humans. "AI-legible" is the new standard. If AI tools code 10x faster in TypeScript than in Python, you're going to use TypeScript. It's that simple.

What language do you think AI will force us to adopt next?
-
This isn't a Golang vs Python tiff. When you are building for scale, a small tech decision can make or break your finances. Especially when the margins are very thin. RapidaAI
"Why Go? The entire AI ecosystem is in Python." Every CTO evaluating Rapida asks this. It is the right question. Here is why we made that tradeoff.

A voice call processes 50 audio frames per second. Each frame is 20ms. If your runtime pauses for 10ms to collect garbage, that is an audible glitch. Not a metric. A glitch your user hears. Python's GC can hit 10-50ms pauses under allocation-heavy load. Go's GC typically stays in microsecond-level pauses. That is the entire argument in one line.

But the numbers go deeper. Every concurrent call in our pipeline spawns 24 goroutines at peak: four priority dispatchers, RNNoise denoiser, Silero VAD, STT streamer, LLM streamer, TTS streamer, recording, session manager, transport handler, lifecycle hooks, and auxiliary workers. Goroutines start at KB-scale stacks. Python concurrency, whether threads or async, carries significantly higher overhead per task.

We benchmarked both on a c8gn.2xlarge (8 vCPU, 16 GiB). At 487 concurrent calls:
- Total RSS. Go: 461 MB. Python: 4.53 GB.
- CPU. Go: 54%. Python: 93%. Python also serializes CPU-bound work under the GIL.
- Heap allocs on the hot path. Go: 0 (sync.Pool). Python: 3-5 objects per frame.

At 1,843 concurrent calls, Python needs 15.8 GB. Go does it in 1.63 GB. On the same hardware.

Yes, we gave up the Python ecosystem. Every LLM library, every STT SDK, every sample repo. We built 38 provider integrations from scratch. 12 STT. 15 TTS. 11 LLM. That cost was real. But when your runtime is processing 24,000 audio frames per second, the language is not an abstraction you can swap later. It is the foundation everything else sits on.

If this is useful, star the repo. It helps more engineers find it. https://lnkd.in/gqhX6RHN
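The per-task overhead point is easy to poke at from the Python side. A minimal sketch, not a benchmark: it only shows asyncio parking hundreds of call-sized tasks on one event loop (the stated RSS and GC-pause numbers come from the post's own measurements, not from this snippet):

```python
import asyncio

async def fake_call(stop: asyncio.Event) -> None:
    # Stand-in for one concurrent voice-call session task.
    await stop.wait()

async def main(n: int = 487) -> int:
    stop = asyncio.Event()
    tasks = [asyncio.create_task(fake_call(stop)) for _ in range(n)]
    await asyncio.sleep(0)  # yield once so every task starts and parks
    alive = sum(1 for t in tasks if not t.done())
    stop.set()
    await asyncio.gather(*tasks)
    return alive

if __name__ == "__main__":
    print(asyncio.run(main()))  # → 487 tasks concurrently alive
```

Each parked coroutine here is cheap, but a real call task holds buffers, sockets, and per-frame state, which is where the multi-GB gap in the benchmark comes from.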
-
RLHF is evolving toward harness feedback.

We’ve spent the last few years duct-taping LLMs together with Python. Prompts, retry loops, tool wrappers, control flow. Useful, but most of the logic lived in code, not in the model. What’s changing is where the model learns from.

Pre-training gave models language. RLHF (reinforcement learning from human feedback) grounded them in human judgment. RLAIF (reinforcement learning from AI feedback) scaled that signal using models to evaluate models. Now we are seeing a third source of feedback: harness feedback. The source of feedback is expanding from humans, to models, to environments.

Think of a codebase with tests, a math verifier, or a sandbox where each step must actually work. This is not a single reward at the end. It is an execution trace:
- A failed test
- A compiler error
- An invalid sequence of actions
- A constraint violation

The model sees what happened at each step. On the surface, this looks like standard RL with an environment. The difference is how much of the trajectory the model gets to see. The environment exposes failure and progress step by step. That changes what the model learns: it learns which trajectories hold up inside a real system.

This shows up in both training and inference. During training, the harness provides dense feedback over multiple rollouts. During inference, the same environment validates steps and filters out bad paths. The same harness shapes the model during training and constrains it during execution.

The unit of learning shifts from isolated outputs to full trajectories. Each attempt, failure, correction, and completion contributes signal. As this continues, more of the logic we currently write around models gets absorbed into the model itself.
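A toy harness makes the idea concrete. This sketch (all names hypothetical) runs two candidate "rollouts" against a test set, records a step-level trace rather than a single end reward, and keeps only the trajectory that holds up end to end:

```python
from typing import Callable

def run_harness(candidate: Callable[[int], int],
                tests: list[tuple[int, int]]) -> list[dict]:
    """Execute a candidate against each test, recording per-step feedback."""
    trace = []
    for x, expected in tests:
        try:
            got = candidate(x)
            trace.append({"input": x, "ok": got == expected, "got": got})
        except Exception as e:  # an execution error is itself a signal
            trace.append({"input": x, "ok": False, "error": repr(e)})
    return trace

# Two "model rollouts" attempting to double the absolute value of x.
rollouts = {
    "buggy":   lambda x: x * 2,                        # wrong for negatives
    "correct": lambda x: x + x if x >= 0 else -2 * x,
}
tests = [(2, 4), (-3, 6)]

# Keep only trajectories where every step held up inside the harness.
survivors = [name for name, fn in rollouts.items()
             if all(step["ok"] for step in run_harness(fn, tests))]
print(survivors)  # → ['correct']
```

The same `run_harness` serves both roles described above: during training it supplies dense per-step feedback; during inference it filters out rollouts whose trace contains a failure.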
-
Rust-based AI frameworks use 5x less memory than their Python equivalents. That's from the 2026 AI Agent Benchmark. And the trend keeps accelerating.

𝗧𝗵𝗲 𝗽𝗮𝘁𝘁𝗲𝗿𝗻
The most impactful Python tools in AI are already written in Rust under the hood:
👉🏽 Hugging Face Tokenizers: Rust core, Python bindings
👉🏽 Polars: Rust core, Python API
👉🏽 Ruff: Rust linter, 10-100x faster than Flake8
👉🏽 Pydantic Monty: Rust interpreter for safe LLM code execution
👉🏽 uv: Rust package manager, replaced pip for most of us

The playbook is the same every time: write the performance-critical parts in Rust, expose a Python API with PyO3. Users get Python ergonomics with Rust performance.

𝗪𝗵𝘆 𝘁𝗵𝗶𝘀 𝗺𝗮𝘁𝘁𝗲𝗿𝘀 𝗳𝗼𝗿 𝗔𝗜
AI agents run lots of tools, process lots of data, and keep lots of state. Memory matters. Latency matters. When you're spinning up hundreds of agent instances, 5x memory savings is the difference between one server and five. xAI fully transitioned their AI infrastructure to Rust. That's a strong signal from a company running models at massive scale.

𝗧𝗵𝗲 𝗼𝗽𝗽𝗼𝗿𝘁𝘂𝗻𝗶𝘁𝘆
If you know both Python and Rust, you're in a rare position. Most AI engineers only know Python. Most Rust developers don't work in AI. The intersection is small and getting more valuable. You don't need to rewrite everything in Rust. Just the hot paths.

𝘋𝘰 𝘺𝘰𝘶 𝘶𝘴𝘦 𝘢𝘯𝘺 𝘙𝘶𝘴𝘵-𝘣𝘢𝘤𝘬𝘦𝘥 𝘗𝘺𝘵𝘩𝘰𝘯 𝘵𝘰𝘰𝘭𝘴?
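The packaging shape behind that playbook is worth seeing once. A hedged sketch of the usual try-the-compiled-core, fall-back-to-pure-Python pattern (as used by projects like pydantic); `_fastcore` is a made-up module name, not a real package:

```python
# Typical Rust-backed package layout: if the wheel shipped a compiled
# (PyO3) extension, use it; otherwise fall back to pure Python with the
# same behavior. `_fastcore` is hypothetical and will not import here.
try:
    from _fastcore import tokenize  # Rust hot path, if available
except ImportError:
    def tokenize(text: str) -> list[str]:
        # Pure-Python fallback: identical results, just slower.
        return text.lower().split()

print(tokenize("Rust core Python API"))  # → ['rust', 'core', 'python', 'api']
```

Callers import one name and never care which implementation they got, which is exactly why "rewrite just the hot paths" works without breaking the Python API.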
-
Building with LLMs in Python involves a lot of moving parts. You need to call model APIs, write prompts that produce reliable results, set up retrieval pipelines, and eventually build agents that can reason and use tools. We put together a learning path that walks through all of it, step by step:

- Call LLM APIs from OpenAI, Ollama, and OpenRouter
- Write effective prompts that return structured output
- Build RAG pipelines with LlamaIndex, ChromaDB, and LangChain
- Create AI agents using Pydantic AI and LangGraph
- Connect agents to external tools and data via MCP

It's aimed at Python developers who are comfortable with the language and want to start building real applications on top of language models. https://lnkd.in/ggdqNgNu
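As a taste of the structured-output step: model replies often arrive wrapped in markdown fences, so a small tolerant parser helps before validation. The reply below is a stub standing in for a real API response (no network call, no specific provider SDK assumed):

```python
import json

SYSTEM = ('Reply with JSON only, matching '
          '{"title": str, "tags": [str]} and nothing else.')

def parse_structured(raw: str) -> dict:
    """Extract the first JSON object from a model reply, tolerating fences."""
    raw = raw.strip()
    if raw.startswith("```"):
        raw = raw.strip("`").removeprefix("json").strip()
    start, end = raw.find("{"), raw.rfind("}")
    return json.loads(raw[start:end + 1])

# Stub standing in for a chat-completion call; a real client would send
# SYSTEM plus the user prompt to OpenAI, Ollama, or OpenRouter.
fake_reply = '```json\n{"title": "RAG intro", "tags": ["python", "llm"]}\n```'
doc = parse_structured(fake_reply)
print(doc["tags"])  # → ['python', 'llm']
```

In practice you would hand the parsed dict to a Pydantic model for validation, which is where the "structured output" step of the learning path picks up.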
-
Is Python finally getting a real competitor?

For years, the Python programming language has dominated everything from AI to backend to scripting — largely because of its simplicity, readability, and massive ecosystem.

But something interesting is happening… 👀 A new wave of languages and tools is emerging that challenges Python’s biggest weakness:
👉 the performance vs. productivity trade-off

The idea isn’t to “kill Python” — it’s to reimagine what a modern language should feel like:
✔️ As easy as Python
✔️ As fast as C/C++
✔️ Built for AI-first workflows
✔️ Better developer ergonomics

And honestly… this shift was inevitable. Python was designed in the late 80s to be fun and easy to use. But today’s world demands:
⚡ Real-time AI systems
⚡ High-performance computing
⚡ Massive-scale data pipelines

So the big question is:
👉 Will Python evolve fast enough?
👉 Or will the next-gen language take over developer mindshare?

💡 My take: Python isn’t going anywhere. But the monopoly? That might be ending. We’re entering a multi-language era, where developers pick tools based on speed, scalability, and developer experience. And that’s actually a good thing. Because competition doesn’t kill ecosystems… 👉 It makes them better.

🔥 Curious to hear your thoughts: Do you think Python will still dominate in 5 years?

#Python #Programming #AI #SoftwareDevelopment #TechTrends #Developers #Coding #MachineLearning #FutureOfWork #Innovation
-
Some time ago I learned a bit of Python -- just to see what the other half of the world was doing. And here is what it showed me. No, this time I mean really, it did show me something:

In programming languages everything must be precise, but you have all the time in the world (depending on how hard your boss is breathing down your neck). In natural languages, perfection is optional, but timing is ruthless.

Example?
"Me want sandwich."
"Would it be possible to have a sandwich, please?"

Both work. Neither is "wrong". People won't scream ERROR. (Nice people.) You'll be fed. That's the best thing about a natural language. That's the worst thing about a natural language. Timing and context decide.
-
Python is the native language of AI. And yet most Python developers are still not using it for AI work. They are writing scripts, automating tasks, building APIs. All good. But the gap between a Python developer and an AI engineer is smaller than most people think.

Here is what I mean. If you already know Python, you are one library away from building your first machine learning model. Scikit-learn. Done. You are two libraries away from building a chatbot. LangChain plus an LLM API. Done. You are three steps away from deploying it. Docker, a cloud platform, and a basic CI/CD pipeline.

Python has stayed the number one in-demand AI skill for two straight years now. The demand is not slowing down. The developers who will win the next five years are not the ones who know the most. They are the ones who stayed curious and kept building.

What was the first AI thing you ever built with Python? Drop it below.

#Python #AIEngineering #GenerativeAI #MachineLearning #LangChain #GenAI #PythonDeveloper #ArtificialIntelligence #MLOps #TechCareers
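For a sense of how small that first step is: the core idea behind a first classifier fits in a few lines of plain Python. scikit-learn's `KNeighborsClassifier` does the same job (and far more) with just `fit()` and `predict()`; this dependency-free sketch only illustrates the concept:

```python
def nearest_neighbor(train: list[tuple[list[float], str]],
                     point: list[float]) -> str:
    """Classify a point by the label of its closest training example."""
    def dist2(a, b):
        # Squared Euclidean distance; no sqrt needed for comparison.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda example: dist2(example[0], point))[1]

# Toy data: two clusters of 2-D points with labels.
train = [([0.0, 0.0], "cat"), ([0.1, 0.2], "cat"),
         ([5.0, 5.0], "dog"), ([5.2, 4.9], "dog")]
print(nearest_neighbor(train, [4.8, 5.1]))  # → dog
```

Swapping this for `KNeighborsClassifier(n_neighbors=1)` on the same data is a two-line change, which is the "one library away" point in practice.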
-
We've released an update to our Python library: it now supports realtime publishing and, in particular, message publishing via a stream of append operations, which is what you need to support streamed LLM responses with Ably's AI Transport. Read more on the Ably blog: https://lnkd.in/e59eWfVc
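The append-operations idea in one toy sketch. `AppendStream` is a stand-in, not the Ably API (the real library's names and signatures differ): each streamed LLM delta becomes one append operation on a message, so subscribers see the response grow token by token instead of waiting for the full reply:

```python
class AppendStream:
    """Toy stand-in for a realtime message that supports append operations."""
    def __init__(self) -> None:
        self.message = ""
        self.ops = 0

    def append(self, chunk: str) -> None:
        # Subscribers would receive each delta the moment it lands.
        self.message += chunk
        self.ops += 1

def relay_llm_stream(chunks, stream: AppendStream) -> str:
    # Each token/delta from the model becomes one append op on the channel.
    for chunk in chunks:
        stream.append(chunk)
    return stream.message

s = AppendStream()
print(relay_llm_stream(["Hello", ", ", "world"], s))  # → Hello, world
```

The key property is that the publisher never buffers the whole response: three deltas in means three append ops out, which is what makes the pattern fit streamed LLM output.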