Python gained a natural first-mover advantage in AI agent development that wasn't entirely earned. Python is a great language whose intuitiveness and low ceremony are an asset to ML, but while ML is about computation and experimentation, AI is about context and structure. That is why statically typed languages proven in the enterprise, like Java, Kotlin, C#, and TypeScript, are better suited to AI than Python.

But what if we didn't have to choose? After all, the prerequisite for successful AI is a data strategy with the governance to know everything you have and how to access it. It would be amazing to leverage the rich Python ecosystem, including the vast library of Hugging Face models and its outstanding Transformers framework, to implement that strategy in a way that integrates seamlessly with a more enterprise-friendly technology like Java.

We're getting there. This GraalPy Spring Boot Summarization Demo on GitHub (link in comments) shows how you can leverage GraalPy to run the #Python libraries markitdown and Transformers, along with the HuggingFaceTB/SmolLM2-360M model, to process PDFs in a Spring Boot app written in #Java. This is super cool and I can't wait to see what's next.
How to combine Python and Java for AI development
More Relevant Posts
Java or Python for Building Agents? AI success isn’t about picking the trendiest language — it’s about enabling your people. Python, Java, C#, JavaScript… what matters most is letting your team build with the tools they know best. Talent + pragmatic tech decisions = competitive advantage.
Python — Asyncio: Write Faster I/O Without Threads

Want concurrency without the headache of threads? If you've ever tried using threads in Python to speed up your program, you probably ran into synchronization issues, race conditions, or the infamous Global Interpreter Lock (GIL). Fortunately, Python offers a cleaner and more efficient way to achieve concurrency: asyncio.

Asyncio introduces an event loop, which enables cooperative multitasking. Instead of running multiple threads that compete for CPU time, asyncio allows your program to pause one task while it waits on I/O (like a network request or file read) and resume another in the meantime. This happens seamlessly using the await keyword: when a coroutine (an async function) hits an await, it yields control back to the event loop. Your program isn't blocked while waiting for data to arrive — it's busy doing something else useful.

That's why asyncio is perfect for I/O-bound applications such as web servers, API clients, chat apps, or database connectors. You can scale to thousands of concurrent tasks without creating thousands of threads, making it far more memory-efficient. Combine it with libraries like aiohttp for async web requests or asyncpg for PostgreSQL database operations, and you'll see dramatic performance improvements.

Here's the catch: asyncio isn't magic for everything. It won't speed up CPU-bound workloads like image processing or complex calculations, since everything runs in a single thread (and the GIL still applies). For those, use the multiprocessing module or offload heavy work to a separate process or thread pool.

To keep your async code elegant and bug-free:
• Use async with and async for consistently.
• Avoid mixing sync and async code arbitrarily.
• Wrap blocking calls (like traditional file I/O or CPU work) with run_in_executor() to prevent freezing the event loop.

Asyncio gives you concurrency with clarity — no tangled threads, no shared-state chaos, just smooth, cooperative execution.
Pro Tip: Don’t fight the event loop—embrace it. Follow and subscribe to my newsletter for more practical Python performance tips and async coding patterns. #Python #Programming #Asyncio #Threading #Concurrency #Pythonperformance
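The pattern from the tips above can be sketched with the standard library alone. Here, asyncio.sleep stands in for a real network call (with aiohttp you would await a session request instead), and run_in_executor() pushes a blocking function onto a thread pool so it doesn't freeze the event loop:

```python
import asyncio
import time

async def fetch(name: str, delay: float) -> str:
    # Simulated network call: await yields control to the event loop,
    # so other tasks make progress while this one waits.
    await asyncio.sleep(delay)
    return f"{name} done"

def blocking_read() -> str:
    # A traditional blocking call (e.g. legacy file I/O).
    time.sleep(0.1)
    return "blocking done"

async def main() -> list[str]:
    # Schedule several I/O-bound coroutines concurrently...
    tasks = [fetch(f"req-{i}", 0.1) for i in range(3)]
    # ...and wrap the blocking call so it runs in a worker thread
    # instead of stalling the loop.
    loop = asyncio.get_running_loop()
    blocked = loop.run_in_executor(None, blocking_read)
    # gather preserves argument order in its result list.
    return await asyncio.gather(*tasks, blocked)

if __name__ == "__main__":
    print(asyncio.run(main()))
```

All four waits overlap, so the whole run takes roughly the longest single delay rather than the sum of them.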
Python 3.14 Is Officially Released!

The wait is over — Python 3.14 is here! Released on October 7, 2025, this version brings one of the most exciting updates in recent years, focusing on performance, concurrency, and developer experience. Here’s what makes this release special:

• Free-threading mode — a major step toward removing the Global Interpreter Lock (GIL) and enabling true multi-core parallelism.
• Template string literals — a cleaner, safer way to build strings dynamically.
• Deferred evaluation of type annotations — improves startup performance and avoids circular imports.
• Plus: faster interpreter startup, improved debugging, and stronger security.

I’ve written a detailed Medium blog explaining everything about this release — what’s new, why it matters, and whether you should upgrade right now.
👉 Read the full blog here: https://lnkd.in/grbdkK8W

Python 3.14 marks the beginning of a new era — a more scalable, concurrent, and performance-driven Python for developers and AI enthusiasts alike. What are your thoughts on Python 3.14? Are you planning to test it or wait for full GIL removal in future versions?

#Python #Python314 #Programming #AI #ML #WebDevelopment #SoftwareEngineering #MediumBlog
𝗣𝘆𝘁𝗵𝗼𝗻 𝟯.𝟭𝟰: 𝗙𝗶𝗻𝗮𝗹𝗹𝘆, 𝗬𝗼𝘂 𝗖𝗮𝗻 𝗗𝗶𝘀𝗮𝗯𝗹𝗲 𝘁𝗵𝗲 𝗚𝗜𝗟!

Big news for Python devs: Python 3.14 lets you turn off the Global Interpreter Lock (GIL) — a historic step for the language.

---

What’s the GIL? The Global Interpreter Lock prevents true multi-threading in standard Python: even with multiple threads, only one executes Python code at a time. It’s been a pain point for devs building high-performance or parallel apps.

What’s new in Python 3.14?
• You can now run Python without the GIL!
• Multiple threads can finally run real Python code in parallel on multiple CPU cores.

Which means...
• Multi-threaded code (e.g., concurrent web servers, data crunching, agent apps) gets a major speedup: no more C extensions or hacks needed.
• You can make better use of multi-core hardware, just like Java, C++, and Go.

---

How to use it (very simply):
• With Python 3.14, the default interpreter build remains the traditional GIL-enabled version, so existing Python code and libraries work as before.
• If you’re working on new parallel or CPU-bound threading workloads, you can optionally install or build the free-threaded (GIL-disabled) version of Python.

Caveats: Not all third-party libraries are fully compatible with the GIL-free build yet. Also, single-threaded workloads may run slightly slower in this build, so the benefit is primarily for multi-threaded, core-saturating tasks.

---

Overall: Python 3.14 lets you choose between classic simplicity and full-power concurrency. It makes Python more future-proof for fast, modern applications.

♻️ Share it with your network if you find it useful, and follow Mayank Sultania for more practical AI tips. Video by: DailyDoseofDS.com

#Python #Concurrency #GIL #Python314 #Developers #Performance
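A quick, portable way to check which build you're on from inside Python — a minimal sketch using only real stdlib APIs (`sys._is_gil_enabled()` was added in 3.13, so the code guards for older versions):

```python
import sys
import sysconfig

def gil_status() -> str:
    # Py_GIL_DISABLED is 1 only in free-threaded builds (3.13+);
    # on a standard build it is 0 or None.
    if not sysconfig.get_config_var("Py_GIL_DISABLED"):
        return "standard build (GIL always on)"
    # Even in a free-threaded build, the GIL can be re-enabled at
    # runtime, e.g. when an incompatible C extension is loaded.
    if getattr(sys, "_is_gil_enabled", lambda: True)():
        return "free-threaded build, but GIL currently enabled"
    return "free-threaded build, GIL disabled"

print(gil_status())
```

Handy for logging at startup, so you know whether your threads can actually run in parallel.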
For decades, the Global Interpreter Lock (GIL) has been both a blessing and a bottleneck — keeping Python simple and safe for single-threaded programs, but limiting its ability to scale across multiple CPU cores. Now that Python 3.14 officially supports a GIL-free (free-threaded) build, the language takes a massive leap toward true parallelism. That means faster performance, better concurrency, and a future where Python can finally start to close the performance gap with C++, Java, and Rust in multi-threaded workloads.

In my latest article for Towards Data Science, I break down:
- How to get your hands on GIL-free Python
- What “no GIL” actually means in practice
- How it impacts performance, libraries, and existing code, using several examples
- Why this marks a new era for AI, data science, and web development in Python

Check it out here 👉 https://lnkd.in/e4TTMphY
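A minimal sketch of the kind of workload this affects: CPU-bound work split across threads. The result is identical on both builds, but on a GIL build the threads take turns, while on a free-threaded build they can saturate separate cores and the wall-clock time drops:

```python
import threading

def count_primes(start: int, stop: int) -> int:
    # Naive CPU-bound work; no I/O, so the GIL never gets released
    # voluntarily on a standard build.
    found = 0
    for n in range(start, stop):
        if n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1)):
            found += 1
    return found

def threaded_count(stop: int, workers: int = 4) -> int:
    # Split the range into equal chunks, one per thread.
    results = [0] * workers
    chunk = stop // workers

    def worker(i: int) -> None:
        results[i] = count_primes(i * chunk, (i + 1) * chunk)

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(results)

print(threaded_count(10_000))
```

Wrapping both the single-threaded and threaded versions with time.perf_counter() is an easy way to see whether your interpreter actually parallelizes this.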
🤖 AI Agent as a runtime I'm testing a shift in how I think about AI coding agents: treat them as a runtime environment for markdown. Write your intent in markdown. Let the AI agent execute it—just like Java runs on the JVM or Python runs on an interpreter. The AI reads the markdown, generates whatever code is needed (shell, Python, Node.js), and runs it. Need to call another script? Just say "run the xyz.sh script"—it works like an import statement. As long as the output is stable, the implementation language doesn't matter. Markdown as your source code. AI as your runtime. Simple as that.
Ever made your Python code “concurrent” and still felt it running like a snail? Yeah… same. 😅 That’s because Python gives you multiple ways to do concurrency — but each one solves a different problem. Earlier I was using native threads, but soon realized there are now better tools for coordination. Let’s make this simple 👇

🧵 1️⃣ ThreadPoolExecutor — when your code is mostly waiting
Think of API calls, database queries, or file reads. Threads let your program handle multiple “waiting” tasks at once instead of blocking everything.
👉 It’s like calling 10 people and putting them all on hold together instead of one by one.

⚙️ 2️⃣ ProcessPoolExecutor — when your code is burning CPU
This one actually uses multiple cores. Great for math-heavy or data-processing tasks where the CPU is sweating.
👉 It’s like having 4 chefs in 4 kitchens instead of 1 chef doing everything alone.

⚡ 3️⃣ asyncio.create_task — when you want async I/O magic
This is pure event-loop goodness — no threads, no processes. It just switches super fast between tasks that are waiting on I/O.
👉 It’s like cooking multiple dishes — while one simmers, you chop the veggies for the next.

🔄 4️⃣ asyncio.to_thread — when async meets blocking code
Sometimes you’re in an async app but still need to run an old, blocking function. This lets you sneak it into a background thread without freezing everything.
👉 It’s like having someone else hold your place in line while you grab a coffee. ☕

💡 Quick guide:
🧵 ThreadPoolExecutor → I/O bound (lots of waiting)
⚙️ ProcessPoolExecutor → CPU bound (heavy lifting)
⚡ asyncio.create_task → Async I/O (event-loop magic)
🔄 asyncio.to_thread → Mix sync with async safely

Threads handle waiting. Processes handle computation. Async handles coordination. Once you get this mental model, concurrency in Python finally clicks.

#concurrency #python #asynchronous #learning
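Three of the four tools above in one self-contained sketch (ProcessPoolExecutor is omitted to keep this single-process, but it has the same `pool.map` shape; time.sleep and asyncio.sleep stand in for real blocking and awaitable I/O):

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def slow_io(x: int) -> int:
    # Stands in for a blocking call (API request, file read, ...).
    time.sleep(0.05)
    return x * 2

async def async_io(x: int) -> int:
    # Stands in for an awaitable I/O call (e.g. an aiohttp request).
    await asyncio.sleep(0.05)
    return x * 2

def with_threads(items: list[int]) -> list[int]:
    # 1️⃣ ThreadPoolExecutor: many blocking waits held "on hold" at once.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(slow_io, items))

async def with_asyncio(items: list[int]) -> list[int]:
    # 3️⃣ asyncio.create_task: schedule coroutines on the event loop...
    tasks = [asyncio.create_task(async_io(x)) for x in items]
    # 4️⃣ asyncio.to_thread: ...and sneak a blocking call into a
    # background thread without freezing the loop.
    extra = asyncio.create_task(asyncio.to_thread(slow_io, 100))
    return await asyncio.gather(*tasks) + [await extra]

print(with_threads([1, 2, 3]))
print(asyncio.run(with_asyncio([1, 2, 3])))
```

Both paths overlap their waits, so each batch finishes in roughly one 0.05 s tick instead of one tick per item.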
#AI app with #JAVA or #PYTHON? 🤔 As a Java developer, I often wondered: why not build an AI app in Java instead of Python? Recently, I got a chance to explore this question while working on an AI-based project. My first choice was Java, but my senior suggested going with Python. As the project progressed, I realised it was the right call. The AI ecosystem, from libraries and frameworks to community support, is far more mature and developer-friendly in Python. 💡 I’m curious, what’s your take? Can Java bridge the gap, or will Python continue to dominate the AI space?
Why Python? Because it's the Swiss Army knife of programming languages! While Java makes you write novels, Python lets you write poetry. Where JavaScript gets tangled in its own syntax, Python stays clean and readable. R is brilliant for statistics but struggles beyond data analysis, and SQL speaks only to databases. Python bridges ALL these gaps: • Web development? ✅ (Django, Flask) • Data science? ✅ (Pandas, NumPy) • Automation? ✅ (Beautiful scripting) • Machine learning? ✅ (TensorFlow, PyTorch) • Database work? ✅ (SQLAlchemy integration) Here's the kicker: major companies like Google, Netflix, and Instagram run on Python. It's not just beginner-friendly—it's industry-proven. Ready to join the Python revolution? Your future self will thank you for choosing the language that grows with your ambitions! #PythonProgramming #CodingJourney #MediaPilot
A couple of days ago, I dived deep into Python’s concurrency model — and honestly, it cleared up a lot of the confusion I had around parallelism vs. concurrency from back when I was trying to understand these concepts through projects. Here’s what I learned:

🧠 Multiprocessing: Runs multiple processes in parallel, each with its own Python interpreter. Great for CPU-heavy tasks, since it bypasses the Global Interpreter Lock (GIL) by using multiple cores.

🧵 Threading: Multiple threads run within a single process. Works well for I/O-heavy tasks like file downloads, but is still limited by the GIL for CPU-bound operations.

🌐 AsyncIO: Perfect for handling many I/O-bound tasks concurrently on a single thread — ideal for API calls or handling many connections simultaneously.

🔒 Global Interpreter Lock (GIL): This was fascinating to learn about. It exists largely for historical reasons: it keeps memory management (reference counting) simple in CPython, the reference implementation written in C, and makes C extensions easier to integrate safely. Essentially, it allows only one thread to execute Python bytecode at a time, simplifying memory management but limiting true parallelism.

And guess what? I already applied these concepts a few years ago in a side project: a YouTube Playlist Downloader that uses thread-based parallelism for faster downloads. 🖥️ Check it out here: https://lnkd.in/gsSKQV_X

It was such a fun project to work on, and it made all these concurrency concepts finally click in practice. I’m revisiting them now just to brush up.

#python #learning
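The thread-based download pattern mentioned above can be sketched like this. The `download()` function and the example URLs are placeholders (time.sleep stands in for waiting on bytes); the key point is that the GIL is released while a thread blocks on I/O, which is why threads speed up downloads despite the GIL:

```python
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def download(url: str) -> str:
    # Placeholder for a real network download. While a thread sleeps
    # (or waits on a socket), the GIL is released, so other downloads
    # proceed concurrently.
    time.sleep(0.05)
    return f"saved:{url}"

urls = [f"https://example.com/video/{i}" for i in range(6)]

with ThreadPoolExecutor(max_workers=3) as pool:
    # Submit every download, then collect results as each finishes.
    futures = {pool.submit(download, u): u for u in urls}
    results = [f.result() for f in as_completed(futures)]

print(sorted(results))
```

With 3 workers and 6 URLs, the downloads complete in roughly two waves instead of six sequential waits.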
The GitHub repo as promised: https://github.com/fniephaus/graalpy-spring-boot-summarize