Async Def in FastAPI: Understanding the Performance Myth

I used to think async def = concurrency. Turns out it's a promise I have to keep. Here's what clicked for me about FastAPI performance 👇

First, the mental model I had wrong: I thought async def meant "spread 40 requests across 40 threads and run them in parallel." Nope. Async doesn't use threads at all. The event loop runs everything on one thread and rapidly switches between requests at every await point. The actual waiting (DB, network) happens outside Python, so thousands of requests can be "in-flight" on a single thread.

❌ Wrong: 40 requests → 40 threads in parallel
✅ Right: 40 requests → 1 thread juggling them at every await

Now the trap I almost fell into: if your endpoint makes a blocking call (like requests.get() or a sync DB query), using async def is actually worse than plain def.

🔹 Plain def: FastAPI offloads it to a threadpool (~40 threads). Slow requests run on separate threads and the event loop stays free. Free concurrency. ✅
🔹 async def with blocking code inside: there's no await to pause at → the one thread freezes → the entire event loop dies. Every other request waits. ❌

My decision guide now:
🔸 No I/O? → def
🔸 I/O with async libraries (httpx, asyncpg)? → async def + await
🔸 I/O but only blocking libraries (requests, psycopg2)? → def
🔸 Heavy CPU work? → neither. Use multiple workers or a task queue.

The golden rule: async def is a promise to the event loop that you'll only do non-blocking work. If you can't keep that promise, don't make it.

The trap: devs see "async = fast" and slap async def everywhere. But async without a real await is just a slower version of sync. Write async only when you can actually be async. 😀

#Python #FastAPI #WebDevelopment #BackendEngineering
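You can see the "one frozen thread kills everything" effect with plain asyncio, no FastAPI required. This is a minimal sketch: five coroutines that call blocking time.sleep() run one after another because nothing yields to the event loop, while five that await asyncio.sleep() overlap. The function names here are mine, purely for illustration.

```python
import asyncio
import time

async def blocking_task():
    # time.sleep() is a blocking call: it never yields to the event loop,
    # so no other coroutine can run while it sleeps.
    time.sleep(0.2)

async def nonblocking_task():
    # asyncio.sleep() hands control back to the event loop at the await,
    # so other coroutines run during the wait.
    await asyncio.sleep(0.2)

async def run_many(task_factory, n=5):
    # Launch n copies concurrently and time how long the batch takes.
    start = time.perf_counter()
    await asyncio.gather(*(task_factory() for _ in range(n)))
    return time.perf_counter() - start

blocking_elapsed = asyncio.run(run_many(blocking_task))        # ~1.0s: fully serialized
nonblocking_elapsed = asyncio.run(run_many(nonblocking_task))  # ~0.2s: truly concurrent
print(f"blocking: {blocking_elapsed:.2f}s, non-blocking: {nonblocking_elapsed:.2f}s")
```

Same event loop, same single thread — the only difference is whether the wait happens at an await point.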
