
Python async/await: A Practical Guide for Backend Developers

Understand how Python's async/await works, when to use asyncio, and how to write concurrent backend code without the confusion of threads.

#python #backend #async #performance


If you’ve written Python for any amount of time, you’ve probably seen async def and await and wondered exactly what they do. This guide explains Python async/await from first principles and shows where it’s actually useful in backend development.

The Problem: Waiting Is Expensive

Most backend code spends a lot of time waiting — for a database query, an HTTP call, a file read. During that wait, a normal synchronous function blocks the entire thread.

With async/await, your code can do other work while waiting for slow operations. One thread can handle thousands of concurrent operations as long as they’re I/O bound.

The Event Loop

asyncio runs your code inside an event loop — a loop that checks what work is ready to run. When an async function hits an await, it hands control back to the event loop, which can run another coroutine while the first waits.

import asyncio

async def fetch_user(user_id: int) -> dict:
    await asyncio.sleep(0.1)  # simulates a database query
    return {"id": user_id, "name": "Alice"}

async def main():
    user = await fetch_user(1)
    print(user)

asyncio.run(main())

asyncio.run() creates the event loop, runs main(), and closes it when done.

Coroutines vs Functions

async def defines a coroutine function. Calling it returns a coroutine object — it doesn’t execute immediately:

result = fetch_user(1)       # Nothing runs yet — just a coroutine object
result = await fetch_user(1) # Now it runs

Always await coroutines (or schedule them as tasks); otherwise they never run.
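To make that distinction concrete, here's a minimal sketch (fetch_user is the illustrative function from earlier, with the delay shortened so it runs instantly):

```python
import asyncio

async def fetch_user(user_id: int) -> dict:
    await asyncio.sleep(0)  # stand-in for real I/O
    return {"id": user_id, "name": "Alice"}

coro = fetch_user(1)
print(type(coro).__name__)  # the call only produced a coroutine object; nothing ran
result = asyncio.run(coro)  # handing it to the event loop actually executes it
print(result)
```

The first print shows "coroutine" — the function body hasn't started. Only when the event loop drives the coroutine does the body execute and return the dict.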

Running Tasks Concurrently

The real power comes from running multiple I/O operations at the same time:

import asyncio
import httpx

async def fetch_url(client: httpx.AsyncClient, url: str) -> str:
    response = await client.get(url)
    return response.text

async def main():
    urls = [
        "https://api.example.com/users/1",
        "https://api.example.com/users/2",
        "https://api.example.com/users/3",
    ]
    
    async with httpx.AsyncClient() as client:
        # Run all three requests concurrently
        results = await asyncio.gather(*[fetch_url(client, url) for url in urls])
    
    return results

Without asyncio.gather, these three requests would run sequentially. With it, they run concurrently — if each takes 200ms, the total is ~200ms instead of ~600ms.
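You can verify that timing claim with a self-contained sketch that uses asyncio.sleep as a stand-in for a 200ms network call (slow_op and the delay values are illustrative):

```python
import asyncio
import time

async def slow_op(i: int) -> int:
    await asyncio.sleep(0.2)  # stand-in for a 200ms network call
    return i

async def main():
    start = time.perf_counter()
    # Three 0.2s waits overlap, so the total is ~0.2s, not ~0.6s
    results = await asyncio.gather(*(slow_op(i) for i in range(3)))
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results, f"{elapsed:.2f}s")
```

gather also preserves order: results come back in the order the coroutines were passed in, not the order they finished.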

asyncio.gather vs asyncio.TaskGroup

In Python 3.11+, TaskGroup is the preferred way to manage concurrent tasks — it handles cancellation and errors more cleanly:

async def main():
    async with asyncio.TaskGroup() as tg:
        task1 = tg.create_task(fetch_user(1))
        task2 = tg.create_task(fetch_user(2))
        task3 = tg.create_task(fetch_user(3))
    
    # All tasks are done here
    print(task1.result(), task2.result(), task3.result())

If any task raises an exception, the TaskGroup cancels all remaining tasks.

When to Use async/await

Use it when:

  • Your code makes many concurrent I/O calls (HTTP, database, file)
  • You’re using an async framework (FastAPI, aiohttp, Starlette)
  • Latency matters and you’re I/O bound

Don’t use it when:

  • You’re doing CPU-heavy work (use multiprocessing instead — the event loop can’t parallelise CPU work)
  • You’re using libraries that don’t support async (mixing sync and async poorly is worse than just staying sync)
  • The code is simple and sequential — async adds complexity without benefit

Async with Databases

Most modern database drivers support async. With asyncpg or SQLAlchemy’s async engine:

import asyncpg

async def get_user(pool: asyncpg.Pool, user_id: int) -> dict | None:
    async with pool.acquire() as conn:
        row = await conn.fetchrow(
            "SELECT id, name, email FROM users WHERE id = $1",
            user_id
        )
        return dict(row) if row else None  # fetchrow returns None when no row matches

async def main():
    pool = await asyncpg.create_pool(dsn="postgresql://localhost/mydb")
    user = await get_user(pool, 1)
    print(user)
    await pool.close()

Common Mistakes

Blocking the event loop — calling a synchronous, CPU-blocking function inside an async function stalls everything:

import time

async def bad():
    time.sleep(5)           # Blocks the entire event loop for 5 seconds

async def good():
    await asyncio.sleep(5)  # Yields control while waiting

Forgetting to await — a coroutine that isn’t awaited silently does nothing. Python will warn you, but it’s easy to miss.

Using threads and asyncio together carelessly — if you must call sync code from async context, use asyncio.to_thread():

result = await asyncio.to_thread(some_blocking_function, arg1, arg2)
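A runnable sketch of that pattern (blocking_io and heartbeat are illustrative names): the blocking call runs in a worker thread while the event loop stays free to run other coroutines.

```python
import asyncio
import time

def blocking_io() -> str:
    time.sleep(0.2)  # a sync call that would otherwise stall the loop
    return "done"

async def heartbeat() -> str:
    await asyncio.sleep(0.05)  # finishes while blocking_io is still running
    return "alive"

async def main() -> list:
    # blocking_io runs in a thread; the loop keeps running heartbeat meanwhile
    return await asyncio.gather(asyncio.to_thread(blocking_io), heartbeat())

results = asyncio.run(main())
print(results)
```

asyncio.to_thread is the simple path for occasional blocking calls; for sustained CPU-heavy work, a process pool is the better fit since threads don't bypass the GIL.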

FastAPI: async in Practice

FastAPI is built on async and it’s where most Python backend developers first use it seriously:

from fastapi import FastAPI
import httpx

app = FastAPI()

@app.get("/user/{user_id}")
async def get_user(user_id: int):
    async with httpx.AsyncClient() as client:
        response = await client.get(f"https://api.example.com/users/{user_id}")
    return response.json()

FastAPI handles the event loop for you — just define routes as async def and use async-compatible libraries.

What to Learn Next

  • asyncio documentation — comprehensive reference
  • httpx for async HTTP requests
  • asyncpg or SQLAlchemy async for database access
  • aiofiles for async file I/O

Once you get comfortable with async/await, Python concurrent I/O code becomes significantly cleaner than thread-based alternatives.

Kaikobud Sarkar

Software engineer passionate about backend technologies and continuous learning. I write about Python frameworks, cloud architecture, engineering growth, and staying current in tech.
