ASGI Changed Everything — Understanding Event Loops & Concurrent Requests

Most developers learn async backwards.

They start with:

async def hello():

without understanding:

  • what problem async solves
  • why event loops exist
  • what concurrency actually means

FastAPI only makes sense once you understand the engine underneath it.

And that engine is ASGI.


The Real Problem Was Never Speed

This is the first mental model to fix.

Async is not primarily about making your code execute faster.

It is about:

making better use of waiting time

Modern backend systems spend enormous amounts of time waiting for:

  • databases
  • external APIs
  • Redis
  • file systems
  • network responses

During this waiting period, CPUs are often idle.

Traditional synchronous systems waste this time.

ASGI-based systems try to utilize it.


What Happens in a Synchronous Server

Imagine a traditional WSGI worker handling this:

response = requests.get("https://api.example.com")  # blocks the worker until the response arrives

The worker waits.

And waits.

And waits.

Until the response arrives.

During this time:

  • the worker cannot process another request
  • memory remains occupied
  • throughput decreases

The worker is blocked.
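
To make the cost concrete, here is a minimal sketch. The URL above is hypothetical, so `time.sleep` stands in for the slow network round trip:

```python
import time

def handle_request(url: str) -> str:
    # A synchronous handler: this sleep stands in for a blocking
    # network call. The worker can do nothing else meanwhile.
    time.sleep(0.2)
    return f"response from {url}"

start = time.perf_counter()
# Two requests on one worker run strictly one after the other.
handle_request("https://api.example.com/a")
handle_request("https://api.example.com/b")
elapsed = time.perf_counter() - start  # roughly 0.4 seconds
```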


Async Changes The Waiting Model

Now imagine this:

response = await client.get("https://api.example.com")  # client: an httpx.AsyncClient

This changes everything.

Instead of blocking the worker, the task tells the event loop:

“I’m waiting right now. You can work on something else.”

That is the core idea behind async systems.

Not faster execution.

Smarter waiting.
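
The same two requests, rewritten with `await` (again simulating the network wait, this time with `asyncio.sleep`), now overlap:

```python
import asyncio
import time

async def handle_request(url: str) -> str:
    # await yields control to the event loop while this task waits,
    # so another task can run in the meantime.
    await asyncio.sleep(0.2)  # stands in for a non-blocking network call
    return f"response from {url}"

async def main() -> list:
    # Both waits overlap, so the total time is ~0.2s, not ~0.4s.
    return await asyncio.gather(
        handle_request("https://api.example.com/a"),
        handle_request("https://api.example.com/b"),
    )

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
```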


Event Loop: The Hidden Engine

At the center of ASGI systems is something called:

the event loop

The event loop is essentially a scheduler.

Its job is to:

  • monitor tasks
  • pause waiting operations
  • resume completed operations
  • coordinate concurrent execution

You can think of it like an air traffic controller.

It constantly switches between tasks that are:

  • ready to run
  • waiting for I/O
  • completed

All without creating massive numbers of threads.
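
A small experiment makes the switching visible. Every `await` is a point where the loop may pause one task and resume another (`asyncio.sleep(0)` is an explicit yield):

```python
import asyncio

async def worker(name: str, log: list) -> None:
    for step in range(2):
        log.append(f"{name}:{step}")
        # Every await is a switching point: the event loop may
        # pause this task here and resume another one.
        await asyncio.sleep(0)

async def main() -> list:
    log: list = []
    await asyncio.gather(worker("A", log), worker("B", log))
    return log

log = asyncio.run(main())
# The entries interleave: A:0, B:0, A:1, B:1
```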


Concurrency Is NOT Parallelism

This is where many developers get confused.

Concurrency means:

making progress on many tasks by interleaving them, even on a single core

Parallelism means:

executing multiple tasks at literally the same instant, on multiple cores

Async systems mainly improve concurrency.

They do not add parallelism, and they do not speed up CPU-heavy computation.


Important Mental Model

If your application spends most of its time:

  • waiting for databases
  • calling APIs
  • reading files
  • waiting for networks

async can help significantly.

But if your application is doing:

  • image processing
  • video encoding
  • machine learning inference
  • heavy computation

async alone will not magically improve performance.

Because the CPU is already busy.
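
For CPU-bound work, the usual remedy is to move it off the event loop thread. A sketch using `asyncio.to_thread`, with sha256 hashing standing in for any heavy computation:

```python
import asyncio
import hashlib

def heavy_work(data: bytes) -> str:
    # CPU-bound: async syntax alone would not make this any faster.
    return hashlib.sha256(data).hexdigest()

async def main() -> str:
    # to_thread runs the function in a worker thread, so the event
    # loop stays free to serve other requests while it crunches.
    return await asyncio.to_thread(heavy_work, b"payload")

digest = asyncio.run(main())
```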


ASGI Allows Cooperative Multitasking

This is one of the biggest architectural shifts.

In synchronous systems:

  • workers are blocked

In async systems:

  • tasks voluntarily yield control

That is why this is called:

cooperative multitasking

The task itself says:

“I cannot proceed right now. Let others run.”

This makes concurrent systems dramatically more efficient for I/O-heavy workloads.


FastAPI Sits On Top of This Stack

FastAPI is not the event loop.

It is built on top of the async ecosystem.

The stack usually looks like this:

Client Request
    ↓
Uvicorn (ASGI Server)
    ↓
ASGI Interface
    ↓
Starlette
    ↓
FastAPI
    ↓
Your Application Code

Each layer has a responsibility.

FastAPI itself focuses mostly on:

  • routing
  • dependency injection
  • validation
  • API structure

The real concurrency engine lives underneath.
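
The ASGI interface itself is tiny: one async callable taking scope, receive, and send. Here is a minimal app, driven by hand-written fakes instead of a real server so the contract is visible (a sketch, not how you would deploy):

```python
import asyncio

async def app(scope, receive, send):
    # The whole ASGI contract: one async callable, three arguments.
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"hello"})

async def call(application) -> list:
    sent = []

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(event):
        sent.append(event)

    await application({"type": "http", "method": "GET", "path": "/"}, receive, send)
    return sent

events = asyncio.run(call(app))
```

A real ASGI server like Uvicorn plays the role of `call` here: it owns the socket and feeds events in and out.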


Why Uvicorn Matters

When you run:

uvicorn main:app

Uvicorn becomes the ASGI server.

It:

  • manages the event loop
  • handles socket connections
  • schedules coroutines
  • processes concurrent requests

Without an ASGI server, FastAPI cannot operate as intended.


One Worker Can Handle Many Connections

This is one of the biggest architectural advantages.

In older synchronous systems:

1 blocked request
= 1 occupied worker

In ASGI systems:

1 worker
can manage many waiting connections concurrently.

Especially when most operations are I/O-bound.

This dramatically improves scalability for modern APIs.
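
To see the difference in numbers, here is one event loop "serving" 100 simulated requests that each wait 0.2 seconds (`asyncio.sleep` stands in for I/O):

```python
import asyncio
import time

async def fake_request(i: int) -> int:
    # Each "request" spends its time waiting on I/O, so it does not
    # occupy the loop while it waits.
    await asyncio.sleep(0.2)
    return i

async def main() -> float:
    start = time.perf_counter()
    # 100 concurrent waits on a single thread, single event loop.
    await asyncio.gather(*(fake_request(i) for i in range(100)))
    return time.perf_counter() - start

elapsed = asyncio.run(main())  # ~0.2s total, not 100 x 0.2s = 20s
```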


Why WebSockets Became Easier

WSGI was built for short request-response cycles.

WebSockets break that model.

A WebSocket connection may stay open:

  • for minutes
  • hours
  • indefinitely

ASGI was designed for this world.

That is why FastAPI works naturally with:

  • streaming APIs
  • live dashboards
  • chats
  • notifications
  • AI token streaming

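Under ASGI, a WebSocket is just a long-lived stream of events flowing through the same (scope, receive, send) interface. A sketch of an echo app, driven by a simulated client instead of a real server:

```python
import asyncio

async def websocket_echo(scope, receive, send):
    # A minimal ASGI WebSocket app: accept the handshake, then echo
    # every message back until the client disconnects.
    assert scope["type"] == "websocket"
    event = await receive()
    assert event["type"] == "websocket.connect"
    await send({"type": "websocket.accept"})
    while True:
        event = await receive()
        if event["type"] == "websocket.disconnect":
            break
        await send({"type": "websocket.send", "text": event["text"]})

async def simulate_client() -> list:
    # The "client": connect, send two messages, then disconnect.
    incoming = [
        {"type": "websocket.connect"},
        {"type": "websocket.receive", "text": "hello"},
        {"type": "websocket.receive", "text": "again"},
        {"type": "websocket.disconnect", "code": 1000},
    ]
    sent = []

    async def receive():
        return incoming.pop(0)

    async def send(event):
        sent.append(event)

    await websocket_echo({"type": "websocket"}, receive, send)
    return sent

sent = asyncio.run(simulate_client())
```
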
Async Has Tradeoffs Too

This is important.

Async systems are more powerful, but also more complex.

Common mistakes include:

  • blocking the event loop accidentally
  • mixing sync and async incorrectly
  • using incompatible libraries
  • creating hidden bottlenecks

Sometimes developers write async code that behaves synchronously.

Which defeats the entire purpose.
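
The most common mistake is the first one: calling a blocking function inside a coroutine. A quick demonstration of the damage (`time.sleep` blocks the loop; `asyncio.sleep` yields to it):

```python
import asyncio
import time

async def blocked() -> None:
    time.sleep(0.2)  # BUG: blocks the entire event loop

async def cooperative() -> None:
    await asyncio.sleep(0.2)  # yields to the loop while waiting

async def timed(factory) -> float:
    # Run two copies of the task "concurrently" and time them.
    start = time.perf_counter()
    await asyncio.gather(factory(), factory())
    return time.perf_counter() - start

blocking_time = asyncio.run(timed(blocked))         # ~0.4s: runs back to back
cooperative_time = asyncio.run(timed(cooperative))  # ~0.2s: waits overlap
```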


The Bigger Shift

FastAPI is part of a larger industry movement.

Infrastructure evolved toward:

  • microservices
  • streaming systems
  • real-time APIs
  • highly concurrent workloads

ASGI emerged because the old execution model could not efficiently support this future.

FastAPI simply arrived at the right architectural moment.


What’s Next

In the next post, we’ll go deeper into one of FastAPI’s most misunderstood features:

Day 3: FastAPI Dependencies Are Not Function Calls — They Are a Resolution Graph

Because Depends() is much more than parameter injection.

Internally, FastAPI is building and resolving a dependency tree for every request.

And once you understand that, you start seeing FastAPI very differently.


If this post helped clarify async architecture, stay tuned for the full Thinking in FastAPI series.