You think Python runs things in parallel. Most of the time, it doesn’t.
Today, you'll understand how Python handles concurrency, why it behaves the way it does, and how async actually works.
Today’s Goal
By the end of today, you will:
- Understand concurrency vs parallelism
- Learn about threads and the GIL
- Understand async/await internals
- Know when to use threading vs async vs multiprocessing
The Illusion
import threading

def task():
    print("running")

threading.Thread(target=task).start()
You think:
This runs in parallel
Reality:
Python switches between threads, not true parallel execution (for CPU work)
Concurrency vs Parallelism
- Concurrency → multiple tasks making progress
- Parallelism → multiple tasks executing at the same time
Python supports concurrency well, but true parallelism is limited for pure-Python code.
The GIL (Global Interpreter Lock)
In CPython, the GIL ensures:
Only one thread executes Python bytecode at a time
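You can even observe the switching machinery directly. This is a minimal sketch using the standard `sys` module: CPython periodically forces a thread switch, and the interval is inspectable (and tunable), but tuning it does not remove the GIL.

```python
import sys

# CPython forces a thread switch roughly every 5 ms by default.
# Changing this interval tunes responsiveness vs. overhead,
# but only one thread still runs Python bytecode at any instant.
print(sys.getswitchinterval())  # 0.005 by default
```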
Why GIL Exists
- simplifies memory management
- avoids race conditions in reference counting
Impact of GIL
- CPU-bound tasks → no real speedup with threads
- I/O-bound tasks → threads can help
Example
# CPU bound
for i in range(10**7):
    pass
Threads won’t speed this up.
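You can verify this yourself. Here is a minimal timing sketch: run the same pure-Python CPU work twice sequentially, then in two threads, and compare. On standard CPython, the threaded version takes about as long as the sequential one, because the GIL serializes the bytecode.

```python
import threading
import time

def cpu_work(n=5_000_000):
    # Pure-Python CPU-bound loop; holds the GIL while running.
    count = 0
    for _ in range(n):
        count += 1
    return count

# Sequential: two runs back to back.
start = time.perf_counter()
cpu_work()
cpu_work()
sequential = time.perf_counter() - start

# "Parallel": two threads, but the GIL serializes the bytecode.
start = time.perf_counter()
t1 = threading.Thread(target=cpu_work)
t2 = threading.Thread(target=cpu_work)
t1.start(); t2.start()
t1.join(); t2.join()
threaded = time.perf_counter() - start

# Expect threaded to be roughly equal to (or slightly worse than) sequential.
print(f"sequential: {sequential:.2f}s, threaded: {threaded:.2f}s")
```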
Threading (Best for I/O)
import threading
threading.Thread(target=task).start()
Use for:
- network calls
- file I/O
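Why do threads help here? Because blocking I/O calls release the GIL while they wait. A minimal sketch, using `time.sleep` as a stand-in for a network or disk wait: four 0.2-second waits overlap, so the total is close to 0.2 seconds rather than 0.8.

```python
import threading
import time

def fake_io(results, i):
    # time.sleep releases the GIL, like real network/file waits do.
    time.sleep(0.2)
    results[i] = i * 2

results = [None] * 4
start = time.perf_counter()
threads = [threading.Thread(target=fake_io, args=(results, i)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start

# The four waits overlap: total time is ~0.2 s, not 0.8 s.
print(results, f"{elapsed:.2f}s")
```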
Multiprocessing (True Parallelism)
from multiprocessing import Process
Process(target=task).start()
- separate processes
- separate memory
- bypasses GIL
Cost of Multiprocessing
- higher memory usage
- inter-process communication overhead
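A minimal working sketch using the standard `multiprocessing.Pool`: each worker is a separate interpreter with its own GIL. Note the `__main__` guard, which is required on platforms that spawn child processes (Windows, macOS) so workers can safely re-import the module.

```python
from multiprocessing import Pool

def square(n):
    # Runs in a separate process, with its own interpreter and GIL.
    return n * n

if __name__ == "__main__":
    # The guard prevents child processes from re-running this block
    # when they re-import the module on spawn-based platforms.
    with Pool(processes=2) as pool:
        squares = pool.map(square, [1, 2, 3, 4])
    print(squares)  # [1, 4, 9, 16]
```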
Async Programming
async def task():
    return 1
What is Async?
Async is:
cooperative multitasking
Tasks voluntarily yield control.
Event Loop
event loop:
    pick task
    run until await
    switch to next task
Await Behavior
await some_io()
- pauses coroutine
- returns control to event loop
Key Insight
Async is NOT parallel. It is efficient scheduling.
Example
import asyncio

async def task():
    print("start")
    await asyncio.sleep(1)
    print("end")

asyncio.run(task())
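The scheduling becomes visible with two coroutines. A minimal sketch using `asyncio.gather`: both run on a single thread, and each `await` hands control back to the event loop, so their steps interleave.

```python
import asyncio

order = []

async def worker(name, delay):
    order.append(f"{name} start")
    await asyncio.sleep(delay)  # yields control to the event loop
    order.append(f"{name} end")

async def main():
    # Both coroutines share one thread; awaits interleave them.
    await asyncio.gather(worker("a", 0.02), worker("b", 0.01))

asyncio.run(main())
print(order)  # ['a start', 'b start', 'b end', 'a end']
```

No thread ran in parallel here: "b" finished first only because "a" was suspended at its `await`, waiting.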
Coroutine Internals
Coroutines are:
- generator-like objects
- built on generator machinery (they use yield internally)
- managed by the event loop
Execution Model
call async -> create coroutine
await -> suspend
resume -> continue execution
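You can step through this model by hand. A minimal sketch: calling an async function only creates a coroutine object, and the event loop normally drives it with `send()`; here we do two steps manually and catch the return value, which travels in `StopIteration`.

```python
import asyncio

async def answer():
    await asyncio.sleep(0)  # a suspension point
    return 42

# Calling an async function does NOT run its body;
# it just creates a coroutine object.
coro = answer()
print(type(coro).__name__)  # coroutine

# The event loop drives coroutines with send(); here we do it by hand.
result = None
try:
    coro.send(None)  # run until the first await suspends the coroutine
    coro.send(None)  # resume; the return value arrives via StopIteration
except StopIteration as exc:
    result = exc.value
print(result)  # 42
```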
When to Use What
- CPU-bound → multiprocessing
- I/O-bound → threading or async
- high-scale I/O → async
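For the threading-vs-multiprocessing choice, the standard `concurrent.futures` module makes switching cheap: both executors share the same `map()` API, so you can swap one line to match the workload. A minimal sketch (the `__main__` guard is needed for the process pool on spawn-based platforms):

```python
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def work(n):
    return n * n

if __name__ == "__main__":
    # Same map() API; swap the executor to match the workload.
    with ThreadPoolExecutor() as ex:  # threads: good for I/O-bound tasks
        thread_results = list(ex.map(work, range(4)))
    with ProcessPoolExecutor() as ex:  # processes: good for CPU-bound tasks
        process_results = list(ex.map(work, range(4)))
    print(thread_results, process_results)  # [0, 1, 4, 9] [0, 1, 4, 9]
```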
Why This Matters
Choosing the wrong model leads to:
- poor performance
- blocked execution
- wasted resources
Your Task
- run CPU task with threads vs processes
- create async function with sleep
- observe execution order
Common Mistakes
- using threads for CPU tasks
- running blocking code inside async functions
- misunderstanding GIL
Think Deeper
- why does GIL exist?
- how does event loop schedule tasks?
- when does async fail?
Subtle Insight (CRITICAL)
Concurrency is about managing waiting, not doing more work at once.
End of Series
You now understand:
- execution model
- memory model
- data structures
- hashing
- call stack
- modules
- exceptions
- iterators
- decorators
- concurrency
Final Rule
Stop writing Python blindly.
Understand what happens underneath.
That’s the difference between writing code… and engineering systems.