You think Python processes everything immediately. It doesn’t.

Python can be lazy.

Today, you'll see how iteration really works under the hood.


Today’s Goal

By the end of today, you will:

  • Understand the iterator protocol
  • Learn how generators work internally
  • See how lazy execution saves memory
  • Avoid unnecessary allocations

The Illusion

for i in [1,2,3]:
    print(i)

You think:

Python loops over the list directly

Reality:

Python asks for an iterator and pulls values one by one
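You can watch this pull model happen by driving the iterator yourself with the built-in `iter()` and `next()`:

```python
nums = [1, 2, 3]

it = iter(nums)    # ask the list for an iterator
print(next(it))    # 1 -- pull the first value
print(next(it))    # 2 -- pull the next one
print(next(it))    # 3
# one more next(it) would raise StopIteration -- the loop's stop signal
```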


Iterator Protocol

An object is iterable if it implements __iter__(), which must return an iterator. The iterator is the object that implements:

  • __iter__() (returning itself)
  • __next__()
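The distinction is easy to check with `hasattr`: a list is iterable, but it is not itself an iterator; `iter()` hands back the object that carries `__next__()`:

```python
nums = [1, 2, 3]

print(hasattr(nums, "__iter__"))        # True  -- iterable
print(hasattr(nums, "__next__"))        # False -- not itself an iterator
print(hasattr(iter(nums), "__next__"))  # True  -- the iterator is
```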

Example Iterator

class Counter:
    def __init__(self, n):
        self.n = n
        self.i = 0

    def __iter__(self):
        return self

    def __next__(self):
        if self.i < self.n:
            val = self.i
            self.i += 1
            return val
        raise StopIteration

How For Loop Works

it = iter(obj)
while True:
    try:
        x = next(it)
    except StopIteration:
        break
    # loop body runs here with x

Key Insight

Iteration is pull-based, not push-based.


Generators

def gen():
    yield 1
    yield 2

Generator Internals (Under the Hood)

When a generator is created, Python builds a generator object that contains:

  • function code
  • execution frame
  • local variables
  • instruction pointer

g = gen()
print(type(g))

Frame Object

g = gen()
print(g.gi_frame)

Frame stores execution state and locals.
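In CPython you can watch the frame appear and disappear (gi_frame is an implementation detail, so treat this as a CPython-specific sketch):

```python
def gen():
    yield 1
    yield 2

g = gen()
print(g.gi_frame)   # a frame object -- suspended state is alive
list(g)             # exhaust the generator
print(g.gi_frame)   # None -- CPython releases the frame on exhaustion
```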


Suspension Model

call generator -> create frame
next() -> run until yield
pause -> save state
next() -> resume
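The create/run/pause/resume cycle above becomes visible if the generator prints as it executes:

```python
def steps():
    print("start")
    yield 1
    print("resumed")
    yield 2

g = steps()      # frame created -- nothing printed yet
print(next(g))   # prints "start", then 1, then pauses at the first yield
print(next(g))   # prints "resumed", then 2
```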


Key Insight

Generators are stateful suspended functions.


What yield Does

  • pauses function
  • saves state
  • resumes later

Generator vs Function

# normal function: return ends the call for good
def normal():
    return 1

# generator function: yield suspends and can resume
def generator():
    yield 1

Memory Advantage

[i for i in range(1_000_000)]
(i for i in range(1_000_000))

List allocates everything. Generator produces one at a time.
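sys.getsizeof makes the gap concrete (exact byte counts vary by Python version and platform, but the scale difference is stable):

```python
import sys

big_list = [i for i in range(1_000_000)]
big_gen = (i for i in range(1_000_000))

print(sys.getsizeof(big_list))  # megabyte scale: a million references stored
print(sys.getsizeof(big_gen))   # byte scale: just the suspended frame
```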


Generator Expression

gen = (i*i for i in range(10))

Lazy Execution

Values computed only when needed.
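A side effect inside the expression makes the laziness visible: nothing prints until a value is actually pulled.

```python
def expensive(x):
    print(f"computing {x}")
    return x * x

lazy = (expensive(i) for i in range(3))  # nothing computed yet
print(next(lazy))                        # computes only the first value
```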


One-Time Consumption

g = (i for i in range(3))

list(g)   # [0, 1, 2]
list(g)   # []

Second call returns empty: the generator is exhausted and cannot be rewound.


yield from

def g1():
    yield from [1,2,3]
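yield from delegates to a sub-iterable. For plain iteration (ignoring the send/throw forwarding it also handles), it is equivalent to an explicit loop:

```python
def g1():
    yield from [1, 2, 3]

def g2():
    # equivalent for simple iteration
    for x in [1, 2, 3]:
        yield x

print(list(g1()))                 # [1, 2, 3]
print(list(g1()) == list(g2()))   # True
```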

Why This Matters

Generators help with:

  • large datasets
  • streaming data
  • memory efficiency

Real Example

def read_lines(file):
    for line in file:
        yield line
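Generators shine when chained into pipelines: each stage pulls from the one before it, so no stage ever holds the whole dataset. Here is a sketch using io.StringIO as a stand-in for a real file:

```python
import io

def read_lines(file):
    for line in file:
        yield line

def non_empty(lines):
    # skip blank lines, strip the rest -- one line in memory at a time
    for line in lines:
        if line.strip():
            yield line.strip()

fake_file = io.StringIO("alpha\n\nbeta\n")
for word in non_empty(read_lines(fake_file)):
    print(word)   # alpha, then beta
```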

Performance Insight

Generators:

  • use far less memory on large sequences
  • add slight per-item overhead (each next() resumes a suspended frame)

Your Task

  • implement custom iterator
  • compare list vs generator memory
  • observe lazy evaluation

Common Mistakes

  • converting generators to list unnecessarily
  • assuming reuse
  • ignoring lazy behavior

Think Deeper

  1. How does a generator store its state?
  2. Why are generators memory-efficient?
  3. When is a list the better choice?

Subtle Insight (CRITICAL)

Generators turn computation into a stream.


Tomorrow

Decorators — modifying behavior at runtime


Rule

  • Don’t allocate unless needed
  • Prefer lazy over eager

See you in Day 9.