🚀 Python Generators: Elegant, Efficient, and Often Underused
Python is full of hidden gems — and generators are one of its most powerful, elegant features. If you’ve ever worked with large datasets, built streaming pipelines, or wanted to write cleaner code without loading everything into memory, you might already be using them (or should be).
In this post, I’ll walk through what generators are and how they work, when and why to use them, iterative and recursive generator patterns, the alternatives and their tradeoffs, and the functional mindset generators encourage.
Let’s dig in.
⚙️ Python Generators: Writing Code That Thinks Ahead
There’s a moment in every Python developer’s journey when the language surprises you — not with magic, but with something quieter and smarter. For me, that moment was discovering generators.
They didn’t just help me write cleaner code. They made me rethink how I approached problems: streams instead of snapshots, iteration instead of accumulation, and—eventually—even recursion without blowing the stack.
This post is a hands-on guide to Python generators — both iterative and recursive — and how they can reshape the way you think about data, control flow, and memory efficiency. Along the way, I’ll share lessons I learned building systems that needed to process too much data for memory and too quickly for comfort.
We'll cover:
- What a generator really is, and the mechanics under the hood
- When and why to use generators instead of lists
- Iterative and recursive generator patterns
- Alternatives and tradeoffs: lists, iterators, coroutines, and async
- How generators encourage functional thinking
🔍 What Is a Generator, Really?
At first glance, a generator in Python looks like a function — and in many ways, it is. But underneath, it’s something more powerful: a stateful iterator that can pause and resume its execution.
The key difference? Generators use yield instead of return.
Let’s start with an example:
def count_up_to(n):
    count = 1
    while count <= n:
        yield count
        count += 1
This function doesn’t return a final value. Instead, it yields one value at a time, pausing execution between each yield. When called, it doesn’t run like a normal function — it creates a generator object:
>>> counter = count_up_to(3)
>>> next(counter)
1
>>> next(counter)
2
>>> next(counter)
3
>>> next(counter)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration
It’s like a function that remembers where it left off. Each call to next() resumes execution right after the last yield.
Why It Matters
This ability to pause execution mid-function gives you incredible control over memory and flow — especially when working with large datasets, streams, or recursive structures like trees and graphs.
Think of it this way: a regular function hands you a snapshot (the entire result, all at once), while a generator hands you a stream (one value each time you ask).
Under the Hood
When Python sees a function with a yield, it compiles it into a generator function. Calling it returns a generator object that implements the iterator protocol (__iter__ and __next__).
This object holds:
- the function's local variables
- the position where execution paused
- everything needed to resume right after the last yield
You can loop over it like any iterable:
for number in count_up_to(3):
    print(number)
No list needed. No memory overhead from building a full array. Just one value at a time.
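To make the memory point concrete, here's a quick sketch comparing the footprint of a generator object to a fully built list (exact byte counts vary across Python versions, so treat the sizes as illustrative):

```python
import sys

# A generator object is a small, constant-size wrapper around paused code...
lazy_squares = (i * i for i in range(1_000_000))

# ...while a list materializes every element before you can use any of them.
eager_squares = [i * i for i in range(1_000_000)]

print(sys.getsizeof(lazy_squares))   # tiny, regardless of the range size
print(sys.getsizeof(eager_squares))  # grows with the number of elements
```

sys.getsizeof only reports the container's own footprint, but the gap is already dramatic.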
More Than Just a Counter
Let’s look at two more generator functions — ones that do real work and demonstrate the benefit of streaming values instead of stockpiling them.
Factorials
def factorials_upto(n):
    factorial = 1
    for i in range(1, n + 1):
        factorial *= i
        yield factorial
This function yields the factorials of numbers 1 through n, one at a time.
Compare this with a non-generator version:
def factorials_upto_list(n):
    factorial = 1
    result = []
    for i in range(1, n + 1):
        factorial *= i
        result.append(factorial)
    return result
Both get the job done — but the list-based version builds and returns the entire result set at once, taking up memory proportional to n. The generator version, on the other hand, can be paused, resumed, and stopped early:
for f in factorials_upto(1000):
    if f > 100000:
        break
    print(f)
You can't do that with the list-based version without computing all 1000 values up front.
Fibonacci Numbers
def fibonacci_upto(n):
    yield 1
    if n == 1:
        return
    yield 1
    if n == 2:
        return
    prev2, prev = 1, 1
    for _ in range(3, n + 1):
        next_ = prev2 + prev
        yield next_
        prev2, prev = prev, next_
* Footnote: next is a Python built-in, and by convention a name that would shadow a built-in (or a keyword) gets a trailing underscore, hence next_.
This generator produces the first n Fibonacci numbers lazily. It starts yielding immediately and never allocates a full list in memory. That makes it ideal for working with potentially large or even infinite streams — or when you want to control when and how much to compute.
Why This Matters
With a traditional function that returns a list, you compute everything before handing back the result. That’s fine for small inputs — but wasteful, even dangerous, for large ones.
Generators let you:
- start producing values immediately, before the whole computation finishes
- stop early once the consumer has seen enough
- keep memory usage flat, no matter how long the sequence grows
Put simply: generators defer work until it’s needed. And sometimes, that makes all the difference.
The Mechanics of a Generator
You’ve seen several examples of generators already, and this section introduces a few more to explain how they actually work.
A generator function looks like a normal function but contains one or more yield expressions. Calling it doesn’t run the code immediately; instead, it returns a generator object: a stateful iterator that yields one value per iteration. You don’t have to consume every value — generators can be paused, resumed, or abandoned at any time. In fact, some generators yield values indefinitely, producing infinite sequences.
Let’s look at a generator that never ends – it produces values indefinitely and only stops if you stop asking for them.
def fibonacci_sequence():
    yield 1
    yield 1
    prev2 = 1
    prev = 1
    while True:
        next_ = prev2 + prev
        yield next_
        prev2 = prev
        prev = next_

fib1 = fibonacci_sequence()
for _ in range(20):
    print(next(fib1))
This outputs:
1
1
2
3
5
8
13
21
34
55
89
144
233
377
610
987
1597
2584
4181
6765
Each call to fibonacci_sequence() returns a new generator object with its own internal state. If you assign fib1 = fibonacci_sequence(), then each call to next(fib1) yields the next number in the Fibonacci sequence. Creating a second generator, like fib2 = fibonacci_sequence(), will start from the beginning again—independent of the state of fib1.
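Here's that independence as runnable code (repeating the fibonacci_sequence definition from above):

```python
def fibonacci_sequence():
    yield 1
    yield 1
    prev2 = 1
    prev = 1
    while True:
        next_ = prev2 + prev
        yield next_
        prev2 = prev
        prev = next_

fib1 = fibonacci_sequence()
fib2 = fibonacci_sequence()

# Advancing fib1 three times does not move fib2 at all.
print(next(fib1), next(fib1), next(fib1))  # 1 1 2
print(next(fib2))                          # 1
```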
What about generators that yield a finite number of values? Here is another example of one of those:
def divisors(n: int):
    for i in range(1, n + 1):
        if n % i == 0:
            yield i
A divisor is an integer that divides another integer evenly, with no remainder. For example, the divisors of 12 are 1, 2, 3, 4, 6, and 12. Invoking div_12 = divisors(12) gives you a generator object that yields each of those divisors, one at a time:
for num in divisors(12):
    print(num)
This outputs:
1
2
3
4
6
12
One common way to iterate over a generator is with a for loop. In each iteration, the loop variable (also called the iteration variable) takes on the value yielded by the generator, one at a time.
Another way to iterate over the values from a generator object is by using next(). The syntax is:
value = next(iterator[, default])
If you invoke next() without providing a default, the StopIteration exception is raised when the generator is exhausted—i.e., when there are no more values to yield. You can catch this exception using a try...except block if you want to handle the end of the sequence explicitly.
When iterating over the items yielded by a generator using a for loop, you never see the StopIteration exception because Python handles it automatically – once the generator is exhausted, the loop ends.
If you provide a default value as the second argument to next(), that fallback will be returned instead of raising an exception once the generator is exhausted. This is useful when you want to supply a sentinel value like None, "DONE", or any fallback result when iteration is complete.
Here is an example of the StopIteration exception being raised when next is called after the last value is yielded:
>>> div_12 = divisors(12)
>>> next(div_12)
1
>>> next(div_12)
2
>>> next(div_12)
3
>>> next(div_12)
4
>>> next(div_12)
6
>>> next(div_12)
12
>>> next(div_12)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration
Here is an example invoking next with a default value:
>>> div_12 = divisors(12)
>>> next(div_12, 'DONE')
1
>>> next(div_12, 'DONE')
2
>>> next(div_12, 'DONE')
3
>>> next(div_12, 'DONE')
4
>>> next(div_12, 'DONE')
6
>>> next(div_12, 'DONE')
12
>>> next(div_12, 'DONE')
'DONE'
In the above example 'DONE' works as a sentinel value, but we could have used Python’s None value instead.
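As a sketch of that None-sentinel pattern (repeating divisors from above; the walrus operator requires Python 3.8+):

```python
def divisors(n: int):
    for i in range(1, n + 1):
        if n % i == 0:
            yield i

div_12 = divisors(12)
collected = []
# next(div_12, None) returns None instead of raising StopIteration,
# so the loop ends cleanly once the generator is exhausted.
while (value := next(div_12, None)) is not None:
    collected.append(value)

print(collected)  # [1, 2, 3, 4, 6, 12]
```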
Here’s an example of explicitly catching the StopIteration exception when using next() to iterate through a generator. While this isn’t the most common pattern in everyday Python (since for loops handle it for you), it’s useful to understand how this works under the hood:
def rainbow():
    colors = [
        'scarlet', 'crimson', 'carmine',
        'vermillion', 'coral', 'amber',
        'goldenrod', 'canary', 'lemon',
        'chartreuse', 'forest', 'moss',
        'teal', 'azure', 'cobalt',
        'indigo', 'midnight', 'prussian',
        'amethyst', 'orchid', 'violet'
    ]
    for color in colors:
        yield color

rainbow_iter = rainbow()
while True:
    try:
        color = next(rainbow_iter)
        print(color)
    except StopIteration:
        print('<==================================>')
        print('Caught the exception - StopIteration')
        break
This outputs:
scarlet
crimson
carmine
vermillion
coral
amber
goldenrod
canary
lemon
chartreuse
forest
moss
teal
azure
cobalt
indigo
midnight
prussian
amethyst
orchid
violet
<==================================>
Caught the exception - StopIteration
✅ Why This Works:
next() raises StopIteration once the generator is exhausted. The try...except block catches that exception, prints the closing message, and breaks out of what would otherwise be an infinite while loop.
In summary, generators give you precise control over iteration, allowing you to pause and resume computation. In the next section, we’ll explore how generators can behave like coroutines by accepting input as well as yielding output.
Using .send to make a generator behave like a coroutine
So far, all our generators have only pushed values outward using yield: we iterate over them with for or next(), and data flows one way, from the generator to the caller.
However, Python also allows values to flow in the opposite direction – into the generator – using the .send() method. Here is an example:
def evens(n=2):
    if n % 2 == 1:
        n += 1
    while True:
        resume = yield n
        if resume is None:
            n += 2
        else:
            if resume % 2 == 1:
                resume += 1
            n = resume
The evens generator above contains a yield expression: it yields the current value of n and assigns whatever is passed back into the generator object to resume. The first time you advance the generator, you must use next() or .send(None). After that, you can use either next() or .send(value): advancing with next() assigns None to resume, while .send(value) assigns its argument. Here is an example:
>>> even_iter = evens()
>>> next(even_iter)
2
>>> next(even_iter)
4
>>> next(even_iter)
6
>>> next(even_iter)
8
>>> next(even_iter)
10
>>> next(even_iter)
12
>>> next(even_iter)
14
>>> next(even_iter)
16
>>> next(even_iter)
18
>>> next(even_iter)
20
>>> next(even_iter)
22
>>> next(even_iter)
24
>>> next(even_iter)
26
>>> next(even_iter)
28
>>> next(even_iter)
30
>>> next(even_iter)
32
>>> next(even_iter)
34
>>> next(even_iter)
36
>>> next(even_iter)
38
>>> next(even_iter)
40
>>> even_iter.send(24)
24
>>> next(even_iter)
26
>>> next(even_iter)
28
>>> next(even_iter)
30
>>> next(even_iter)
32
>>> even_iter.send(4)
4
>>> next(even_iter)
6
>>> next(even_iter)
8
>>> next(even_iter)
10
>>> next(even_iter)
12
A coroutine is a special type of function that can pause its execution, yield control back to the caller, and later resume where it left off, all while maintaining its internal state. Every Python generator pauses and resumes while preserving state; when a generator also accepts values sent into it via .send(), changing its internal state, we say it is behaving like a coroutine.
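To make this concrete, here is a small hypothetical coroutine-style generator (not one of the post's original examples, just an illustration): a running average that updates with every value sent in.

```python
def running_average():
    """Yield the running average of every value sent in so far."""
    total = 0.0
    count = 0
    average = None
    while True:
        value = yield average  # receive the next number via .send()
        total += value
        count += 1
        average = total / count

avg = running_average()
next(avg)            # prime the coroutine: advance it to the first yield
print(avg.send(10))  # 10.0
print(avg.send(20))  # 15.0
print(avg.send(30))  # 20.0
```

Note the priming step: you must advance to the first yield with next() (or .send(None)) before sending real values.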
Type Hints and Generators
Python supports type hints for generator functions, allowing you to indicate the types of values the generator yields, receives, and returns. This can improve code clarity and help tools like linters or IDEs provide better assistance.
The most common way to type a generator is by using the Generator type from the typing module:
from typing import Generator
def count_up_to(n: int) -> Generator[int, None, None]:
    count = 1
    while count <= n:
        yield count
        count += 1
The type signature Generator[int, None, None] breaks down as follows:
- int: the type of values the generator yields
- the first None: the type of values it accepts via .send()
- the second None: the type of its return value
Here’s another example using .send():
from typing import Generator
def echo() -> Generator[str, str, None]:
    while True:
        received = yield "Ready"
        print(f"Got: {received}")
This generator:
- yields str values ("Ready")
- receives str values via .send()
- never returns a meaningful value (None)
Simplified Forms: Iterator and Iterable
If your generator doesn't use .send() or return, and you're only interested in the values it yields, you can simplify the type hint using Iterator or Iterable:
from typing import Iterator
def squares(n: int) -> Iterator[int]:
    for i in range(n):
        yield i * i
Use Iterator when you're consuming values one-by-one (e.g. via next()), and Iterable when passing the generator to something like a for loop or list().
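Either way, the annotated generator is consumed like any other iterator:

```python
from typing import Iterator

def squares(n: int) -> Iterator[int]:
    for i in range(n):
        yield i * i

it = squares(4)
print(next(it))   # 0 -- consume one value at a time...
print(list(it))   # [1, 4, 9] -- ...or hand the remainder to list()
```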
Python 3.9+: Generics Without typing
If you're using Python 3.9 or later, you can skip typing.Iterator and subscript the abstract base classes in collections.abc directly. (The built-in iter is a function, not a type, so iter[int] doesn't work.)
from collections.abc import Iterator

def squares(n: int) -> Iterator[int]:
    for i in range(n):
        yield i * i
The typing.Iterator alias still works, but since 3.9 it has been deprecated in favor of collections.abc.Iterator, so the latter is the safer long-term choice.
🧠 2. When and Why to Use Generators
(With practical examples and performance notes)
So, when should you reach for a generator instead of building and returning a full list?
Here’s the rule of thumb I use:
If your function produces a sequence and you don't need all of it at once, use a generator.
Generators are ideal when:
- the data is large, streamed, or potentially unbounded
- the consumer may stop early
- you want to chain processing stages without intermediate lists
Let’s look at some real-world scenarios where generators shine.
🧭 Case Study 1: Maze Solving with Lazy Neighbors
Let’s walk through a real-world maze-solving problem — LeetCode #1926: Nearest Exit from Entrance in Maze.
Problem summary: Given a maze represented by a grid of walls ('+') and open paths ('.'), and a starting point inside the maze (the entrance), find the shortest number of steps to the nearest exit (any open path on the border of the maze, not equal to the entrance). Return -1 if no such path exists.
This is a variation of the classic shortest path problem and can be solved using Breadth-First Search (BFS). The only twist is how we handle neighbor discovery — and this is where generators shine.
The Core Algorithm
Let’s look at the BFS implementation:
from collections import deque
from typing import List, Tuple, Generator, Set
def nearestExit(maze: List[List[str]], entrance: List[int]) -> Tuple[int, List[Tuple[int, int]]]:
    rows = len(maze)
    columns = len(maze[0])
    entrance = tuple(entrance)
    queue = deque()
    visited = set()
    parent = {}  # child -> parent
    queue.append((entrance, 0))
    visited.add(entrance)
    while queue:
        location, distance = queue.popleft()
        for neighbor in neighbors(location, entrance, visited, maze, rows, columns):
            parent[neighbor] = location
            if isExit(neighbor, entrance, rows, columns):
                return distance + 1, reconstructPath(neighbor, entrance, parent)
            queue.append((neighbor, distance + 1))
    return -1, []
def isExit(
    location: Tuple[int, int],
    entrance: Tuple[int, int],
    rows: int,
    columns: int
) -> bool:
    if location == entrance:
        return False
    row, col = location
    return row == 0 or col == 0 or row == rows - 1 or col == columns - 1
def reconstructPath(exit_location: Tuple[int, int], entrance: Tuple[int, int], parent: dict) -> List[Tuple[int, int]]:
    path = [exit_location]
    while path[-1] != entrance:
        path.append(parent[path[-1]])
    path.reverse()
    return path
We track visited nodes and parents (to reconstruct the path), and we call neighbors(...) to explore valid adjacent tiles.
But how that neighbors(...) function is written has a huge impact.
🧱 The List-Based Version
Here’s one way to implement neighbors:
from typing import List, Tuple, Set
def neighbors(
    location: Tuple[int, int],
    entrance: Tuple[int, int],
    visited: Set[Tuple[int, int]],
    maze: List[List[str]],
    rows: int,
    columns: int
) -> List[Tuple[int, int]]:
    row, col = location
    neighbor_list = []
    if col > 0 and maze[row][col - 1] == '.':
        # MOVE LEFT
        neighbor = (row, col - 1)
        if neighbor != entrance and neighbor not in visited:
            visited.add(neighbor)
            neighbor_list.append(neighbor)
    if row > 0 and maze[row - 1][col] == '.':
        # MOVE UP
        neighbor = (row - 1, col)
        if neighbor != entrance and neighbor not in visited:
            visited.add(neighbor)
            neighbor_list.append(neighbor)
    if col < columns - 1 and maze[row][col + 1] == '.':
        # MOVE RIGHT
        neighbor = (row, col + 1)
        if neighbor != entrance and neighbor not in visited:
            visited.add(neighbor)
            neighbor_list.append(neighbor)
    if row < rows - 1 and maze[row + 1][col] == '.':
        # MOVE DOWN
        neighbor = (row + 1, col)
        if neighbor != entrance and neighbor not in visited:
            visited.add(neighbor)
            neighbor_list.append(neighbor)
    return neighbor_list
This version fully evaluates all four directions and stores all valid neighbors in a list before returning it.
Even if the very first neighbor leads directly to an exit, we still check the rest, add them to a list, and return that list.
⚙️ The Generator Version
With just a small change, we can convert neighbors() into a generator and yield neighbors one by one, lazily:
from typing import List, Tuple, Generator, Set
def neighbors(
    location: Tuple[int, int],
    entrance: Tuple[int, int],
    visited: Set[Tuple[int, int]],
    maze: List[List[str]],
    rows: int,
    columns: int
) -> Generator[Tuple[int, int], None, None]:
    row, col = location
    if col > 0 and maze[row][col - 1] == '.':
        # MOVE LEFT
        neighbor = (row, col - 1)
        if neighbor != entrance and neighbor not in visited:
            visited.add(neighbor)
            yield neighbor
    if row > 0 and maze[row - 1][col] == '.':
        # MOVE UP
        neighbor = (row - 1, col)
        if neighbor != entrance and neighbor not in visited:
            visited.add(neighbor)
            yield neighbor
    if col < columns - 1 and maze[row][col + 1] == '.':
        # MOVE RIGHT
        neighbor = (row, col + 1)
        if neighbor != entrance and neighbor not in visited:
            visited.add(neighbor)
            yield neighbor
    if row < rows - 1 and maze[row + 1][col] == '.':
        # MOVE DOWN
        neighbor = (row + 1, col)
        if neighbor != entrance and neighbor not in visited:
            visited.add(neighbor)
            yield neighbor
This version:
- yields each valid neighbor the moment it's found
- does no work for directions the caller never asks about
- keeps exactly the same structure as the list-based code
✅ Why the Generator Wins
Let’s say you’re two steps away from the nearest exit: if the first neighbor yielded is that exit, BFS returns immediately, and the remaining directions are never evaluated at all.
Benefits of the generator approach:
- no intermediate list is allocated on each BFS step
- evaluation stops at the first neighbor that ends the search
- the change from the list version is only a few lines
This kind of micro-optimization can scale really well in tight loops like pathfinding — and it doesn’t cost much to implement. That's the power of generators in action.
🌳 Case Study 2: In-Order Traversal of a Binary Tree
In this case study, we’ll look at in-order traversal of a binary tree, a foundational pattern in tree algorithms. In a binary tree, each node has at most two children (left and/or right), and in-order traversal means visiting the left subtree, then the current node, then the right subtree. If you need a refresher on binary trees or traversal orders, it’s worth a quick review before continuing.
1. Recursive Generator (cleanest, most Pythonic)
def inOrderTraversal(root):
    if not root:
        return
    if root.left:
        yield from inOrderTraversal(root.left)
    yield root.val
    if root.right:
        yield from inOrderTraversal(root.right)
This version is concise, readable, and doesn't build a list. It streams each value as needed — perfect for pipelines, filtering, or early termination.
2. Iterative Generator (no recursion, still lazy)
def inOrderTraversal(root):
    if not root:
        return
    node = root
    stack = [node]
    while node.left:
        stack.append(node.left)
        node = node.left
    while stack:
        node = stack.pop()
        yield node.val
        if node.right:
            node = node.right
            stack.append(node)
            while node.left:
                stack.append(node.left)
                node = node.left
Same behavior, but avoids recursion (useful for very deep trees). And it still yields values one at a time.
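To see the laziness pay off, here's a sketch using the iterative generator with a minimal, hypothetical TreeNode class (any node type with val, left, and right attributes would work the same way):

```python
import itertools

class TreeNode:
    """Minimal node type for illustration."""
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def inOrderTraversal(root):
    if not root:
        return
    node = root
    stack = [node]
    while node.left:
        stack.append(node.left)
        node = node.left
    while stack:
        node = stack.pop()
        yield node.val
        if node.right:
            node = node.right
            stack.append(node)
            while node.left:
                stack.append(node.left)
                node = node.left

#       4
#      / \
#     2   6
#    / \
#   1   3
root = TreeNode(4, TreeNode(2, TreeNode(1), TreeNode(3)), TreeNode(6))

# Take only the two smallest values; the rest of the tree is never visited.
print(list(itertools.islice(inOrderTraversal(root), 2)))  # [1, 2]
```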
3. Recursive List (eager + recursive)
def inOrderTraversal(root):
    if not root:
        return []
    traversal = []
    if root.left:
        traversal += inOrderTraversal(root.left)
    traversal += [root.val]
    if root.right:
        traversal += inOrderTraversal(root.right)
    return traversal
Works fine for small trees, but builds up a list even if you only care about the first few elements.
4. Iterative List (eager + looped)
def inOrderTraversal(root):
    if not root:
        return []
    traversal = []
    node = root
    stack = [node]
    while node.left:
        stack.append(node.left)
        node = node.left
    while stack:
        node = stack.pop()
        traversal.append(node.val)
        if node.right:
            node = node.right
            stack.append(node)
            while node.left:
                stack.append(node.left)
                node = node.left
    return traversal
A valid option, but memory-hungry and harder to plug into pipelines.
🧵 Generators Let You Compose
Another reason to use generators is that they fit naturally into composable data flows — you can yield, filter, and map without building up intermediates.
For example:
evens = (x for x in inOrderTraversal(root) if x % 2 == 0)
No need to collect the full traversal just to get the even numbers — you process as you go, and stop when you have enough.
🧠 Summary
Use a generator when:
- values are consumed one at a time
- the caller may stop early
- the data is large, streamed, or unbounded

Use a list-returning function when:
- you need random access, len(), or sorting
- you'll iterate the result more than once
- the data is small and eager computation is simpler
🔁 3. Iterative Generators
Diving deeper into patterns and real-world techniques
Once you understand the basics of generators, the next step is learning to think iteratively — structuring your code around what happens next, rather than building everything up front.
Generators let you write producer-style logic, where each value is yielded in sequence, and the consumer decides how much to consume.
Let’s walk through patterns and techniques that will make you a better generator-writer — and a more efficient problem solver.
🔄 Pattern 1: Rolling State
This is one of the simplest and most useful generator patterns: maintaining internal state and yielding it over time.
Example: Factorials
def factorials_upto(n):
    factorial = 1
    for i in range(1, n + 1):
        factorial *= i
        yield factorial
Each call to next() gives the next factorial — no lists, no recomputation, and no wasted memory.
🔁 Pattern 2: On-the-Fly Calculation with Early Exit
If you only need some of the data, generators let you get just that — and no more.
for f in factorials_upto(1000):
    if f > 100_000:
        break
    print(f)
With a list-based function, you’d compute and store all 1,000 factorials. With a generator, computation stops as soon as you’re done.
This is one of the cleanest forms of lazy evaluation: defer work until (and unless) it’s needed.
➿ Pattern 3: Streaming Recursive Structures
When you use yield from inside a recursive function, you unlock an elegant way to traverse trees, graphs, or nested data.
Example: In-Order Tree Traversal (recursive)
def inOrderTraversal(root):
    if not root:
        return
    if root.left:
        yield from inOrderTraversal(root.left)
    yield root.val
    if root.right:
        yield from inOrderTraversal(root.right)
This is a classic generator use case. It’s clean, avoids explicit stack management, and gives you one value at a time — ideal for filtering, mapping, or early termination.
📥 Pattern 4: Iterative with Stack
Generators also work well with iterative control flow. You can mimic recursive traversal patterns without blowing the stack.
Example: In-Order Tree Traversal (iterative)
def inOrderTraversal(root):
    if not root:
        return
    node = root
    stack = [node]
    while node.left:
        stack.append(node.left)
        node = node.left
    while stack:
        node = stack.pop()
        yield node.val
        if node.right:
            node = node.right
            stack.append(node)
            while node.left:
                stack.append(node.left)
                node = node.left
You still yield values one at a time, but this version is safe for very deep trees that would otherwise cause recursion depth errors.
🔃 Pattern 5: Generator Pipelines
You can combine multiple generators using generator expressions and yield from to create lightweight, composable pipelines:
def evens_only(iterable):
    for x in iterable:
        if x % 2 == 0:
            yield x

def squared(iterable):
    for x in iterable:
        yield x * x

for val in squared(evens_only(factorials_upto(10))):
    print(val)
Each stage is cleanly separated, memory-efficient, and easy to test. No intermediate lists required. This kind of composability is part of what makes generator-based architecture so scalable and elegant.
🧵 Bonus Tip: Wrap Iteration in Functions
To avoid deeply nested generator expressions, it’s often cleaner to encapsulate logic in named generator functions like above. You gain readability, modularity, and reuse — all without losing laziness.
🧠 Recap: What Makes Iterative Generators Powerful?
- Rolling state without storing history
- Early exit with no wasted computation
- Stack-based traversal that sidesteps recursion limits
- Composable, testable pipeline stages
In the next section, we’ll zoom in on recursive generators: where things get even more powerful — and a little trickier.
🌲 4. Recursive Generators
Generator-based tree/graph traversal and backtracking
Recursive generators are one of those Python features that feel a little like magic — and a lot like clarity.
They let you write elegant, readable recursive logic while preserving the memory efficiency and control of generators. You can stream values from deep structures like trees, graphs, or nested lists without building full results in memory.
Let’s walk through some classic recursive patterns that become cleaner with yield and yield from.
🌿 Tree Traversal with yield from
We’ve already seen how you can do in-order traversal of a binary tree using recursion. But the beauty of the generator-based version is that you don’t need to think about the traversal list at all.
In-Order Traversal (Recursive Generator)
def inOrderTraversal(root):
    if not root:
        return
    if root.left:
        yield from inOrderTraversal(root.left)
    yield root.val
    if root.right:
        yield from inOrderTraversal(root.right)
Compare this to the list-building version:
def inOrderTraversal(root):
    if not root:
        return []
    return (
        inOrderTraversal(root.left)
        + [root.val]
        + inOrderTraversal(root.right)
    )
This looks clean — but builds a full list, even if the caller only wants the first element.
With the generator, the caller can short-circuit, filter, or transform on the fly.
🧭 Backtracking with Yield
Now let’s look at a backtracking problem — finding all paths from a start node to an end node in a graph. Instead of accumulating paths in a result list, we’ll yield them as we go.
def all_paths(graph, start, end, path=None):
    if path is None:
        path = [start]
    else:
        path = path + [start]
    if start == end:
        yield path
        return
    for neighbor in graph.get(start, []):
        if neighbor not in path:  # avoid cycles
            yield from all_paths(graph, neighbor, end, path)
This function lazily explores all paths, yielding one at a time. You can stop early if you find what you're looking for — no need to wait for the full result set to be built.
# Example graph
graph = {
    'A': ['B', 'C'],
    'B': ['C', 'D'],
    'C': ['D'],
    'D': []
}

for path in all_paths(graph, 'A', 'D'):
    print(path)
Why this matters: With a list-returning version, you'd generate all paths before returning. With a generator, you can:
- stop at the first path that satisfies your criteria
- filter or transform paths as they arrive
- keep memory proportional to one path, not all of them
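Stopping at the first path then takes a single next() call (a sketch, repeating the graph and generator from above):

```python
def all_paths(graph, start, end, path=None):
    path = [start] if path is None else path + [start]
    if start == end:
        yield path
        return
    for neighbor in graph.get(start, []):
        if neighbor not in path:  # avoid cycles
            yield from all_paths(graph, neighbor, end, path)

graph = {'A': ['B', 'C'], 'B': ['C', 'D'], 'C': ['D'], 'D': []}

# next() pulls exactly one path; no further paths are explored.
first = next(all_paths(graph, 'A', 'D'), None)
print(first)  # ['A', 'B', 'C', 'D']
```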
♻️ Recursive Composability
Generators also let you write recursive functions that are composable — one generator can yield from another without needing to know anything about its structure.
This makes it easy to build modular logic for:
- tree and graph traversals
- flattening nested data structures
- backtracking searches that yield candidate solutions
⚖️ Trade-Offs
Recursive generators are powerful, but there are a few things to be mindful of:
- Python's recursion limit still applies to deeply nested structures
- each level of yield from adds a small delegation overhead
- a partially consumed generator holds suspended frames, which can make state harder to debug
Still, for problems that fit the search-and-yield pattern, recursive generators are often the clearest and most memory-friendly solution.
⚖️ 5. Alternatives and Tradeoffs
Lists, iterators, coroutines, and async — when not to use a generator
Generators are powerful — but they’re not always the best tool for the job. Sometimes, you want all your data up front. Other times, you need more control over concurrency or data flow than a generator provides.
Let’s walk through the main alternatives and their tradeoffs.
🧺 Lists: Eager, Simple, Repeatable
Sometimes you just want a list.
def get_numbers():
    return [1, 2, 3, 4, 5]
Lists are:
- eager: the whole result exists before you use it
- indexable, sliceable, and reusable across multiple passes
- simple to reason about

Use a list-returning function when:
- the data comfortably fits in memory
- you need random access, len(), or sorting
- you'll iterate the result more than once
Tradeoff: You pay up front in both memory and computation. For large data or early-exit scenarios, this can be wasteful.
🔁 Iterators: Lightweight, but Limited
An iterator is any object that implements __iter__() and __next__() — including generators, but also file objects and custom classes. (Note that range() itself is an iterable, not an iterator; iter(range(n)) gives you the iterator.)
class CountUpTo:
    def __init__(self, n):
        self.current = 1
        self.n = n

    def __iter__(self):
        return self

    def __next__(self):
        if self.current > self.n:
            raise StopIteration
        val = self.current
        self.current += 1
        return val
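Because CountUpTo implements the iterator protocol, it drops straight into for loops and list(). One difference from generator functions is worth noting: re-iterating the same exhausted instance yields nothing.

```python
class CountUpTo:
    def __init__(self, n):
        self.current = 1
        self.n = n

    def __iter__(self):
        return self

    def __next__(self):
        if self.current > self.n:
            raise StopIteration
        val = self.current
        self.current += 1
        return val

print(list(CountUpTo(3)))  # [1, 2, 3]

# __iter__ returns self, so the instance and its iterator share state:
# once exhausted, iterating again produces nothing.
counter = CountUpTo(2)
print(list(counter))  # [1, 2]
print(list(counter))  # []
```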
Custom iterators are useful when you need:
- extra methods or attributes alongside iteration
- state you can inspect or reset from outside
- an object-oriented API that also happens to be iterable
But for most use cases, a generator is simpler, shorter, and easier to maintain.
🧶 Coroutines (yield with send())
Python generators can do more than yield — they can receive data too.
def echo():
    while True:
        received = yield
        print(f"Got: {received}")
When a generator contains a yield assignment expression like the one above, we say it is acting like a coroutine: it supports .send() in addition to next(). This pattern is rare but powerful, often used in:
- cooperative multitasking and schedulers
- event- or stream-processing pipelines
- message-passing designs where stages feed values back upstream
Tradeoff: It’s more complex, less readable, and has a steeper learning curve. Most devs won’t need this unless they’re building a message-passing or stream-processing system.
🕸️ Async Generators: I/O-Friendly, Concurrent
With async def and async for, you can write asynchronous generators that yield values while awaiting I/O:
async def fetch_pages(urls):
    for url in urls:
        page = await fetch(url)
        yield page
This is the tool to reach for when:
- each value involves I/O you can await (network, disk, database)
- you want to overlap waiting with processing
- you're already inside an asyncio-based application
Tradeoff: Async requires more scaffolding (asyncio, event loops) and doesn’t mix easily with synchronous code. It’s the right choice for large-scale async workloads, but not for most data-processing loops.
🧠 Summary

| Tool | Best for | Main tradeoff |
|------|----------|---------------|
| Generator | Lazy, streamed, composable sequences | Single pass; no random access |
| List | Small data, random access, reuse | Memory and computation up front |
| Iterator class | Custom iteration state and extra methods | More boilerplate |
| Generator acting as coroutine | Two-way data flow via .send() | Complexity; steeper learning curve |
| Async generator | Concurrent, I/O-bound streams | Requires async scaffolding |
In short:
Generators are your go-to when you want streamed data, composable logic, or resource-aware processing. But if you need random access, eager computation, or concurrency — reach for something else.
🧠 6. Generators and Functional Thinking
Lazy evaluation, composability, and functional elegance in Python
Even if you don’t consider yourself a functional programmer, Python's generators encourage you to think in ways that are deeply functional — in the best sense of the word.
🔍 What is functional programming? At its core, functional programming is about writing code that’s declarative, composable, and side-effect-free. It favors small, pure functions — functions that take inputs, return outputs, and don’t rely on or modify shared state.
In Python, you've seen hints of this style with map(), filter(), and list comprehensions. Generators extend this philosophy: they let you express lazy, stream-like flows of data without managing memory or side effects directly.
Generators support:
- lazy evaluation: nothing is computed until it's requested
- composition: one generator can feed another
- a declarative style: you describe the stream, not the bookkeeping
Let’s look at what this looks like in practice.
🔁 Chaining Operations
In functional programming, you often chain transformations — mapping, filtering, reducing. You can do this with Python generators in a cleaner, more readable way than deeply nesting map()/filter() calls.
def evens_only(iterable):
    for x in iterable:
        if x % 2 == 0:
            yield x

def squared(iterable):
    for x in iterable:
        yield x * x

for val in squared(evens_only(range(10))):
    print(val)
Each piece is:
- small and single-purpose
- free of side effects
- independently testable and reusable
You can plug in new steps without changing the surrounding structure. That’s functional composition in practice.
📦 Generator Expressions as Functional Primitives
Python’s generator expressions are deeply aligned with functional ideas:
squares = (x * x for x in range(1_000_000))
No list is created. No memory is wasted. Just a stream of values that can be consumed one by one.
You can take slices of it with itertools.islice, map over it, or pass it to sum() or max(); Python won't compute a single value until asked. (Generator objects don't support [] slicing directly.)
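For example (using itertools.islice for the slicing, since generator objects don't support [] slicing directly):

```python
import itertools

squares = (x * x for x in range(1_000_000))

# islice lazily takes the first five squares; the rest are never computed.
first_five = list(itertools.islice(squares, 5))
print(first_five)  # [0, 1, 4, 9, 16]

# sum() consumes values one at a time, never building an intermediate list.
total = sum(x * x for x in range(10))
print(total)  # 285
```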
🔬 Thinking Declaratively
Functional code is often declarative: it describes what should happen, not how to manually do it. Generators support this mindset beautifully.
def fibonacci(n):
    a, b = 0, 1
    for _ in range(n):
        yield b
        a, b = b, a + b
This doesn’t collect values or manage lists — it just declares the process. The consumer controls how many values to take, when, and how to use them.
🔄 Stateless, Side-Effect-Free Iteration
Generators yield values without mutating external state or holding onto large data structures. This makes them inherently side-effect-free and predictable — ideal traits for clean, testable code.
Even in recursion-heavy problems (like traversing trees or solving mazes), generator-based solutions are often easier to reason about and safer to run.
🧩 Where This Leads
Thinking in generators helps you write code that is:
- lazy by default
- composable into pipelines
- predictable in memory use
Even if you never dive deep into Haskell or Clojure, writing “generator-minded” Python gives you many of the same benefits.
Up next: let’s wrap things up with a few closing thoughts on how to make this mindset part of your everyday coding.
✨ 7. Closing Thoughts
How to write more “generator-minded” code
Generators aren’t just a Python feature — they’re a way of thinking. Once you’ve worked with them a bit, you’ll start to see opportunities for them everywhere.
Writing “generator-minded” code means thinking about flows instead of collections, process instead of result, and efficiency without complexity. It’s a subtle shift, but one with real impact.
Ask yourself:
- Do I really need the whole result at once?
- Could the consumer stop early?
- Could this be a stream instead of a collection?
Often, the answer is yes.
💡 Generator Thinking Leads to Better Design
Even when you don’t use yield, a generator mindset helps you:
- separate producing data from consuming it
- design APIs that scale from ten items to ten million
- keep functions small, lazy, and composable
Generators teach you to trust the consumer of your code — to deliver value one piece at a time, just when it’s needed. That’s not just efficient — it’s considerate software design.
🎯 One Final Thought
Python’s yield is a small keyword, but it teaches a big lesson:
You don’t have to do everything at once. You just have to do the next thing well.
That’s true of good code — and good systems thinking in general.