Functions & Callables (critical)

Generators & yield

Functions that use yield to produce values lazily one at a time, suspending execution between calls.

Memory anchor

A generator is a lazy vending machine — it only makes the next snack when you press the button (next()). It remembers which slot it's on, and once it's empty, it's done forever. No refills.

Expected depth

Calling a generator function returns a generator object without executing the body. Execution runs up to the next yield on each next() call and suspends there. StopIteration is raised when the function returns. Generator expressions (x for x in ...) are the lazy equivalent of list comprehensions — constant memory regardless of dataset size.
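A minimal sketch of the lifecycle described above (generator names here are illustrative):

```python
def countdown(n):
    """Generator function: calling it does NOT run the body."""
    while n > 0:
        yield n        # suspend here; resume on the next next() call
        n -= 1

gen = countdown(3)     # returns a generator object; nothing executed yet
print(next(gen))       # runs up to the first yield -> 3
print(next(gen))       # resumes after yield -> 2
print(next(gen))       # -> 1
# A fourth next(gen) raises StopIteration: the function body returned.

# Generator expression: the lazy equivalent of a list comprehension.
squares = (x * x for x in range(1_000_000))  # constant memory, nothing computed yet
print(next(squares))   # -> 0
```

Note that `for` loops and `list()` call `next()` for you and swallow the final StopIteration.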

Deep — senior internals

Generator state is stored in a frame object — local variables, the instruction pointer, and the value stack are all preserved between yields. send(value) resumes execution AND passes a value back into the generator (yield becomes an expression that evaluates to the sent value). throw(exc) injects an exception at the yield point. yield from sub_gen delegates to a sub-generator and properly propagates send/throw/close. Generators are the foundation of Python coroutines — async def coroutines reuse the same suspend/resume machinery, layered under the event loop.
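The send() and yield from behavior can be sketched like this (accumulator is an illustrative name, not a stdlib API):

```python
def accumulator():
    """yield is an expression here; send() supplies its value."""
    total = 0
    while True:
        value = yield total   # suspend; resume with whatever send() passes in
        total += value

acc = accumulator()
next(acc)                 # "prime" the generator: run to the first yield
print(acc.send(10))       # resumes, total becomes 10, yields 10
print(acc.send(5))        # -> 15
acc.close()               # raises GeneratorExit at the yield point

def sub():
    yield 1
    yield 2

def outer():
    yield 0
    yield from sub()      # delegation: next/send/throw flow through to sub()
    yield 3

print(list(outer()))      # -> [0, 1, 2, 3]
```

Forgetting the priming next() is a classic bug: send() on a just-created generator raises TypeError because there is no suspended yield to receive the value.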

🎤 Interview-ready answer

Generators produce values lazily — execution suspends at each yield and resumes on next(). This gives constant memory usage regardless of data size, which is critical for large files or streams. send() can pass values back in (making them coroutines). yield from delegates to a sub-generator. Generators are exhausted after one pass — create a new generator object to re-iterate.

Common trap

Generators are single-use. list(gen) exhausts it — a second list(gen) returns []. Also, generator expressions aren't evaluated until iterated — the iterable they reference is evaluated at creation, but the body runs lazily.
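Both traps in a short sketch (variable names are illustrative):

```python
gen = (n * n for n in range(3))
print(list(gen))      # -> [0, 1, 4]
print(list(gen))      # -> [] : exhausted, not restarted

# The iterable a genexp references is evaluated at creation time,
# but the body only runs when you iterate.
data = [1, 2, 3]
doubled = (x * 2 for x in data)   # iter(data) is captured NOW
data = [10, 20]                   # rebinding the name later has no effect
print(list(doubled))              # -> [2, 4, 6]
```

To re-iterate, call the generator function again (or rebuild the expression); there is no rewind.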

Related concepts