Functools Module
partial, lru_cache, reduce, wraps — Python’s toolkit for working with functions as first-class objects.
"functools is Python's toolbox for working with functions as objects — creating specialised versions, caching results, and reducing sequences to single values."
— ShurAI

What is functools?
The functools module is part of Python’s standard library. It provides tools that treat functions as first-class objects — you can pre-fill their arguments, cache their results, or combine them:
from functools import partial, reduce, lru_cache, wraps
partial — Pre-fill Function Arguments
partial creates a new function with some arguments already filled in. Great for specialising generic functions:
from functools import partial
def power(base, exponent):
    return base ** exponent
# Create specialised versions by pre-filling one argument
square = partial(power, exponent=2)
cube = partial(power, exponent=3)
print(square(5)) # 25
print(cube(4)) # 64
# Works great with built-ins too
from functools import partial
from_binary = partial(int, base=2)  # int() with base=2 parses binary strings
print(from_binary("1010")) # 10
print(from_binary("1111")) # 15
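partial also pre-fills positional arguments (from the left), and the resulting object exposes what it wraps. A small sketch with a hypothetical greet function:

```python
from functools import partial

def greet(greeting, name):
    return f"{greeting}, {name}!"

# Positional arguments are pre-filled from the left
say_hello = partial(greet, "Hello")
print(say_hello("Ada"))  # Hello, Ada!

# A partial object remembers the function and arguments it wraps
print(say_hello.func is greet)  # True
print(say_hello.args)           # ('Hello',)
print(say_hello.keywords)       # {}
```

Inspecting `.func`, `.args`, and `.keywords` is handy when debugging code that passes partials around as callbacks.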
lru_cache — Remember Expensive Results
lru_cache (Least Recently Used cache) stores a function’s return values so repeated calls with the same arguments return instantly, without re-running the function:
from functools import lru_cache
@lru_cache(maxsize=128) # cache up to 128 unique results
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)
print(fibonacci(35)) # 9227465 (fast: each subproblem is computed only once)
print(fibonacci(35)) # returned from cache, not recalculated
# See cache stats
print(fibonacci.cache_info())
# CacheInfo(hits=..., misses=36, maxsize=128, currsize=36)
Without the cache, fibonacci recalculates the same subproblems over and over; with lru_cache each value is computed once, and every repeated call returns instantly from the cache.
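Two cache controls worth knowing: maxsize=None makes the cache unbounded (entries are never evicted), and cache_clear() empties it, useful when the underlying data changes. A small sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded: entries are never evicted
def factorial(n):
    return 1 if n < 2 else n * factorial(n - 1)

print(factorial(10))  # 3628800
print(factorial.cache_info().currsize)  # 10 cached entries (n = 1..10)

factorial.cache_clear()  # empty the cache, e.g. when inputs go stale
print(factorial.cache_info().currsize)  # 0
```

On Python 3.9+, functools.cache is shorthand for lru_cache(maxsize=None).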
reduce — Collapse a Sequence to One Value
reduce(func, sequence) applies a function cumulatively to reduce a sequence to a single result:
from functools import reduce
nums = [1, 2, 3, 4, 5]
# Sum: 1+2 → 3+3 → 6+4 → 10+5 = 15
total = reduce(lambda a, b: a + b, nums)
print(total) # 15
# Product: 1*2 → 2*3 → 6*4 → 24*5 = 120
product = reduce(lambda a, b: a * b, nums)
print(product) # 120
# Find the maximum without using max()
biggest = reduce(lambda a, b: a if a > b else b, nums)
print(biggest) # 5
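reduce also accepts an optional initializer as a third argument; it seeds the accumulator before the first element is processed, which also makes reduce safe on an empty sequence:

```python
from functools import reduce

nums = [1, 2, 3, 4, 5]

# The initializer (100) is combined first: 100+1 → 101+2 → ... = 115
total_plus = reduce(lambda a, b: a + b, nums, 100)
print(total_plus)  # 115

# With an empty sequence the initializer is simply returned;
# without one, reduce on an empty sequence raises TypeError
print(reduce(lambda a, b: a + b, [], 0))  # 0
```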
wraps — Fix Decorator Metadata
When you write a decorator, the wrapper function replaces the original, so introspection sees the wrapper's __name__ and docstring instead of the original's. @wraps copies the original function's metadata onto the wrapper:
import time
from functools import wraps

def timer(func):
    @wraps(func)  # copies __name__, __doc__, etc. from func
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.perf_counter() - start:.6f}s")
        return result
    return wrapper
@timer
def add(a, b):
    """Add two numbers."""
    return a + b
print(add.__name__) # 'add' (would be 'wrapper' without @wraps)
print(add.__doc__) # 'Add two numbers.'
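For contrast, here is the same pattern without @wraps (a hypothetical plain_decorator): the wrapper's own metadata leaks through.

```python
def plain_decorator(func):
    def wrapper(*args, **kwargs):  # no @wraps here
        return func(*args, **kwargs)
    return wrapper

@plain_decorator
def multiply(a, b):
    """Multiply two numbers."""
    return a * b

print(multiply.__name__)  # 'wrapper': the original name is lost
print(multiply.__doc__)   # None: the docstring is gone too
```

@wraps also sets a __wrapped__ attribute on the wrapper, so the original function stays reachable for debugging and introspection.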
Real Example — Cached API Fetcher
from functools import lru_cache

@lru_cache(maxsize=64)
def get_country_info(country_code: str) -> dict:
    """Fetch country data. Result is cached after first call."""
    data = {
        "IN": {"name": "India", "capital": "New Delhi"},
        "JP": {"name": "Japan", "capital": "Tokyo"},
        "DE": {"name": "Germany", "capital": "Berlin"},
    }
    return data.get(country_code, {})
# First call: fetches data
print(get_country_info("IN")) # {'name': 'India', 'capital': 'New Delhi'}
# Second call: returned instantly from cache
print(get_country_info("IN")) # same result, no re-fetch
print(get_country_info.cache_info()) # hits=1, misses=1
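partial pairs naturally with a cached fetcher: pre-fill the country code to get a zero-argument accessor. A self-contained sketch (the small data table is illustrative); note that lru_cache returns the same object on every hit, so cached results should be treated as read-only:

```python
from functools import lru_cache, partial

@lru_cache(maxsize=64)
def fetch_country(country_code: str) -> dict:
    data = {
        "IN": {"name": "India", "capital": "New Delhi"},
        "JP": {"name": "Japan", "capital": "Tokyo"},
    }
    return data.get(country_code, {})

# Pre-fill the argument: a zero-argument, cached accessor
get_india = partial(fetch_country, "IN")
print(get_india())  # {'name': 'India', 'capital': 'New Delhi'}

# Every cache hit returns the *same* dict object,
# so mutating a result would corrupt the cache for all callers
print(fetch_country("JP") is fetch_country("JP"))  # True
```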
"lru_cache is one of the easiest wins in Python. Add two lines to a slow recursive or repeated function and it can go from seconds to microseconds."
— ShurAI

🧠 Quiz — Q1
What does partial(power, exponent=2) create?
🧠 Quiz — Q2
When should you use @lru_cache?
🧠 Quiz — Q3
What does reduce(lambda a, b: a * b, [1, 2, 3, 4]) return?
🧠 Quiz — Q4
Why should you add @wraps(func) inside a decorator's wrapper?