2025, Nov 19 23:00

Stop Chasing Function Inheritance in Python: Reuse Helpers via Classes or functools.partial Across Modules

Learn why Python has no function inheritance and how to reuse shared helpers across modules with classes or functools.partial, with worked examples that avoid duplication.

When you split a Python library across several modules, it’s tempting to keep each file laser-focused on a single function and then expose a family of related helpers built around that function. The friction appears when you want those helpers to be defined once, yet automatically bound to the per-file function so users can call them as if they were native to each module. The idea sounds like “function inheritance”, plus partial application, rolled into one. In Python, that combination doesn’t happen implicitly, and understanding why will save you time and duplication.

Minimal setup that triggers the question

Imagine two modules, each holding one core function, and a separate utility that multiplies the result of a passed-in function by a factor:

# module_one.py

def add_pair(u, v):
    return u + v

# module_two.py

def diff_pair(u, v):
    return u - v

# utils.py

def scale_apply(fn, x, y, z):
    return fn(x, y) * z

The goal is to avoid repeating any glue code, while still enabling calls like module_one.scale_apply(x, y, z) and module_two.scale_apply(x, y, z), where the first argument to scale_apply is implicitly taken from the module’s own core function.
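
Without a reuse mechanism, the obvious workaround is to hand-write a thin wrapper in each module, which is exactly the duplication to avoid. A minimal, self-contained sketch of that naive glue (here `scale_apply_generic` stands in for `utils.scale_apply`, and the wrapper name mirrors the desired call shape):

```python
# The shared helper, as it would live in utils.py.
def scale_apply_generic(fn, x, y, z):
    return fn(x, y) * z

# module_one.py's core function.
def add_pair(u, v):
    return u + v

# The glue module_one.py would need...
def scale_apply(x, y, z):
    return scale_apply_generic(add_pair, x, y, z)

# ...and module_two.py would repeat the same wrapper with diff_pair,
# so the duplicated glue grows with every new module.
print(scale_apply(1, 2, 3))  # (1 + 2) * 3 = 9
```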

Why this feels harder than it should

The impulse is to mix two ideas: inheritance and partial function binding. Both exist, but they don’t integrate automatically. Inheritance in Python is a class-only mechanism. If you try to design this with functions alone, there’s no hierarchy to “inherit” from. With functions, you can partially bind arguments so you don’t repeat yourself, but you still have to perform that binding explicitly.
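
You can see that functions offer no hierarchy by trying to subclass the function type directly; in CPython this fails at class-creation time. A quick illustrative check:

```python
# Functions are instances of the built-in function type, which
# cannot be used as a base class -- there is nothing to "inherit" from.
FunctionType = type(lambda: None)

try:
    class Derived(FunctionType):
        pass
except TypeError as exc:
    # CPython rejects function as a base type.
    print(exc)
```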

Two ways forward

The first route uses classes. You define a base with a generic method that delegates to a subclass-defined method. Each subclass supplies the per-file operation. This provides the inheritance you were aiming for, but via classes rather than functions:

class CoreBase:
    def mul_then(self, p, q, r):
        return self.core_op(p, q) * r

class AddOps(CoreBase):
    def core_op(self, p, q):
        return p + q

class SubOps(CoreBase):
    def core_op(self, p, q):
        return p - q

adder = AddOps()
subber = SubOps()

print(adder.mul_then(1, 2, 3))   # (1 + 2) * 3 = 9
print(subber.mul_then(1, 2, 3))  # (1 - 2) * 3 = -3

The second route stays purely functional. You keep the single implementation of the helper and create per-module callables by partially binding the first parameter to the module’s function. There’s no inheritance here, so nothing is automatic; you bind explicitly, once:

from functools import partial

def sum_two(a, b):
    return a + b

def minus_two(a, b):
    return a - b

def apply_factor(g, a, b, c):
    return g(a, b) * c

apply_with_sum = partial(apply_factor, sum_two)
apply_with_diff = partial(apply_factor, minus_two)

print(apply_with_sum(1, 2, 3))   # (1 + 2) * 3 = 9
print(apply_with_diff(1, 2, 3))  # (1 - 2) * 3 = -3
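
To recover the exact call shape from the original setup, each module binds its own core function once at import time. A self-contained sketch (with the shared helper inlined here in place of `utils.scale_apply`):

```python
from functools import partial

# The single shared implementation (this would live in utils.py).
def scale_apply_shared(fn, x, y, z):
    return fn(x, y) * z

# Inside module_one.py: the module's core function...
def add_pair(u, v):
    return u + v

# ...and one explicit binding. This line is the only per-module glue;
# callers can now write module_one.scale_apply(x, y, z) as intended.
scale_apply = partial(scale_apply_shared, add_pair)

print(scale_apply(1, 2, 3))  # (1 + 2) * 3 = 9
```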

What to pick and why it matters

If you want polymorphism with a clean, extensible shape, classes make the delegation explicit and let you reuse the same orchestration method across variants. If you want a functional feel and you’re okay with binding once per module, partials keep you from duplicating logic while preserving a simple call style. The key is to avoid copying helper functions into each file, and both patterns deliver that: either by subclassing or by binding a single shared function.
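
The class route’s extensibility is easy to demonstrate: a new variant needs only its own core_op, while the orchestration method is inherited unchanged. A hypothetical MulOps added alongside the earlier subclasses:

```python
class CoreBase:
    def mul_then(self, p, q, r):
        return self.core_op(p, q) * r

# A hypothetical new variant: only core_op is written; mul_then is inherited.
class MulOps(CoreBase):
    def core_op(self, p, q):
        return p * q

print(MulOps().mul_then(2, 3, 4))  # (2 * 3) * 4 = 24
```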

Practical takeaways

There isn’t a built-in “function inheritance” mechanism in Python. If you need the behavior of inheritance, reach for classes and let a base method call a per-subclass implementation. If you prefer functions, use functools.partial to bind the module’s core function to a single shared helper. In both paths you centralize the behavior, minimize repetition, and keep the public surface consistent across files.