2025, Dec 06 07:00
Ship portable pybind11 C++ extensions on Linux: build manylinux wheels with auditwheel, avoid static glibc
Learn the right way to ship pybind11 C++ on Linux: build manylinux wheels, fix with auditwheel, avoid static glibc, and ensure compatibility across distros.
Shipping a fast C++ extension built with pybind11 is easy on Windows when you go fully static. On Linux, though, you will not get the same result by statically linking everything into a single .so. The Python packaging ecosystem has a standard way to deliver binary extensions that actually work across distributions and glibc versions, and it is different from “just link everything static.”
What you are really solving
The goal is to publish a binary Python extension that runs reliably across Debian, CentOS and other distributions over multiple years. On Linux, stability comes from targeting a defined glibc baseline and a controlled set of shared libraries, not from fully static linking of glibc. In Python terms, that means building a wheel that matches the platform compatibility rules defined by PEP 599 and PEP 600 and then letting pip resolve the right artifact via tags.
A minimal, correct build flow
The reference approach is to build inside the official manylinux images. They encode a glibc baseline and the set of allowed shared libraries for broad compatibility, and they already ship multiple CPython versions. A practical modern target is manylinux_2_28, based on AlmaLinux 8, which covers a wide range of contemporary distros.
Inside the container you compile wheels using the Pythons located under /opt/python. Then you validate and fix up the binary with auditwheel, which checks the wheel against the manylinux policy (PEP 599 for manylinux2014, PEP 600 for the versioned tags), vendors any external shared libraries the policy permits, and rewrites the wheel’s platform tag appropriately.
/opt/python/cp311-cp311/bin/pip wheel -w wheelhouse .
auditwheel repair --plat manylinux_2_28_x86_64 wheelhouse/*.whl
The outcome is a wheel tagged for manylinux_2_28 that pip can select via standard platform tags. If you must stretch back to older operating systems and your C++ code builds with the older compiler toolchain, consider the manylinux2014 images to span roughly the last decade.
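The per-interpreter loop implied by the commands above can be sketched in Python. The helper below only assembles the command lines; the /opt/python layout and the cp3* directory pattern follow the manylinux image convention, and the function itself is illustrative, not part of any tool.

```python
from pathlib import Path

def build_commands(python_root="/opt/python", out="wheelhouse",
                   plat="manylinux_2_28_x86_64"):
    """Sketch: assemble one `pip wheel` command per CPython found
    under python_root, plus a final `auditwheel repair` pass.
    Note: the *.whl glob is expanded by the shell; if you run these
    via subprocess, expand it yourself with glob.glob()."""
    cmds = [[str(py / "bin" / "pip"), "wheel", "-w", out, "."]
            for py in sorted(Path(python_root).glob("cp3*"))]
    cmds.append(["auditwheel", "repair", "--plat", plat, f"{out}/*.whl"])
    return cmds
```

In a CI script you would then run each entry with subprocess.run(cmd, check=True) inside the container.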
Why a fully static .so is the wrong target
Static linking of glibc is strongly discouraged on Linux. It does not behave the way many expect: even a “static” glibc still dlopens shared pieces at runtime (NSS modules for name and user lookups, for example), which makes it a common source of subtle breakage. In contrast, linking against a reasonably old shared glibc produces a .so that runs across a wide range of glibc-based systems. Everything else can be linked statically if you prefer, and you will mostly pay in file size. Making the C and C++ runtimes static (via -static-libgcc and -static-libstdc++) is possible, but getting it right in practice can be non-trivial.
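A quick way to see what your built .so actually pulls in is to run ldd on it and compare the result to the manylinux allow-list; this is essentially the check auditwheel automates. A minimal sketch, using an illustrative subset of the allowed libraries (the authoritative list ships with auditwheel's policy files):

```python
# Illustrative subset of the libraries the manylinux policy permits a
# wheel to link against dynamically; the real list lives in auditwheel.
ALLOWED = {"libc.so.6", "libm.so.6", "libpthread.so.0", "libdl.so.2",
           "libstdc++.so.6", "libgcc_s.so.1", "librt.so.1"}

def disallowed_deps(ldd_output, allowed=ALLOWED):
    """Parse `ldd` output and return shared libraries that auditwheel
    would have to vendor into the wheel (or reject)."""
    bad = []
    for line in ldd_output.splitlines():
        stripped = line.strip()
        name = stripped.split()[0] if stripped else ""
        if name.startswith("lib") and name not in allowed:
            bad.append(name)
    return bad
```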
If you want a truly static runtime for a separate native executable and you have the freedom to switch technologies, you can link against Musl via zig cc or build a Go binary, which defaults to static linking. That is a different deployment model than a Python extension, but it is a viable route when you control the language and runtime stack.
Compatibility, tags and what pip does with them
Wheels carry platform tags that encode the ABI and libc baseline. PEP 599 and PEP 600 describe which glibc versions to target and which libraries are allowed to be linked to achieve broad compatibility. pip uses these tags to pick an artifact matched to the environment, so your users on Debian or CentOS will get the correct wheel without manual intervention.
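Under PEP 600 the tag itself encodes the glibc baseline: manylinux_2_28 means “needs glibc >= 2.28”, and the older named tags are aliases (manylinux2014 is manylinux_2_17 per PEP 599, manylinux2010 is manylinux_2_12, manylinux1 is manylinux_2_5). The selection rule pip applies boils down to a version comparison; the tag names below are real, the helpers themselves are a sketch:

```python
# PEP 600 aliases for the legacy named tags (PEP 513 / 571 / 599).
LEGACY = {"manylinux1": (2, 5), "manylinux2010": (2, 12),
          "manylinux2014": (2, 17)}

def glibc_baseline(tag):
    """Extract the (major, minor) glibc requirement from a platform tag."""
    name = tag.rsplit("_x86_64", 1)[0]
    if name in LEGACY:
        return LEGACY[name]
    _, major, minor = name.split("_")
    return (int(major), int(minor))

def wheel_installable(tag, system_glibc):
    """pip accepts a manylinux wheel when the system glibc is at least
    the baseline the tag encodes."""
    return glibc_baseline(tag) <= system_glibc
```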
Older systems and practical build strategy
A simple, effective rule of thumb is to build on or for the oldest OS you want to support. The manylinux images exist precisely to standardize this baseline. If your compatibility requirements stretch far back and your code compiles with older toolchains, manylinux2014 is the fallback. For more surgical decisions, you can look up glibc versions for various distributions on DistroWatch and align your target.
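That rule of thumb can be phrased mechanically: find the oldest glibc among your target distros, then take the newest manylinux image whose baseline it still meets. A sketch with the two images discussed here (the baselines are the real ones; the helper is illustrative):

```python
# glibc baselines of the two manylinux images discussed above.
IMAGES = {"manylinux2014": (2, 17), "manylinux_2_28": (2, 28)}

def pick_image(oldest_target_glibc, images=IMAGES):
    """Return the newest image whose glibc baseline the oldest target
    system still satisfies, or None if even manylinux2014 is too new."""
    ok = [name for name, base in images.items()
          if base <= oldest_target_glibc]
    return max(ok, key=lambda n: images[n]) if ok else None
```

For example, CentOS 7 ships glibc 2.17, so pick_image((2, 17)) lands on manylinux2014.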
An alternative for Python version coverage
If cutting down the number of builds matters more than a specific glibc baseline, you can switch to nanobind and target Python's stable ABI (supported from CPython 3.12 onward), so a single build covers several Python versions. This is a separate trade-off and may or may not fit your constraints.
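The payoff shows up in the wheel tags: a cp312-cp312 wheel matches exactly one interpreter, while a cp312-abi3 wheel matches 3.12 and everything newer. The matching rule, sketched with a hypothetical helper:

```python
def wheel_matches(abi_tag, built_minor, target_minor):
    """Per-version wheels (cp3X-cp3X) match only that minor release;
    stable-ABI wheels (cp3X-abi3) match that release and later ones."""
    if abi_tag == "abi3":
        return target_minor >= built_minor
    return target_minor == built_minor
```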
Why this matters
Following the manylinux and auditwheel path keeps you aligned with the packaging standards the wider Python ecosystem depends on. Your wheels will install cleanly, pip will know exactly when to pick them, and users on different distros will not chase missing system libraries. Most importantly, you avoid the pitfalls of static glibc and the unpredictable failures that come with it.
Putting it all together
Build wheels inside the manylinux_2_28 container using the Pythons under /opt/python for each ABI you need. Run auditwheel repair to validate that you only depend on allowed libraries and to stamp the correct platform tag. If you must support significantly older systems, repeat the process in manylinux2014. Keep third-party libraries static if you prefer, but do not statically link glibc. If you are unsure about distro coverage, check glibc baselines on DistroWatch and choose the appropriate manylinux image.
Conclusion
On Linux, the portable way to ship a pybind11 C++ extension is a manylinux-compliant wheel, not a fully static .so. Use the manylinux_2_28 toolchain, build with the CPython versions under /opt/python, and finish with auditwheel repair. Reach for manylinux2014 when you must cover older fleets. Avoid static glibc, lean on wheel tags for compatibility, and you will have a distribution pipeline that installs cleanly across Debian, CentOS and beyond.