uv for fast wheels
11 Aug 2025
To build or not to build?
You have a Python package. It solves a technical or business problem. You have decided to share it with the open-source community. Okay: just set up GitHub Actions and publish it to PyPI. Periodically, you cut a new version and publish it. Everybody is happy – developers, consumers, and the community.
Then you suddenly discover that Python is slow. It is perfectly fine for many use cases, but yours is doing CPU-intensive work. You think it’s worth shifting that work from the interpreted language to a native extension written in C, C++ or Rust. There are a few different approaches to doing this – Cython, CFFI, pure low-level CPython extensions. CFFI is the most widely used. You need to write native code in C, build it somehow into a library, load the library from Python and call it while adhering to calling conventions and ownership semantics.
The problem is how to build and package a binary distribution. Different projects go in different directions; here is my experience trying and using them.
Poetry
Poetry prefers having explicit build scripts, and the Poetry build backend is aware of them.
[build-system]
requires = ["poetry-core", "setuptools"]
build-backend = "poetry.core.masonry.api"
[tool.poetry.build]
script = "scripts/build.py"
The build script is a simple Python module invoking setuptools.command.build_ext together with cffi or cython. The downside of this approach is that it is still unofficial and unstable, and the behaviour changed with Poetry 2.
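For illustration, here is a minimal sketch of what such a script could look like, assuming the unofficial Poetry 1.x convention of a build(setup_kwargs) hook; the module name and the inline C snippet are hypothetical, not from a real project:

from cffi import FFI

# Hypothetical inline C source; a real project would compile its own sources.
ffibuilder = FFI()
ffibuilder.cdef("int add(int a, int b);")
ffibuilder.set_source(
    "_native",  # hypothetical extension module name
    "int add(int a, int b) { return a + b; }",
)

def build(setup_kwargs):
    # Poetry 1.x merges these kwargs into the setuptools call it generates.
    setup_kwargs["cffi_modules"] = ["scripts/build.py:ffibuilder"]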
Setuptools
With good old setuptools, it is easy to use its official, supported integrations with ext_modules – via cythonize or an ffibuilder – using the following setup.py:
from setuptools import setup

setup(
setup_requires=["cffi>=1.0.0"],
cffi_modules=["scripts/build.py:ffibuilder"],
install_requires=["cffi>=1.0.0"],
)
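With that in place, any standard PEP 517 build frontend compiles the extension while producing the wheel, for example:

python -m build --wheel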
It worked like a charm for decades, but now it’s outdated. Even with dependency control via pip freeze, you will run into many well-known Python packaging problems.
uv
What if you want to combine a modern packaging tool like uv with binary distributions? uv uses pyproject.toml for project configuration. The main benefit of the pyproject format is the standardisation of storing metadata for a project, its build backend and external tools. It’s a big step forward from the imperative setup.py. It allows not only storing metadata, but also simplifying tools and significantly improving performance, because dependency resolution no longer requires evaluating Python code in setup.py.
Of course, all the dependencies – build or runtime – can be specified in the same file. uv has a great build backend, uv_build (used by default) – extremely fast (because, you know, it’s in Rust ⚡) and reliable. It fully supports the pyproject.toml standard, including the standard project table.
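For a pure-Python project, the whole configuration can be as small as this (a minimal sketch; the project name and dependencies are placeholders):

[project]
name = "mypackage"
version = "0.1.0"
dependencies = ["numpy"]

[build-system]
requires = ["uv_build"]
build-backend = "uv_build"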
Binary distributions and uv
Unfortunately, this uv_build backend only supports pure Python. For binary extensions, the easiest way to overcome this limitation is to use a setuptools build backend instead.
[build-system]
requires = ["setuptools>=61", "cffi"]
build-backend = "setuptools.build_meta"
setuptools itself supports multiple configuration formats perfectly well – TOML:
[project]
dependencies = [ "numpy", "h3" ]
or setup.py:
setup(install_requires=[ "numpy", "h3" ])
But that doesn’t hold for all tools: each integrated tool decides for itself where to get its data from.
For example, CFFI doesn’t support declarative pyproject.toml as a data source for cffi_modules, which is sad, because you have to keep a setup.py file around, albeit a simple one:
from setuptools import setup
setup(cffi_modules=["scripts/build.py:ffibuilder"])
Okay, now we’re back to setup.py. Poetry 1.x was able to generate a setup.py with build instructions under the hood, so this approach is more or less familiar.
The most challenging part is configuring package discovery and file packaging properly. Here’s how we did it in the open-source timezonefinder package with @jannikm. It has a flat layout. A flat layout can bring unnecessary files into the package compared to the src layout; that can be fixed with the where = ["src"] directive. If you have a flat layout but still follow standard naming conventions, automatic package discovery can work without any additional tweaks: it excludes typical directories like tests, examples, etc. To include new files, like built binaries in the module timezonefinder/inside_poly_extension, you have to list the packages explicitly in the packages directive:
[tool.setuptools]
packages = ["tests", "timezonefinder", "timezonefinder.inside_poly_extension"]
[tool.setuptools.package-data]
timezonefinder = ["**/*.json", "**/*.c", "**/*.h", ]
Since automatic package discovery is used, the build.py script cannot be stored at the root of the repository (it is on the exclusion list). Placing the script inside the timezonefinder package eliminates this problem. Also, in this example the tests are packaged too, because we need them in the distribution. The build script itself is simple and compiles a dynamic library from the C sources.
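As a rough illustration (a sketch only; the C file names, header and exported signature below are made up, not timezonefinder’s actual ones), such a script is essentially a plain CFFI ffibuilder:

from cffi import FFI

ffibuilder = FFI()

# Declarations callable from Python (hypothetical signature).
ffibuilder.cdef("int point_in_polygon(double x, double y);")

# Compile the C sources into a dynamic library importable as a submodule.
ffibuilder.set_source(
    "timezonefinder.inside_poly_extension._polygon",  # hypothetical module path
    '#include "polygon.h"',
    sources=["timezonefinder/inside_poly_extension/polygon.c"],
    include_dirs=["timezonefinder/inside_poly_extension"],
)

if __name__ == "__main__":
    # Handy for compiling the extension ad hoc, outside a wheel build.
    ffibuilder.compile(verbose=True)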
Essentially, the whole process of building the package boils down to either
uv build --sdist
or
uv build --wheel
uv and CI
uv also simplifies CI setup for many projects.
If a project is pure Python, nothing special is required: run tests, run a linter, make an sdist and upload it to PyPI. uv can streamline these commands and simplify tool installation.
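For example, the entire pure-Python pipeline could be reduced to a handful of uv invocations (a sketch; the choice of pytest and ruff is my assumption, not something the setup prescribes):

uv sync              # install the project and its dev dependencies
uv run ruff check .  # lint
uv run pytest        # run the test suite
uv build --sdist     # build the source distribution
uv publish           # upload to PyPI (credentials required)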
Binary packaging complicates the CI build. Now the project has multiple binary targets. Possible dimensions are cp39, cp310, cp311 for Python versions, and musllinux or manylinux for ABI targets. There are two approaches to this.
First, the cibuildwheel tool. It manages the whole process automatically within a single GitHub Actions step by creating Docker containers for all permutations of the specified Python versions and architectures. Inside each container, it runs the specified build frontend (pip in the example below) and exports the produced wheels.
- name: "Build wheels"
  uses: pypa/[email protected]
  with:
    output-dir: dist
  env:
    CIBW_MANYLINUX_X86_64_IMAGE: quay.io/pypa/manylinux2014_x86_64:2025.03.09-1
    CIBW_MUSLLINUX_X86_64_IMAGE: quay.io/pypa/musllinux_1_1_x86_64:2024.10.26-1
    CIBW_BUILD: "cp39-* cp310-* cp311-* cp312-* cp313-*"
    CIBW_BUILD_FRONTEND: pip
    CIBW_BEFORE_ALL_LINUX_MANYLINUX2014: yum install -y libffi-devel clang make
    CIBW_BEFORE_ALL_LINUX_MUSLLINUX_1_1: apk add --no-cache libffi-dev clang make
The second approach is to utilise a matrix build. uv can manage Python versions by itself, downloading prebuilt CPython binaries and executing scripts in their context. So the GitHub Actions build becomes even simpler:
strategy:
  matrix:
    python-version:
      - "cp310"
      - "cp311"
      - "cp312"
steps:
  - uses: actions/checkout@v4
  - name: Make wheel
    run: uv build --python ${{ matrix.python-version }} --wheel
Of course, if you want to achieve manylinux or abi3 compatibility, you’ll need to use cibuildwheel in combination with uv.
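cibuildwheel can drive uv through its build frontend, so the combination could look roughly like this (a sketch; the build selection is illustrative):

- name: "Build wheels"
  uses: pypa/[email protected]
  with:
    output-dir: dist
  env:
    CIBW_BUILD: "cp310-* cp311-* cp312-*"
    CIBW_BUILD_FRONTEND: "build[uv]"  # delegate builds and installs to uv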
You could also use the GitHub action astral-sh/setup-uv@v6, but the example above is the shortest version that demonstrates the flow.
Conclusion
uv provides an excellent experience when working with modern Python packaging. Lightning-fast dependency resolution and tool ergonomics are the main benefits.
Binary distribution is not a first-class citizen in uv and requires a fallback to the setuptools build backend, while still using uv as a frontend and keeping its ergonomics.
The problem of standardised binary distribution building is not yet solved by any tool. Approaching this could be the next big step for the Python ecosystem, especially as Python is increasingly accelerated with native code.