The problem was setup.py. You couldn’t know a package’s dependencies without running its setup script. But you couldn’t run its setup script without installing its build dependencies. PEP 518 in 2016 called this out explicitly: “You can’t execute a setup.py file without knowing its dependencies, but currently there is no standard way to know what those dependencies are in an automated fashion without executing the setup.py file.”
This chicken-and-egg problem forced pip to download packages, execute untrusted code, fail, install missing build tools, and try again.
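PEP 518’s fix was a static [build-system] table in pyproject.toml that declares build dependencies before any code runs (the build-backend key arrived with PEP 517). A minimal example, with illustrative version pins:

```toml
[build-system]
requires = ["setuptools>=61", "wheel"]   # readable without executing anything
build-backend = "setuptools.build_meta"  # PEP 517 build entry point
```

An installer can read this table with an ordinary TOML parser, install the build tools, and only then run the build.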
PEP 658 (putting package metadata directly in the Simple Repository API) went live on PyPI in May 2023. uv launched in February 2024. uv could be fast because the ecosystem finally had the infrastructure to support it. A tool like uv couldn’t have shipped in 2020; the standards weren’t there yet.
Wheel files are zip archives, and zip archives put their file listing (the central directory) at the end, which means a range request for just the tail of the file can read the listing without downloading the whole wheel. uv tries PEP 658 metadata first, falls back to an HTTP range request for the zip central directory, then a full wheel download, then building from source. Each fallback is slower and riskier than the one before it. The design makes the fast path cover 99% of cases.
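Here is a sketch of the middle fallback in Rust, assuming the host honors HTTP range requests; the reqwest usage, the 64 KiB tail size, and the function name are illustrative rather than uv’s actual code:

```rust
// Cargo.toml (assumed): reqwest = { version = "0.12", features = ["blocking"] }
use reqwest::blocking::Client;
use reqwest::header::RANGE;

/// End-of-central-directory signature that terminates every zip archive.
const EOCD_MAGIC: &[u8] = b"PK\x05\x06";

fn fetch_wheel_listing_tail(url: &str) -> Result<Vec<u8>, Box<dyn std::error::Error>> {
    // Ask for only the last 64 KiB: enough for the EOCD record and, in
    // typical wheels, the entire central directory (the file listing).
    let tail = Client::new()
        .get(url)
        .header(RANGE, "bytes=-65536")
        .send()?
        .error_for_status()?
        .bytes()?;

    // Scan backwards for the EOCD signature; the central directory (and
    // from it, the offset of the METADATA entry) is located from there.
    if tail.windows(4).rposition(|w| w == EOCD_MAGIC).is_none() {
        return Err("no EOCD in tail; fall back to a full wheel download".into());
    }
    Ok(tail.to_vec())
}
```

Fetching tens of kilobytes instead of a multi-megabyte wheel is the entire trick; when it fails, the caller drops to the next, slower fallback.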
Some of uv’s speed comes from Rust. But not as much as you’d think.
pip copies packages into each virtual environment. uv keeps one copy of each package in a global cache and hardlinks it into environments (or uses copy-on-write clones on filesystems that support them).
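The mechanism fits in a few lines of std-only Rust. This is a simplified illustration, not uv’s implementation (uv also uses reflink-style copy-on-write clones where the filesystem supports them):

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Materialize a cached file into an environment: a hard link is instant
/// and costs no extra disk space; a plain copy is the portable fallback
/// when the cache and the environment live on different filesystems.
fn link_or_copy(cached: &Path, dest: &Path) -> io::Result<()> {
    match fs::hard_link(cached, dest) {
        Ok(()) => Ok(()),
        // Cross-device links (and some filesystems) fail here; fall back.
        Err(_) => fs::copy(cached, dest).map(|_| ()),
    }
}
```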
uv parses TOML and wheel metadata natively in Rust, spawning a Python interpreter only when it hits a legacy setup.py-only package that leaves no other option.
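A sketch of that decision, assuming serde-based TOML parsing; the struct names and the egg_info fallback command are illustrative:

```rust
// Cargo.toml (assumed): toml = "0.8", serde = { version = "1", features = ["derive"] }
use serde::Deserialize;
use std::process::Command;

#[derive(Deserialize)]
struct PyProject {
    project: Option<Project>,
}

#[derive(Deserialize)]
struct Project {
    #[serde(default)]
    dependencies: Vec<String>,
}

/// Prefer static metadata; shell out to an interpreter only as a last resort.
fn resolve_dependencies(pyproject_src: Option<&str>) -> Vec<String> {
    if let Some(src) = pyproject_src {
        if let Ok(doc) = toml::from_str::<PyProject>(src) {
            if let Some(project) = doc.project {
                // PEP 621 metadata: fully readable without running Python.
                return project.dependencies;
            }
        }
    }
    // Legacy setup.py-only package: spawning Python is unavoidable.
    // (pip’s classic equivalent was `python setup.py egg_info`.)
    let _ = Command::new("python").args(["setup.py", "egg_info"]).status();
    Vec::new() // a real implementation would read the generated metadata
}
```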
Where Rust actually matters
uv uses rkyv to deserialize cached data without copying it: the on-disk format is the in-memory format, so reading a cache entry is closer to a pointer cast than a parse. It’s a technique that Rust’s type system makes safe and practical.
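A minimal sketch of the pattern against rkyv’s 0.7 API; the cache struct is invented for illustration and is not uv’s actual type:

```rust
// Cargo.toml (assumed): rkyv = { version = "0.7", features = ["validation"] }
use rkyv::{Archive, Deserialize, Serialize};

#[derive(Archive, Serialize, Deserialize)]
#[archive(check_bytes)] // lets the archived bytes be validated before use
struct CachedMetadata {
    name: String,
    requires_dist: Vec<String>,
}

fn main() {
    let meta = CachedMetadata {
        name: "example".into(),
        requires_dist: vec!["requests>=2".into()],
    };

    // Serialize once: the bytes are laid out exactly like the archived
    // in-memory representation, so they can go straight to disk.
    let bytes = rkyv::to_bytes::<_, 256>(&meta).expect("serialize");

    // "Load" later: validate the buffer and reinterpret it in place.
    // No per-field parsing, no allocation, no copy of the string data.
    let archived = rkyv::check_archived_root::<CachedMetadata>(&bytes[..])
        .expect("validate");
    assert_eq!(archived.name.as_str(), "example");
}
```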
Rust’s ownership model makes concurrent access safe without locks, and the compiler checks it at build time. Python’s GIL, by contrast, serializes threads, so that kind of lock-free parallelism is hard to come by. […]
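A toy illustration of what the compiler enforces, not uv’s code: four threads read one shared index with no mutex, and the borrow checker guarantees none of them can mutate it.

```rust
use std::sync::Arc;
use std::thread;

fn main() {
    // A shared, immutable package index: many readers, no writers.
    let index: Arc<Vec<&str>> = Arc::new(vec!["requests", "flask", "numpy"]);

    let handles: Vec<_> = (0..4usize)
        .map(|worker| {
            let index = Arc::clone(&index);
            thread::spawn(move || {
                // Reads need no lock: the type system proves that nothing
                // mutates `index` while these threads are alive.
                index.iter().filter(|name| name.len() > worker).count()
            })
        })
        .collect();

    for handle in handles {
        println!("matches: {}", handle.join().unwrap());
    }
}
```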
uv is fast because of what it doesn’t do, not because of what language it’s written in. The standards work of PEP 518, 517, 621, and 658 made fast package management possible. Dropping eggs, pip.conf, and permissive parsing made it achievable. Rust makes it a bit faster still.
pip could implement parallel downloads, global caching, and metadata-only resolution tomorrow. It doesn’t, largely because backwards compatibility with fifteen years of edge cases takes precedence. That choice means pip will always be slower than a tool that starts fresh with modern assumptions.