March 26, 2025

How UV Solves The Hard Problem of Python’s Package Management

What is the article about?

Python has become the de facto standard for machine learning, largely because of its easily accessible programming interface. Paradoxically, when it comes to packaging and shipping of the final code itself, Python’s package management becomes anything but simple. While multiple approaches have been tried to simplify this, our experience with the recent UV package manager is more than promising. This article showcases why to pick UV for organizing dependencies of your Python project.

Why is Python's package management difficult?

The famous xkcd comic "Standards" is a strikingly accurate description of the state of Python packaging. While the credo of the Python language itself is "there should be one obvious way to do it", nothing is further from this ideal than the myriad of ways to organize the project layout of Python software.

Here are a few examples to illustrate:

  • There is a flat layout and a src layout.
  • You can organize code as a Python package or as a pile of scripts.
  • Package definition can go into a setup.py or a new pyproject.toml.
  • In addition, the Python Packaging Authority (PyPA) does not enforce a single standard for packaging. Instead, it is a democratic standard with multiple build backend and frontend implementations – hatch, setuptools, pdm, to name a few.
  • PyPA also does not validate that packages uploaded to its public repository (PyPI) actually install the transitive dependencies their metadata claims.

Because of the historically accumulated options mentioned above, solving package management in Python is a hard problem, NP-hard in the general case. Package managers are forced either to develop heuristics about where to scan for dependencies, or to actually download large binaries before they can resolve the full dependency tree.

What to expect from modern packaging solutions?

Recently we went through the process of standardizing several Python machine learning projects. Our goal was to arrive at the well-established packaging standards known from other programming communities such as Node, Go, Ruby or Rust:

  • Install multiple Pythons and use a dedicated Python version per project.
  • Isolate 3rd party packages per project.
  • Organize code under a top importable module namespace.
  • Use a deterministic locking solution.
  • Install the same versions of transitive dependencies on different platforms – e.g. Apple Silicon arm64 for development and Linux amd64 for production.
  • Enable tool-assisted dependency upgrades through the use of semantic versioning policies.
  • Automate dependency updates and CVE scans of all transitive dependencies.

Currently we still use a mix of pip, venv, pip-compile, direnv and the system package manager to install the actual Pythons. Apart from the drawback that everyone in the team has to learn how to operate this mix of tools, there is also the downside of not having a single cross-platform lockfile.

The new package manager "UV" shows promise in tackling tooling proliferation, streamlining workflows, and simplifying dependency management for developers. This article examines how UV achieves these benefits.

Why UV and not X?

Here is a short list of key selling points for UV:

  • UV has an efficient dependency solving algorithm written in Rust to address the NP-hard resolution problem.
  • UV unites installing Pythons, isolating dependencies (virtual environments), installing packages (pip) and activating the right environment per project (venv activate); see the sketch after this list.
  • UV supports cross-platform deterministic lockfiles, allowing the same versions of transitive dependencies to be installed on different OS and CPU architectures.
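
To put the consolidation point in perspective, here is a rough, purely illustrative sketch of the manual tool chain a single uv workflow replaces (pyenv is only one of several ways to install Pythons):

pyenv install 3.12               # or install Python via the system package manager
python3.12 -m venv .venv         # create an isolated environment for the project
source .venv/bin/activate        # activate it for the current shell
pip install -r requirements.txt  # install the pinned third-party packages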

A brief list of drawbacks of other tools:

Pip + venv + pip-tools

  • Requires everyone in the team to install and learn a mix of tools.
  • Requires manual management of virtual environments and activation of each environment when switching between projects.
  • The development workflow is not enforced in the tooling and must be established as a team best practice.
  • Lockfiles are not cross-platform and can only be created for the platform pip-tools was run on. For example, a full requirements.txt with all transitive dependencies created on Apple Silicon may contain packages that are not installable on Linux amd64 and will cause failed builds (see the sketch after this list).
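
In practice this means the resolution has to be repeated on every target platform, each run producing its own lockfile. A rough sketch with illustrative file names (assuming a pip-tools version that can read pyproject.toml):

pip-compile pyproject.toml -o requirements-macos-arm64.txt   # run on an Apple Silicon machine
pip-compile pyproject.toml -o requirements-linux-amd64.txt   # run on a Linux amd64 machine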

Poetry

  • May become slow, or even unusable, when the space of possible resolutions gets large.
  • May install wrong dependencies if package authors do not declare metadata on PyPI that is consistent with the dependencies a package actually installs.

Conda

  • Uses its own packaging format and repositories, which have fewer packages than PyPI (pip).
  • To address the above, ships a pip interoperability solution, which in practice often leads to accidentally overwriting Conda transitive dependencies with pip, causing broken builds.
  • Has neither cross-platform nor deterministic locking that works together with a semver policy and allows minor version upgrades (a tool addressing this in the Conda world is Mamba).

How to organize a Python project with UV?

UV manages the entire Python development workflow and abstracts away the management of the virtual environment by running Python commands through a UV wrapper command.

To initialize a project named "uv-light", run:

uv python install 3.12
uv init uv-light --python 3.12


Add the following project dependencies:
uv add flask pandas pyarrow

The above commands create a minimal package directory layout with the recent PEP 621 standard for project configuration in a pyproject.toml file. PEP 508 dependency specifiers, commonly known as the requirements.txt spec, are added to the dependencies section: "flask>=3.0.3", "pandas>=2.2.1", "pyarrow>=17.0.0".

A good practice is to change the above >= dependency specs to a stricter == X.* semver policy that pins the package major versions. This still allows a later upgrade of all minor versions across the full transitive dependency tree. Such an upgrade brings each package to the latest version that, by the semver convention, does not break the compatibility of the build.

Resulting pyproject.toml configuration:

[project]
name = "uv-light"
# ...
requires-python = ">=3.12"
dependencies = [
"flask==3.*",
"pandas==2.*", # Jede Pandas 2, aber nicht 3
"pyarrow==17.*",
]


The actual code can now be organized in the directory layout below:

./uv-light
├── uv_light
│   ├── __init__.py
│   ├── lens.py
│   └── beam.py
├── main.py
├── pyproject.toml
└── uv.lock


The above directory layout allows imports from the uv_light package:

# main.py
from uv_light.lens import Lens
from uv_light.beam import Beam
Beam().project_on(Lens())


To run the uv_light main file within its virtual environment, use the uv run command:

uv run main.py

How to share and deploy code?

Another team member can now check out the project and simply run the uv run command above. UV will take care of installing the exact same Python and package versions that were used to create this program. To achieve this, UV generates a cross-platform lockfile uv.lock, which is checked in to the source code repository.
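
A rough sketch of what this onboarding looks like (the repository URL is a placeholder):

git clone https://example.com/uv-light.git
cd uv-light
uv run main.py   # on first run, uv installs the pinned Python, creates the virtual environment and syncs uv.lock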

The exact same strategy can be used for production deployment, provided UV is installed in the production environment, e.g. within a Docker container. Some cloud services may still require a requirements.txt dependency spec instead. For this use case, UV offers the uv export command, which produces a pip-compatible requirements.txt from the lockfile, so deployment targets that cannot consume uv.lock directly can still be served.
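
As a rough sketch, the steps inside such a container image build could look like the following, assuming the official uv install script; the flags pin the installation to the committed lockfile and skip development dependencies:

curl -LsSf https://astral.sh/uv/install.sh | sh   # install uv inside the image (add it to PATH as needed)
uv sync --frozen --no-dev                         # install exactly the versions recorded in uv.lock
uv run main.py                                    # start the application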

To create a requirements.txt listing all transitive dependencies, run:

uv export --no-hashes -o requirements.txt

How to maintain package updates?

UV can upgrade all packages within the policy defined in pyproject.toml. This allows you to pursue a backwards-compatible upgrade strategy: the version ranges of the directly included packages are respected, and all transitive dependencies are upgraded to the latest versions that are compatible with the directly included packages and with each other.

To perform such an upgrade, run:

uv lock --upgrade

In the example used in this post, pandas will stay at 2, flask at 3 and pyarrow at 17. The following would be a semver-compatible update:

Updated flask v3.0.3 -> v3.1.0
Updated pandas v2.2.1 -> v2.2.3
Updated numpy v1.26.4 -> v2.1.3  # transitive
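
To limit an upgrade to a single package instead of the whole tree, uv lock also accepts a per-package flag; for example:

uv lock --upgrade-package flask   # only upgrade flask (plus whatever its new version requires)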

When you are ready to do a major update, patch the code affected by the breaking API change and manually bump the major version in the pyproject.toml dependencies section.

Afterwards run uv lock again. This will result in the following change:

--- a/pyproject.toml
+++ b/pyproject.toml
dependencies = [
"flask==3.*",
"pandas==2.*",
- "pyarrow==17.*",
+ "pyarrow==18.*",
]

--- a/uv.lock
+++ b/uv.lock
[[package]]
name = "pyarrow"
-version = "17.0.0"
+version = "18.0.0"

How to automate dependency updates?

It is considered best practice to regularly update dependencies in order to:

  • avoid being exposed to known vulnerabilities,
  • limit incompatibilities between dependencies,
  • avoid complex upgrades from heavily outdated versions.

A widely adopted dependency update solution is Dependabot. Unfortunately, at the time of writing, Dependabot does not support uv out of the box. The progress can be tracked here. GitHub has officially committed to adding first-class uv support to Dependabot.

For early adopters of UV who need Dependabot to work, a pip-compile workflow can be used in the same way as described above for generating the older requirements.txt spec. The pip-compile workflow has first-class support in Dependabot and expects the same project layout as generated by UV: a pyproject.toml with a dependencies section and a requirements.txt file with locked dependencies.

Dependabot recognizes a pip-compile setup by a comment within the generated requirements.txt file. Change the comment generated by UV to the following pip-compile comment:

#
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
# pip-compile pyproject.toml
#


Add a respective .github/dependabot.yml config to the project:

version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"   # required by Dependabot; adjust if the manifests live elsewhere
    schedule:
      interval: "daily"
    groups:
      patches:
        update-types:
          - "minor"
          - "patch"
    open-pull-requests-limit: 100


Dependabot will start creating pull requests with package updates in requirements.txt and pyproject.toml. The above config groups non-breaking compatibility updates into a single pull request. It will also trigger GitHub to scan all dependencies listed in requirements.txt for CVEs. Keeping requirements.txt in sync with uv.lock can also be automated on CI. See examples at uv-sync.sh and workflows/push.yml.
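
Those files are not reproduced here, but a minimal sketch of what such a sync script could do, assuming the file name uv-sync.sh from above, is:

#!/usr/bin/env bash
# Hypothetical uv-sync.sh: regenerate requirements.txt from uv.lock and fail if it drifted.
set -euo pipefail
uv export --no-hashes -o requirements.txt
git diff --exit-code requirements.txt   # non-zero exit if the committed file is out of date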

Summary

Python has historically come with different project packaging options. Third-party vendors have tried to solve Python packaging, each focusing on different aspects, resulting in a fragmented ecosystem. UV is a recent promising attempt to bring modern packaging to Python. It solves cross-platform deterministic locking and automatic isolated environment management, allowing a homogeneous environment to be shared across different machines for development and deployment.

Want to see a full example repository using the concepts outlined in this article? You can find one here.


Written by

Andreas Winschu
Data Engineer
