jazzband/pip-tools

pip-tools = pip-compile + pip-sync

A set of command line tools to help you keep your pip-based packages fresh, even when you've pinned them. You do pin them, right? (In building your Python application and its dependencies for production, you want to make sure that your builds are predictable and deterministic.)

[Diagram: pip-tools overview for phase II]

Installation

Similar to pip, pip-tools must be installed in each of your project's virtual environments:

$ source /path/to/venv/bin/activate
(venv) $ python -m pip install pip-tools

Note: all of the remaining example commands assume you've activated your project's virtual environment.

Example usage for pip-compile

The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in either pyproject.toml, setup.cfg, setup.py, or requirements.in.

Run it with pip-compile or python -m piptools compile (or pipx run --spec pip-tools pip-compile if pipx was installed with the appropriate Python version). If you use multiple Python versions, you can also run py -X.Y -m piptools compile on Windows and pythonX.Y -m piptools compile on other systems.

pip-compile should be run from the same virtual environment as your project so that conditional dependencies that require a specific Python version, or other environment markers, resolve relative to your project's environment.

Note: If pip-compile finds an existing requirements.txt file that fulfils the dependencies then no changes will be made, even if updates are available. To compile from scratch, first delete the existing requirements.txt file, or see Updating requirements for alternative approaches.

Requirements from pyproject.toml

The pyproject.toml file is the latest standard for configuring packages and applications, and is recommended for new projects. pip-compile supports installing both your project.dependencies and your project.optional-dependencies. Thanks to this being an official standard, you can use pip-compile to pin the dependencies in projects that use modern standards-adhering packaging tools such as Setuptools, Hatch or flit.

Suppose you have a 'foobar' Python application that is packaged using Setuptools, and you want to pin it for production. You can declare the project metadata as:

[build-system]
requires = ["setuptools", "setuptools-scm"]
build-backend = "setuptools.build_meta"

[project]
requires-python = ">=3.9"
name = "foobar"
dynamic = ["dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements.in"] }
optional-dependencies.test = { file = ["requirements-test.txt"] }

Now suppose you have a Django application that is packaged using Hatch, and you want to pin it for production. You also want to pin your development tools in a separate pin file. You declare django as a dependency and create an optional dependency dev that includes pytest:

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "my-cool-django-app"
version = "42"
dependencies = ["django"]

[project.optional-dependencies]
dev = ["pytest"]

You can produce your pin files as easily as:

$ pip-compile -o requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --output-file=requirements.txt pyproject.toml
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
    # via django

$ pip-compile --extra dev -o dev-requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --extra=dev --output-file=dev-requirements.txt pyproject.toml
#
asgiref==3.6.0
    # via django
attrs==22.2.0
    # via pytest
django==4.1.7
    # via my-cool-django-app (pyproject.toml)
exceptiongroup==1.1.1
    # via pytest
iniconfig==2.0.0
    # via pytest
packaging==23.0
    # via pytest
pluggy==1.0.0
    # via pytest
pytest==7.2.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
    # via django
tomli==2.0.1
    # via pytest

This is great both for pinning your applications and for keeping the CI of your open-source Python package stable.

Requirements from setup.py and setup.cfg

pip-compile also has full support for setup.py- and setup.cfg-based projects that use setuptools.

Just define your dependencies and extras as usual and run pip-compile as above.
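For instance, a setup.cfg-based equivalent of the Django/pytest example above might look like the following minimal sketch (the package name and dependency choices are illustrative, carried over from the earlier example):

```ini
[metadata]
name = my-cool-django-app
version = 42

[options]
install_requires =
    django

[options.extras_require]
dev =
    pytest
```

Running pip-compile in that directory then pins django, and pip-compile --extra dev additionally pins pytest, just as with pyproject.toml.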

Requirements from requirements.in

You can also use plain text files for your requirements (e.g. if you don't want your application to be a package). To use a requirements.in file to declare the Django dependency:

# requirements.in
django

Now, run pip-compile requirements.in:

$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile requirements.in
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via -r requirements.in
sqlparse==0.4.3
    # via django

And it will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.

Updating requirements

pip-compile generates a requirements.txt file using the latest versions that fulfil the dependencies you specify in the supported files.

If pip-compile finds an existing requirements.txt file that fulfils the dependencies then no changes will be made, even if updates are available.

To force pip-compile to update all packages in an existing requirements.txt, run pip-compile --upgrade.

To update a specific package to the latest or a specific version, use the --upgrade-package or -P flag:

# only update the django package
$ pip-compile --upgrade-package django

# update both the django and requests packages
$ pip-compile --upgrade-package django --upgrade-package requests

# update the django package to the latest, and requests to v2.0.0
$ pip-compile --upgrade-package django --upgrade-package requests==2.0.0

You can combine --upgrade and --upgrade-package in one command, to provide constraints on the allowed upgrades. For example, to upgrade all packages whilst constraining requests to the latest version less than 3.0:

$ pip-compile --upgrade --upgrade-package 'requests<3.0'

Using hashes

If you would like to use Hash-Checking Mode, available in pip since version 8.0, pip-compile offers the --generate-hashes flag:

$ pip-compile --generate-hashes requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --generate-hashes requirements.in
#
asgiref==3.6.0 \
    --hash=sha256:71e68008da809b957b7ee4b43dbccff33d1b23519fb8344e33f049897077afac \
    --hash=sha256:9567dfe7bd8d3c8c892227827c41cce860b368104c3431da67a0c5a65a949506
    # via django
django==4.1.7 \
    --hash=sha256:44f714b81c5f190d9d2ddad01a532fe502fa01c4cb8faf1d081f4264ed15dcd8 \
    --hash=sha256:f2f431e75adc40039ace496ad3b9f17227022e8b11566f4b363da44c7e44761e
    # via -r requirements.in
sqlparse==0.4.3 \
    --hash=sha256:0323c0ec29cd52bceabc1b4d9d579e311f3e4961b98d174201d5622a23b85e34 \
    --hash=sha256:69ca804846bb114d2ec380e4360a8a340db83f0ccf3afceeb1404df028f57268
    # via django

Output File

To output the pinned requirements in a filename other than requirements.txt, use --output-file. This might be useful for compiling multiple files, for example with different constraints on django to test a library with both versions using tox:

$ pip-compile --upgrade-package 'django<1.0' --output-file requirements-django0x.txt
$ pip-compile --upgrade-package 'django<2.0' --output-file requirements-django1x.txt

Or to output to standard output, use --output-file=-:

$ pip-compile --output-file=- > requirements.txt
$ pip-compile - --output-file=- < requirements.in > requirements.txt

Forwarding options to pip

Any valid pip flags or arguments may be passed on with pip-compile's --pip-args option, e.g.

$ pip-compile requirements.in --pip-args "--retries 10 --timeout 30"

Configuration

You can define project-level defaults for pip-compile and pip-sync by writing them to a configuration file in the same directory as your requirements input files (or the current working directory if piping input from stdin). By default, both pip-compile and pip-sync will look first for a .pip-tools.toml file and then in your pyproject.toml. You can also specify an alternate TOML configuration file with the --config option.

It is possible to specify configuration values both globally and per command. For example, to generate pip hashes in the resulting requirements file output by default, you can specify in a configuration file:

[tool.pip-tools]
generate-hashes = true

Options to pip-compile and pip-sync that may be used more than once must be defined as lists in a configuration file, even if they only have one value.
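For example, the --extra option can be given multiple times on the command line, so its configuration key must be a list, even for a single extra (a sketch; the extra names are illustrative):

```toml
[tool.pip-tools]
# equivalent to: pip-compile --extra dev --extra test
extra = ["dev", "test"]
```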

pip-tools supports default values for all valid command-line flags of its subcommands. Configuration keys may contain underscores instead of dashes, so the above could also be specified in this format:

[tool.pip-tools]
generate_hashes = true

Configuration defaults specific to pip-compile and pip-sync can be put beneath separate sections. For example, to perform a dry-run with pip-compile by default:

[tool.pip-tools.compile]  # "sync" for pip-sync
dry-run = true

This does not affect the pip-sync command, which also has a --dry-run option. Note that local settings take preference over global ones of the same name whenever both are declared; thus the following would also make pip-compile generate hashes, but discard the global dry-run setting:

[tool.pip-tools]
generate-hashes = true
dry-run = true

[tool.pip-tools.compile]
dry-run = false

You might be wrapping the pip-compile command in another script. To avoid confusing consumers of your custom script, you can override the update command generated at the top of requirements files by setting the CUSTOM_COMPILE_COMMAND environment variable.

$ CUSTOM_COMPILE_COMMAND="./pipcompilewrapper" pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    ./pipcompilewrapper
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via -r requirements.in
sqlparse==0.4.3
    # via django

Workflow for layered requirements

If you have different environments that you need to install different but compatible packages for, then you can create layered requirements files and use one layer to constrain the other.

For example, if you have a Django project where you want the newest 2.1 release in production, and when developing you want to use the Django debug toolbar, then you can create two *.in files, one for each layer:

# requirements.in
django<2.2

At the top of the development requirements dev-requirements.in you use -c requirements.txt to constrain the dev requirements to packages already selected for production in requirements.txt.

# dev-requirements.in
-c requirements.txt
django-debug-toolbar<2.2

First, compile requirements.txt as usual:

$ pip-compile
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile
#
django==2.1.15
    # via -r requirements.in
pytz==2023.3
    # via django

Now compile the dev requirements; the requirements.txt file is used as a constraint:

$ pip-compile dev-requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile dev-requirements.in
#
django==2.1.15
    # via
    #   -c requirements.txt
    #   django-debug-toolbar
django-debug-toolbar==2.1
    # via -r dev-requirements.in
pytz==2023.3
    # via
    #   -c requirements.txt
    #   django
sqlparse==0.4.3
    # via django-debug-toolbar

As you can see above, even though a 2.2 release of Django is available, the dev requirements only include a 2.1 version of Django because they were constrained. Now both compiled requirements files can be installed safely in the dev environment.

To install requirements in production, use:

$ pip-sync

To install requirements in development, use:

$ pip-sync requirements.txt dev-requirements.txt

Version control integration

You might use pip-compile as a pre-commit hook. See the pre-commit docs for instructions. Sample .pre-commit-config.yaml:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile

You might want to customize pip-compile args by configuring args and/or files, for example:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
        files: ^requirements/production\.(in|txt)$
        args: [--index-url=https://example.com, requirements/production.in]

If you have multiple requirement files, make sure you create a hook for each file.

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
        name: pip-compile setup.py
        files: ^(setup\.py|requirements\.txt)$
      - id: pip-compile
        name: pip-compile requirements-dev.in
        args: [requirements-dev.in]
        files: ^requirements-dev\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements-lint.in
        args: [requirements-lint.in]
        files: ^requirements-lint\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements.in
        args: [requirements.in]
        files: ^requirements\.(in|txt)$

Example usage for pip-sync

Now that you have a requirements.txt, you can use pip-sync to update your virtual environment to reflect exactly what's in there. This will install/upgrade/uninstall everything necessary to match the requirements.txt contents.

Run it with pip-sync or python -m piptools sync. If you use multiple Python versions, you can also run py -X.Y -m piptools sync on Windows and pythonX.Y -m piptools sync on other systems.

pip-sync must be installed into and run from the same virtual environment as your project to identify which packages to install or upgrade.

Be careful: pip-sync is meant to be used only with a requirements.txt generated by pip-compile.

$ pip-sync
Uninstalling flake8-2.4.1:
  Successfully uninstalled flake8-2.4.1
Collecting click==4.1
  Downloading click-4.1-py2.py3-none-any.whl (62kB)
    100% |................................| 65kB 1.8MB/s
  Found existing installation: click 4.0
Uninstalling click-4.0:
  Successfully uninstalled click-4.0
Successfully installed click-4.1

To sync multiple *.txt dependency lists, just pass them in via command line arguments, e.g.

$ pip-sync dev-requirements.txt requirements.txt

Passing in empty arguments would cause it to default to requirements.txt.

Any valid pip install flags or arguments may be passed with pip-sync's --pip-args option, e.g.

$ pip-sync requirements.txt --pip-args "--no-cache-dir --no-deps"

Note: pip-sync will not upgrade or uninstall packaging tools like setuptools, pip, or pip-tools itself. Use python -m pip install --upgrade to upgrade those packages.

Should I commit requirements.in and requirements.txt to source control?

Generally, yes. If you want a reproducible environment installation available from your source control, then yes, you should commit both requirements.in and requirements.txt to source control.

Note that if you are deploying on multiple Python environments (read the section below), then you must commit a separate output file for each Python environment. We suggest using the {env}-requirements.txt format (e.g. win32-py3.7-requirements.txt, macos-py3.10-requirements.txt, etc.).

Cross-environment usage of requirements.in/requirements.txt and pip-compile

The dependencies of a package can change depending on the Python environment in which it is installed. Here, we define a Python environment as the combination of Operating System, Python version (3.7, 3.8, etc.), and Python implementation (CPython, PyPy, etc.). For an exact definition, refer to the possible combinations of PEP 508 environment markers.

As the resulting requirements.txt can differ for each environment, users must execute pip-compile on each Python environment separately to generate a requirements.txt valid for that environment. The same requirements.in can be used as the source file for all environments, using PEP 508 environment markers as needed, the same way it would be done for regular pip cross-environment usage.
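For instance, a shared requirements.in can use PEP 508 environment markers to gate packages on the target environment (a sketch; the package choices are illustrative):

```text
# requirements.in
colorama ; sys_platform == "win32"
backports.zoneinfo ; python_version < "3.9"
```

Compiling this on each target environment then pins colorama only in the Windows output, and backports.zoneinfo only in outputs for Python versions older than 3.9.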

If the generated requirements.txt remains exactly the same for all Python environments, then it can be used across Python environments safely. But users should be careful, as any package update can introduce environment-dependent dependencies, making any newly generated requirements.txt environment-dependent too. As a general rule, users should still always execute pip-compile on each targeted Python environment to avoid issues.

Maximizing reproducibility

pip-tools is a great tool to improve the reproducibility of builds. But there are a few things to keep in mind.

  • pip-compile will produce different results in different environments as described in the previous section.
  • pip must be used with the PIP_CONSTRAINT environment variable to lock dependencies in build environments as documented in #8439.
  • Dependencies come from many sources.

Continuing the pyproject.toml example from earlier, creating a single lock file could be done like this:

$ pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.9
# by the following command:
#
#    pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
asgiref==3.5.2
    # via django
attrs==22.1.0
    # via pytest
backports-zoneinfo==0.2.1
    # via django
django==4.1
    # via my-cool-django-app (pyproject.toml)
editables==0.3
    # via hatchling
hatchling==1.11.1
    # via my-cool-django-app (pyproject.toml::build-system.requires)
iniconfig==1.1.1
    # via pytest
packaging==21.3
    # via
    #   hatchling
    #   pytest
pathspec==0.10.2
    # via hatchling
pluggy==1.0.0
    # via
    #   hatchling
    #   pytest
py==1.11.0
    # via pytest
pyparsing==3.0.9
    # via packaging
pytest==7.1.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.2
    # via django
tomli==2.0.1
    # via
    #   hatchling
    #   pytest

Some build backends may also request build dependencies dynamically, using the get_requires_for_build_ hooks described in PEP 517 and PEP 660. This will be indicated in the output with one of the following suffixes:

  • (pyproject.toml::build-system.backend::editable)
  • (pyproject.toml::build-system.backend::sdist)
  • (pyproject.toml::build-system.backend::wheel)

Other useful tools

Deprecations

This section lists pip-tools features that are currently deprecated.

  • In the next major release, the --allow-unsafe behavior will be enabled by default (#989). Use --no-allow-unsafe to keep the old behavior. It is recommended to pass --allow-unsafe now to adapt to the upcoming change.
  • The legacy resolver is deprecated and will be removed in future versions. The new default is --resolver=backtracking.
  • In the next major release, the --strip-extras behavior will be enabled by default (#1613). Use --no-strip-extras to keep the old behavior.
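If you want to adopt the recommended --allow-unsafe behavior ahead of the next major release, the configuration-file support described earlier lets you make it a project-wide default instead of repeating the flag on every invocation (a sketch using the flag's configuration key):

```toml
[tool.pip-tools]
allow-unsafe = true
```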

A Note on Resolvers

You can choose either the default backtracking resolver or the deprecated legacy resolver.

The legacy resolver will occasionally fail to resolve dependencies. The backtracking resolver is more robust, but can take longer to run in general.

You can continue using the legacy resolver with --resolver=legacy, although note that it is deprecated and will be removed in a future release.