How are you maintaining your requirements.txt file? Are you adding and removing dependencies manually, or are you just running pip freeze > requirements.txt?

Whichever of these two approaches you use, you’re doing it wrong. requirements.txt alone is not enough to build reproducible environments that run the same wherever you deploy them. That’s a real problem: you want your production environment to be tightly defined.

Scenario #1: manually editing requirements.txt

This is how everybody in Python land starts. You create a requirements.txt file and list the dependencies your app needs. After editing the file, you run pip install -r requirements.txt to install them all into your virtual environment.
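In this setup, the file might look something like this (package names and versions here are purely illustrative):

```
# requirements.txt -- direct dependencies, edited by hand
flask==2.3.2
requests==2.31.0
```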

But here is the problem: your requirements.txt contains just the first-degree dependencies and their versions. Your dependencies have dependencies of their own (2nd+ degree), and those versions are not necessarily locked down.

Not having those versions locked down means that running pip install -r requirements.txt on different systems, or at different points in time, will resolve to different sets of package versions. That opens the door to security issues and to your app breaking outright.
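To make that concrete: suppose you pin only requests. The ranges below sketch the kind of metadata a package itself declares (illustrative, not exact):

```
# requirements.txt (yours):
requests==2.31.0

# requests's own metadata (not in your file) declares ranges like:
#   urllib3>=1.21.1,<3
#   certifi>=2017.4.17
# pip resolves these at install time, so two installs months apart
# can end up with different urllib3 or certifi versions.
```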

Scenario #2: pip freeze > requirements.txt

Once you’re aware of the problem above, the solution seems simple: just run pip freeze > requirements.txt.
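After installing a single package into a fresh virtual environment, the frozen file captures everything, transitive dependencies included (output is illustrative; exact versions will differ):

```
$ pip install requests
$ pip freeze > requirements.txt
$ cat requirements.txt
certifi==2023.7.22
charset-normalizer==3.2.0
idna==3.4
requests==2.31.0
urllib3==2.0.4
```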

That certainly pins the 2nd+ degree dependency versions, but it brings a new problem – once you want to remove a dependency, how do you know you’ve also removed all of its dependencies?

There is no easy way to know. If you spend a little more time with the problem, you’ll probably figure out that you need two files - one that defines the direct dependencies of your app, and a second one that locks down all transitive dependencies and their versions (a lockfile). That’s a standard solution in other communities (JavaScript, Ruby, Rust), but pip brings no convention and no tooling for it.

This is where my anxiety started kicking in – I could easily create these two files myself, but there are no standard names for them, and I’d have to teach everybody on the team how to use a setup I came up with. Then I found pip-tools.

Solution: use pip-compile (from pip-tools)

Pip-tools is a set of two tools – pip-compile and pip-sync. pip-compile solves exactly the problems I’ve described above: it brings a workflow (read: a convention) and a tool to maintain both files.

How to use pip-compile?

Create a requirements.in file and list just the direct dependencies of your app, the same way you would with requirements.txt in Scenario #1. Then run pip-compile (or ./venv/bin/pip-compile if it’s not installed globally) and it will create requirements.txt, with every dependency – direct and transitive – listed and every version pinned.
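A sketch of the round trip (file contents abridged; versions illustrative – pip-compile annotates each pin with the package that required it):

```
# requirements.in -- you maintain this
flask
requests

$ pip-compile

# requirements.txt -- generated, don't edit by hand
click==8.1.7          # via flask
flask==2.3.2          # via -r requirements.in
requests==2.31.0      # via -r requirements.in
urllib3==2.0.4        # via requests
...
```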

If you were already using requirements.txt, notice that you can just drop in pip-compile and the rest of your system does not have to change. Whatever was building your app can still use requirements.txt; whoever was just running pip install -r requirements.txt can keep doing that. The only thing that changes is how you add or remove dependencies.
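The day-to-day loop then looks like this (pip-sync is the companion tool mentioned above; unlike plain pip install -r, it also uninstalls anything no longer listed in requirements.txt):

```
# 1. add or remove a direct dependency in requirements.in
# 2. regenerate the lockfile
$ pip-compile
# 3. apply it to your virtualenv
$ pip-sync            # or: pip install -r requirements.txt
```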

Wrapping up

Pip-compile is a simple tool for locking down the versions of your dependencies. It’s widely used, and it brings a sense of standardization, so your team does not have to learn some bespoke setup. Its architecture follows the Unix philosophy – it solves one specific problem and can be dropped into your project without changing the other systems interacting with your app.

If you don’t like the Unix philosophy, Poetry and Pipenv can be used as all-in-one solutions, tackling the versioning problem, too.