Does make suck?
12 March 2025

Does make suck, actually?
GNU make is completely ubiquitous. It is probably the pre-eminent software available for building programs. At the time of writing this, I even use it to build this site. But over time I have come to believe some of its features are actually problems, and perhaps something should replace it.
What is make?
There are many perfectly good explanations of what make is and does online. I will just summarise here: make is a program which takes as input a specification of a project (in a special language, written in a so-called Makefile). The specification consists of a list of rules, where each rule declares:
- The files it proposes to build (targets),
- What command to run to build them, and
- What files are required to exist in order for the command to succeed (dependencies). A minimal example follows this list.
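As a sketch (the file names here are invented for illustration, not taken from any real project), a single rule looks like this, with the target on the left of the colon, the dependencies on the right, and the command on the line below, which must be indented with a tab character:

# Target "hello" depends on hello.c and greeting.h (invented names).
# Running `make hello` runs the command if hello is missing or older
# than either dependency.
hello: hello.c greeting.h
	cc -o hello hello.c

Running make with no arguments builds the first target in the file.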
make has the appealing feature that it will try not to re-run unnecessary commands. It determines this by looking at the timestamps of the various dependencies, to see if they’ve been modified. It doesn’t look in the file to see if you actually changed anything; it just says “this got touched, I’ll rebuild now”.
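Concretely (again with invented file names), here is a sketch of how those timestamp checks play out across two chained rules:

prog: main.o
	cc -o prog main.o

main.o: main.c defs.h
	cc -c main.c

# If defs.h gets touched, main.o is now older than one of its
# dependencies, so make re-runs `cc -c main.c`, which in turn makes
# prog out of date. If nothing on the right-hand side is newer than
# its target, make runs nothing at all.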
This actually works extremely well in many cases – well enough to have become the standard tool. But I have come to regard it as problematic.
The good
- make is used by everyone, everywhere. If you have a project that’s built with a Makefile, most free software people will know how to build it. It hasn’t been installed by default on every Linux distro I’ve used, but it has always been available.
- Calling make to build is an extremely simple interface.
- It’s old (almost 40 years). I think there is some value in keeping very long-running software projects alive. It says something important about the stability of software generally.
I think it also has a lot of built-in support for building C projects specifically, but I don’t really understand the details.
The bad
Some behaviour I think is just wrong and actually causes problems. I don’t think make can fix these issues without fundamentally changing.
- Changes to the Makefile are not taken into account. So if one of your rules is altered in such a way that a previously-built target would now be built differently, that will not trigger a rebuild unless you touch the dependencies. (See the sketch after this list.)
- Files that are removed as targets will never be cleaned up. This is particularly problematic if you are, say, building something against a new branch and then return to the main one.
- Files that are touched, but not changed, trigger unnecessary rebuilds. This can cascade, particularly when you build artefacts that are composed of many pieces at once.
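A sketch of the first problem, with invented names: the recipe below depends on the value of CFLAGS, but make only compares main.o against the dependencies listed after the colon, so editing the flags in the Makefile does not cause a rebuild. A blunt workaround some projects use is to list the Makefile itself as a dependency, at the cost of rebuilding everything on any edit, relevant or not.

CFLAGS = -g

# Changing CFLAGS here (say, to -O2) does not make main.o out of date,
# because the Makefile is not among main.o's dependencies.
main.o: main.c
	cc $(CFLAGS) -c main.c

# The blunt workaround: now any edit to the Makefile triggers a rebuild,
# whether or not it affected this rule.
# main.o: main.c Makefile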
The ugly
Some design choices are a matter of taste, and not overtly wrong. But I think I dislike them anyway.
- The Makefile syntax is extremely opaque. I have no idea how to remember the difference between $< and $@ (decoded in the sketch below). Simple things like wildcard expansion are a whole thing.
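For the record, here is my own cheat-sheet sketch of those variables (file names invented): $@ is the target, $< is the first dependency, $^ is all of the dependencies, and globbing has to go through the wildcard function rather than the shell:

SRCS = $(wildcard *.c)                 # every .c file in the directory
OBJS = $(patsubst %.c,%.o,$(SRCS))     # turn foo.c into foo.o

prog: $(OBJS)
	cc -o $@ $^                    # $@ = prog, $^ = all the .o files

%.o: %.c
	cc -c $< -o $@                 # $< = the .c file, $@ = the .o file

These are GNU make’s “automatic variables”; there are several more of them, which is part of why I find them hard to keep straight.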
What else is there?
I don’t know, really. There are a whole bunch of build systems, but all of them seem to have a steeper learning curve than make. There is Bazel, the build system Google uses. Its abstraction seems to represent the concept of build actions and dependencies clearly, so I think it should solve the major “bad” behaviour I mentioned above. But when I go looking for testimonials, I see things like “Bazel is ruining my life”. So I’m not so keen to try it.
At the end of one of the FOSDEM talks there was a group chat about whether guix could represent a build process itself. After all, this is what a derivation is supposed to be, I think? I guess I should read about it.