Is it worth using make?

Sat, 02 March 2019 :: #rant :: #cmake :: #meson :: #make :: #cpp :: #c

You may think you've created a nice and tidy Makefile, which tracks dependencies and works across many different operating systems, but ask yourself these questions (and answer honestly):

Which compiler does your Makefile support?

Is it GCC? Or Clang? Many of their options are compatible, so making your Makefile work with both may not be hard. But does it also support, say, Visual Studio's compiler (cl.exe)?
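Even the GCC/Clang case usually turns into compiler sniffing inside the Makefile. A minimal sketch of one possible approach (the variable names are made up for illustration; note that $(shell ...), ifneq and findstring are all GNU make extensions, which already ties you to one dialect):

```make
# Guess the compiler family from its --version banner (hypothetical approach).
COMPILER_ID := $(shell $(CC) --version 2>/dev/null | head -n 1)

ifneq (,$(findstring clang,$(COMPILER_ID)))
    WARNFLAGS := -Wall -Wextra -Wdocumentation   # -Wdocumentation is Clang-only
else
    WARNFLAGS := -Wall -Wextra
endif
```

And cl.exe would not even get past the --version probe: it uses /option syntax and prints its banner in a completely different format.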

Does your Makefile support out-of-source build?

Polluting the source tree with autogenerated files is rarely a good idea. It makes source versioning harder even with .gitignore; you can't store your compiled object files on a different partition; you can't easily clean your build (nothing is easier than removing the build directory with rm -rf); and you can't keep your source on a read-only partition.
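For comparison, this is the entire out-of-source workflow with CMake, shown as an illustrative command transcript:

```sh
mkdir build && cd build   # build/ can even live on another partition
cmake ..                  # configure; the source tree stays untouched
cmake --build .           # compile; every artifact lands under build/
cd .. && rm -rf build     # a complete clean is a single rm
```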

Does your Makefile support Windows at all?

Windows often plays by a different set of rules when it comes to building software. More than that, it uses its own compiler, MSVC (shipped with Visual Studio), with different switches and different options. You may want to target only MinGW (which is basically GCC for Windows), but even then you won't be able to run the clever scripts embedded in your Makefile rules (grep, sed, if, test -- none of them are available). You won't even have the make tool installed, so nothing will be there to interpret your Makefile. You could target the MSYS2 environment, which does ship make and the whole binutils package, but how many Windows users run MSYS2? Not many, I suspect.

Is your Makefile a GNU makefile, or BSD makefile?

You don't simply create a Makefile. Which interpreter do you want to use? GNU make or BSD make? Or maybe Watcom make, nmake, Borland make, Sun dmake, Sun DevPro make? If you want to see what it really takes to support many different Makefile dialects, operating systems and compilers, look at zlib's repository and check how they handle things. Writing portable Makefiles is anything but easy. Still, you can get away with simply requiring GNU make, which is more popular than BSD make (but means running gmake instead of make on BSD systems).
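The dialects diverge even on basics such as conditionals. The same flag toggle has to be written twice (the two forms are shown side by side here; no single file can contain both):

```make
# GNU make conditional:
ifeq ($(DEBUG),1)
CFLAGS += -g
endif

# The equivalent in BSD make -- GNU make will not parse this:
.if ${DEBUG} == 1
CFLAGS += -g
.endif
```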

Does your Makefile support cleaning the project from all autogenerated artifacts?

If you're using git, you can sometimes get away with git clean -fdx (plain -fd won't remove files matched by .gitignore, which is exactly where build artifacts usually end up). But if you're not doing out-of-source builds, you have to give users a way to clean the project tree so they can perform a clean rebuild, because there are situations that demand it.
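A hand-written clean target has to enumerate every kind of artifact the build can produce; forget one and it lingers. A sketch (file names are illustrative, and recipe lines must be indented with a real tab):

```make
OBJS := main.o util.o

app: $(OBJS)
	$(CC) -o $@ $^

.PHONY: clean
clean:
	rm -f app $(OBJS) *.d
```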

Do you support a situation when the compiler/SDK will be upgraded on the system?

When did you last run apt-get upgrade or pacman -Syu? Yesterday? Did you notice whether GCC was updated? Because if it was, chances are some object files need to be recompiled. Can your Makefile detect that a file needs recompiling for this reason? You can get away with letting the user clean the build and do a full recompile; that's enough for most cases.
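One possible hand-rolled (and admittedly hacky) workaround in GNU make is to record the compiler version in a stamp file and make every object depend on it, so a compiler upgrade forces a rebuild exactly once:

```make
# The stamp recipe runs every time, but only touches the stamp
# when the version banner actually changed.
cc-version.stamp: FORCE
	@$(CC) --version | head -n 1 > $@.tmp; \
	cmp -s $@.tmp $@ || mv -f $@.tmp $@; rm -f $@.tmp

%.o: %.c cc-version.stamp
	$(CC) $(CFLAGS) -c $< -o $@

FORCE:
.PHONY: FORCE
```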

Do you track the dependencies on the libraries installed in the system?

So your project uses zlib to uncompress some files. What happens when you update zlib to a new version? Some of your object files will contain old definitions of structs and function signatures, which may not reflect what's really in the current zlib. You need to perform a clean build to be sure every object file is up to date with the system libraries -- or you risk random issues popping up at runtime.

Do you support setting a Release/Debug build of your project?

If you put a variable in your Makefile that has to be edited to toggle between a Debug and a Release build, you're making the project harder to work on: the edit will show up every time someone runs git status, and switching branches becomes a chore. If you don't support Release/Debug configurations at all, you're making it even harder.
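With CMake, the build type is cached per build directory rather than edited into a tracked file, so git status stays clean (the -S/-B options shown here require CMake 3.13 or newer):

```sh
cmake -S . -B build-debug   -DCMAKE_BUILD_TYPE=Debug
cmake -S . -B build-release -DCMAKE_BUILD_TYPE=Release
cmake --build build-debug    # both trees coexist side by side
```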

Does your Makefile support passing custom CFLAGS or LDFLAGS?

It's important if you want to add, say, fuzzing support to your project. Or maybe you want to link against a zlib installed in some custom directory rather than the system one. Or a hundred other reasons. The thing is, people expect this, and it's a good idea to support it.
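In a hand-written Makefile, this means carefully extending user-supplied flags instead of clobbering them. A common sketch:

```make
# Defaults the user can override from the command line or environment:
CFLAGS  ?= -O2
LDFLAGS ?=

# Flags the build genuinely requires live in a separate variable:
ALL_CFLAGS = $(CFLAGS) -Iinclude

%.o: %.c
	$(CC) $(ALL_CFLAGS) -c $< -o $@

app: main.o
	$(CC) $(LDFLAGS) -o $@ $^
```

Now something like make CFLAGS='-O1 -g -fsanitize=address' behaves the way users expect.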

Are you using thirdparty libraries in your project?

How do you perform library discovery to check whether a library is available on the system? Does the build simply stop on a missing f_foobar.h file? Or do you actually check whether the library headers are installed? Are you using pkg-config as a discovery tool? Are you sure you don't need explicit linkage to pthreads, as on some Fedora versions? Are you sure your ninja builder is installed as ninja, not as ninja-build as on some Linux distributions? Are you sure the library you want is in /usr/include rather than /usr/local/include, as on some BSD systems? Or maybe it's a completely different path, because it was installed by brew and the user is running macOS?
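CMake answers this whole class of questions for you. A sketch (the foobar module name is made up; ZLIB, Threads and PkgConfig are standard CMake find modules):

```cmake
find_package(ZLIB REQUIRED)      # probes /usr/include, /usr/local/include,
find_package(Threads REQUIRED)   # Homebrew prefixes, etc.; Threads adds
                                 # -pthread where the platform needs it
find_package(PkgConfig)
if(PkgConfig_FOUND)
  pkg_check_modules(FOOBAR IMPORTED_TARGET foobar)   # hypothetical library
endif()

target_link_libraries(app PRIVATE ZLIB::ZLIB Threads::Threads)
```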

I don't want to use CMake, because the project is small and it's not worth it.

Did you know that a minimal fully functional CMake script that supports all of the points from this list takes only 2 lines?

cmake_minimum_required(VERSION 3.5)
add_executable(app main.cpp)

So "Makefile is less complicated" argument doesn't seem to be true.

What if someone just wants to use Eclipse, Xcode, Visual Studio, CodeBlocks, etc?

With your build system, they have to use whatever setup you picked. And people prefer their own setups, because they like them. They may have inferior knowledge, they may require indoctrination and enlightenment, sure -- but right now they simply want to compile your project, and that's it.
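A project described with CMake can be opened from whichever environment the user prefers; the same CMakeLists.txt feeds different generators (the -S/-B flags again need CMake 3.13+):

```sh
cmake -G "Visual Studio 15 2017" -S . -B build-vs   # .sln + .vcxproj files
cmake -G Xcode -S . -B build-xcode                  # an .xcodeproj bundle
cmake -G "Unix Makefiles" -S . -B build-make        # plain Makefiles
```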

Does your Makefile support showing the full command line used to compile a compilation unit?

Sometimes you need it to debug 'unresolved symbol' linker errors. Or the user may want to preprocess a compilation unit instead of compiling it, which takes some Makefile hacking to do by hand. (CMake's Makefile generator, for instance, creates a ready-made <source>.i preprocessing target for every source file.) On the other hand, the default behavior of make actually handles this point correctly, since it echoes the full command line used to produce each object file -- but some people change their Makefile to hide this information and print only CC file.o. Which is fine, until something goes wrong during compilation.
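With a CMake-generated Makefile, getting the full command lines back is one flag away, and a compilation database for external tooling is one cache option away:

```sh
make VERBOSE=1                                           # echo full compiler command lines
cmake -DCMAKE_EXPORT_COMPILE_COMMANDS=ON -S . -B build   # emit compile_commands.json
```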


Did you know? Build-system generators like CMake and Meson handle every item on this list automatically, without you even having to think about them. Well, maybe the Windows + Visual Studio point can require some additional effort, but it'll still be less than writing raw Makefiles. And there's no need to rant about the impossibility of debugging the generated Makefiles: you debug your CMake or Meson build script, not the Makefiles it produces.

Also, as a bonus, your build script can be copied and pasted without issues, which is not always true for Makefiles (recipe lines must be indented with tab characters, and some tools silently expand tabs to spaces).