or: please stop specifying your build dependencies explicitly, at all.
or: please, I'm begging you, at least consider SCons and if it could make your life better.
Makefile is a declarative programming language, which lets you specify build dependencies. I cannot remember how to write one, not really, but the syntax is roughly:
target: file1 file2 file3
	command_to_run
(the indent before the command is, famously, a hard tab)
On top of that are layered a few things to make the process a bit less painful: for instance little macros or variables, so you can specify the compiler to use as an environment variable, and so on.
Once you have a makefile, if you edit file1, file2, or file3, target will build. Oh-- and I think any of the files, themselves, can be targets, so you are specifying a kind of dependency tree.
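What make does with that tree, per rule, boils down to an mtime comparison. A rough sketch in Python (needs_rebuild is my name for it, not anything from make):

```python
import os

def needs_rebuild (Target, Sources):
    ## rebuild if the target is missing, or any source is newer than it;
    ## make recurses on sources that are themselves targets, which is the
    ## dependency-tree part
    if not os.path.exists (Target):
        return True
    TargetTime = os.path.getmtime (Target)
    return any (os.path.getmtime (Source) > TargetTime for Source in Sources)
```
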
All good, there's really nothing wrong with a makefile, except: you are the one specifying the dependency tree, by hand, and you have to keep it complete and current forever. There are some other problems, but they mostly relate to that. In practice, handwritten Makefiles tend to be vaguely manageable, but messy, complicated, and not the best part of the day if they are open in your editor.
Autotools are a set of programs that generate makefiles. You define dependencies in another declarative programming language— this language is defined by a set of m4 macros, kind of— and then you run automake. Or maybe autoconf first, then automake. Anyways, this does not create a makefile. Instead, it creates something called a configure script. The configure script is an unbelievable monstrosity— often, if not normally, tens of thousands of lines. Think: "view source for a typical web page, but both better and worse", not "a normal bash script."
Along with this configure script, autotools generates just a ton of weird cruft files that nobody really needs and that are often left at strange default templates. I think.
Aside: I've been working in software for a bit. I do not understand autotools, but I can run configure scripts pretty good, and I know enough to know when they aren't actually doing what they're supposed to do, but are still making a makefile that will make a thing, until you sort of examine it a bit more closely with ldd.
But I am the only person I know who has ever fixed something in a broken autoconf file, i.e., gone further than figuring out what to pass to the configure script to trick it.
Here is part of the autoconf file for wxWidgets:
dnl WX_SYS_LARGEFILE_MACRO_VALUE(C-MACRO, VALUE, CACHE-VAR)
define(WX_SYS_LARGEFILE_MACRO_VALUE,
[
    AC_CACHE_CHECK([for $1 value needed for large files], [$3],
        [
            AC_TRY_COMPILE([#define $1 $2
#include <sys/types.h>],
                WX_SYS_LARGEFILE_TEST,
                [$3=$2],
                [$3=no])
        ]
    )
    if test "$$3" != no; then
        wx_largefile=yes
        AC_DEFINE_UNQUOTED([$1], [$$3])
    fi
])
The wxWidgets project is a very good project and their autoconf code is very well organized. It overall just works very well. If you look at it, you are probably looking at the best autoconf code you will find, just about, I think, except I try not to ever look at these files.
They do have a certain flavour, but there is a lot wrong with autotools itself:
Now, there is also something good about autotools: despite this absolute insane amount of baggage, obscurity, and intermediarocrity, the configure script for the package you build will tend to more or less work. The incorrect shared library linking thing I call out is real— actually it might be why Docker exists— but usually with autotools, you ship the configure script, so the only dependencies are bash and make, and it can build reasonably well cross-platform. If you have run ./configure && make before, you've probably built an autotools package.
1 not a computer type of wizard, I mean a wizard such as merlin, who can summon a demon, and the demon knows how to program computers, but this all requires some terrible sacrifice...
CMake is something else, it has colourful progress bars, various stars and other percentages appear during the build, it's fun, but also crappy: like a BBS using the BBS software that doesn't allow hi-ASCII, if that makes sense, it's got ASCII art going for it but not really the good ASCII art. It feels a bit like systemd in that regard, or Docker build since they added the weird buildx thing. Why they do this???
You need CMake to build a CMake project. This is good, because it avoids the silliness of avoiding that dependency and so trying to write a meta-meta build file. Just like, have the thing that builds the thing, please, and if needed, the person building it can find and install it. "This project requires CMake 3.0.1 or higher." Yep, that's absolutely fine, I'm on board, just promise you won't write any intermediate files.
I don't have opinions on CMake beyond this because I've never looked at a .cmake file, and I'm not starting now.
Finally, I hear there are walled gardens where you do things in a gui and your ability to choose how you make software is, in any meaningful sense of the word, completely out of your control.
This brings us to SCons.
A SCons build script is a Python program that defines a dependency tree that is then passed to SCons itself, which is a fairly adept dependency tree solver and compiler wrangler.
You can write a terrible SCons file that misses the whole point. This often looks like, well, a Makefile in Python, that includes sub-makefiles in subdirectories, that are hard to follow.
Tonight, as I write this, I am begging you. Rethink this whole shitshow. Probably you should drop CMake if you are using it. There might be other things as good, but even if there are, listen: SCons is your buddy.
At the risk of repeating myself, a SCons build script is a Python program that defines a dependency tree that is then passed to SCons itself, which is a fairly adept dependency tree solver and compiler wrangler.
Do not specify your dependency tree in your SCons build script. There I have used bold for the first and last time here. Write a program that specifies your dependency tree. Do it in a way that makes sense to you, with however you organize your project. SCons can solve build problems you never dreamed of.
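To make "write a program" concrete, here is about the smallest sketch I can think of. collect_sources is a made-up name; in a real SConstruct you would hand the result straight to something like env.Program ("myprog", collect_sources ("src")):

```python
import os

def collect_sources (Root, Extensions = (".cc", ".cpp")):
    ## walk the tree and gather every source file; the returned list is the
    ## "dependency tree" input you hand to SCons, which works out headers,
    ## object files, and rebuild order on its own
    Sources = []
    for DirPath, DirNames, FileNames in os.walk (Root):
        for FileName in sorted (FileNames):
            if FileName.endswith (Extensions):
                Sources.append (os.path.join (DirPath, FileName))
    return Sources
```
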
Finally, I will reiterate: a SCons build script is a Python program that defines a dependency tree that is then passed to SCons itself, which is a fairly adept dependency tree solver and compiler wrangler.
For instance:
Everything here except the last point is something I've done with SCons. My build scripts are typically just a couple hundred lines to take care of all this, and it works reliably and wonderfully well. Not only that, the way I set it up, it manages the compiled object files (intermediate files) separately for each build type— i.e., if I switch between a special debug or profiling build, it doesn't rebuild my whole project, it properly keeps track of what has changed and what needs to happen and so on. This isn't a SCons feature, just how I set it up, because I want that.
My build scripts aren't perfect, they are messy and idiosyncratic. They have commented-out compiler options that I used to use. In other words, they are doing what I want or need in the specific ways I want or need them.
This works for two reasons:
I want to pause here: I have seen a lot of crap that gets close to this. Typically, you see a custom programming language that can "intelligently" define dependencies.
No, no, no, no, no, no, no, a thousand times no. I want to crawl through the screen right now and sit down and just beg you, offer you a treat to just think for a second: don't do this, it's so tedious and limiting. You are a programmer. You ought to be programming, not data-entrying!
It also means the authors of that custom programming language have implicitly defined how your build has to work. You are subject to their opinions. This is terribly annoying, and many real-world problems exist that they will have never formed an opinion on, so you aren't going to be covered. The reason I won't touch CMake is I think it's this. I've looked very briefly at Nix build scripts and I think they are this. (You can gently shout at me if I'm wrong, but I won't hear it and if I did I would simply shrug and say, no ill intent on my part— I'm not trying to compare SCons to build systems I have never used and definitely not trying to move you away from something you like...)
SCons on the other hand will make you creative. For instance, right now I'm thinking hard about whether I want to put in a build rule to have GNU Image Manipulation Program dump the layers from .xcf files into .png files, where the filenames are taken from the layer names. I can do it because GNU Image Manipulation Program has scriptableness, and I can write a tiny bit of Python code (in my SCons build script) to scan my .xcf files, and I can then specify a builder that will tell SCons a build rule for it. This means I would never have to export another .png again. I should do this. I probably won't bother though.
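The filename end of that idea sketches easily. (The part where GIMP is actually asked for the layer names is assumed away here, and layer_png_targets is a made-up name.)

```python
import os

def layer_png_targets (XcfPath, LayerNames):
    ## given one .xcf file and its layer names (in a real build script you
    ## would get these by asking GIMP, e.g. via a script-fu batch call;
    ## that part is assumed here), compute the .png target each layer
    ## becomes, so a custom Builder can be registered for each pair
    Base = os.path.splitext (XcfPath) [0]
    return [Base + "-" + Name + ".png" for Name in LayerNames]
```
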
Are you really telling me you don't want a build system where you just type "make", and all dependencies, including all resource dependencies, are automatically built for you? Like, if I build on Windows, it builds the icon file and links it in for me. And— and this is so crucial— it's not even hard to make it happen, and it will happen with perfect reliability, and you can understand the code? And it can happen in a way that's useful to you, whatever your preferences or the specifics of your situation.
Anyways, I just wish that more people would see the light on this.
First, I used to cut and paste a few common functions between different projects, but now I have a common build_helpers.py that I include. It looks like this:
##
## build_helpers.py -- simple helper functions for building with SCons
##

import os
import string
import sys
import re
import platform
import subprocess

##
## get the build type based on OS platform
## - returns one of "linux64", "win64", "macosx64", or None if no build type could be detected
##
def get_build_type ():
    BuildPlatform = platform.system ()
    BuildType = None
    if ("Linux" == BuildPlatform):
        BuildType = "linux64"
    elif ("Windows" == BuildPlatform or BuildPlatform.startswith ("MINGW64_NT")):
        BuildType = "win64"
    elif ("Darwin" == BuildPlatform):
        BuildType = "macosx64"
    else:
        print ("Unknown build platform " + BuildPlatform)
    return BuildType

##
## helper for scan_pkg_config; try executing pkg-config with the given option, and return the output
##
def get_pkg_config_stdout (pkg_config, option):
    if ("Windows" == platform.system ()):
        raise Exception ("build_helpers.get_pkg_config_stdout does not work on windows; please add dependencies/flags manually")
    PkgConfigProcess = subprocess.Popen ([pkg_config, option], stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True)
    (PkgConfigSTDOUT, PkgConfigSTDERR) = PkgConfigProcess.communicate ()
    PkgConfigExitCode = PkgConfigProcess.wait ()
    if (0 != PkgConfigExitCode) or ('' != PkgConfigSTDERR):
        raise Exception ("could not execute " + pkg_config + " " + option + ": exit code " + str (PkgConfigExitCode) + "; " + PkgConfigSTDERR.strip ())
    return PkgConfigSTDOUT

##
## see scan_pkg_config; scan output from a pkg-config command with --cflags, --libs in case we can't call pkg-config directly from Python
##
def scan_pkg_config_from_stdout (CFlagsOutput, LibsOutput, env, StaticLibs):
    for CFlagsPart in CFlagsOutput.split ():
        env.Append (CXXFLAGS=CFlagsPart)
    for LibsPart in LibsOutput.split ():
        if LibsPart.endswith (".a"):
            LibsPart = LibsPart [:-2]
            LibsPart = os.path.basename (LibsPart)
            if (LibsPart.startswith ("lib")): LibsPart = LibsPart [3:]
            StaticLibs.append (LibsPart)
        elif LibsPart.startswith ("-l"):
            env.Append (LIBS=[LibsPart [2:]])
        else:
            env.Append (LINKFLAGS=LibsPart)

##
## execute the given pkg-config command with --cflags, --libs and decompose the results into env, and the list of StaticLibs
## - appends to env (CXXFLAGS, LIBS, LINKFLAGS) and to StaticLibs, in place
## - StaticLibs entries will be reduced to their basename (path thrown away) and un-prefixed "lib", since we require all static libs to be in StaticLibsPath later on
##
def scan_pkg_config (pkg_config, env, StaticLibs):
    CFlagsOutput = get_pkg_config_stdout (pkg_config, "--cflags")
    LibsOutput = get_pkg_config_stdout (pkg_config, "--libs")
    scan_pkg_config_from_stdout (CFlagsOutput, LibsOutput, env, StaticLibs)

##
## scan the given directory for source files; does not scan subdirectories
## - for now, this doesn't build into libraries; eventually we will want to do that
##
def scan_directory (SourceFiles, BuildPrefix, DirectoryName):
    AllFiles = os.listdir (DirectoryName)
    for CurFile in AllFiles:
        if re.search (r"\.cc$", CurFile) or re.search (r"\.cpp$", CurFile):
            SourceFiles.append (BuildPrefix + "/" + CurFile)

def scan_all_directories (SourceFiles, BuildPrefix, DirectoryPrefix, DirectoryNames):
    for Dir in DirectoryNames:
        scan_directory (SourceFiles, BuildPrefix + "/" + Dir, DirectoryPrefix + "/" + Dir)

##
## scan all directories, and build into libraries
## - src/ is implied for all source directory names, there is no needed directory prefix for now
##
def scan_all_directories_into_static_libs (env, SourceFiles, BuildPrefix, DirectoryNames):
    for LibDir in DirectoryNames:
        LibName = LibDir.replace ('/', '_')
        LibFiles = []
        scan_directory (LibFiles, BuildPrefix + "/" + LibDir, "src/" + LibDir)
        if (len (LibFiles) > 0):
            env.StaticLibrary (BuildPrefix + "/" + LibName, LibFiles)
            SourceFiles.append (BuildPrefix + "/lib" + LibName + ".a")

##
## as above, but for common files
## - this one builds into libraries
##
def scan_common_directory (env, SourceFiles, BuildPrefix, LibName, DirectoryName):
    AllFiles = os.listdir (DirectoryName)
    LibSourceFiles = []
    for CurFile in AllFiles:
        if re.search (r"\.cpp$", CurFile):
            TargetFile = os.path.splitext (CurFile) [0]
            TargetFile = BuildPrefix + "/" + TargetFile + ".o"
            env.StaticObject (target = TargetFile, source = DirectoryName + "/" + CurFile)
            LibSourceFiles.append (TargetFile)
    LibFile = BuildPrefix + "/lib_" + LibName + ".a"
    env.StaticLibrary (target = LibFile, source = LibSourceFiles)
    SourceFiles.append (LibFile)

def scan_all_common_directories (env, SourceFiles, BuildPrefix, DirectoryPrefix, DirectoryNames):
    for Dir in DirectoryNames:
        LibName = Dir.replace ('/', '_')
        scan_common_directory (env, SourceFiles, BuildPrefix + "/" + Dir, LibName, DirectoryPrefix + "/" + Dir)
That's 176 lines and lots of them blank.
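To see what scan_pkg_config_from_stdout does to typical pkg-config output, here is the same decomposition rerun standalone, with a stand-in FakeEnv (made up for this demo) instead of a real SCons Environment:

```python
import os

class FakeEnv:
    ## minimal stand-in for a SCons Environment, just for this demo
    def __init__ (self):
        self.Settings = {"CXXFLAGS": [], "LIBS": [], "LINKFLAGS": []}
    def Append (self, **kwargs):
        for Key, Value in kwargs.items ():
            if isinstance (Value, list):
                self.Settings [Key].extend (Value)
            else:
                self.Settings [Key].append (Value)

def scan_pkg_config_from_stdout (CFlagsOutput, LibsOutput, env, StaticLibs):
    ## same decomposition as build_helpers.py: .a files become bare static
    ## lib names, -l flags become LIBS, everything else is a link flag
    for CFlagsPart in CFlagsOutput.split ():
        env.Append (CXXFLAGS=CFlagsPart)
    for LibsPart in LibsOutput.split ():
        if LibsPart.endswith (".a"):
            LibsPart = LibsPart [:-2]
            LibsPart = os.path.basename (LibsPart)
            if (LibsPart.startswith ("lib")): LibsPart = LibsPart [3:]
            StaticLibs.append (LibsPart)
        elif LibsPart.startswith ("-l"):
            env.Append (LIBS=[LibsPart [2:]])
        else:
            env.Append (LINKFLAGS=LibsPart)

env = FakeEnv ()
StaticLibs = []
scan_pkg_config_from_stdout ("-I/usr/include/SDL2", "-L/usr/lib /usr/lib/libpng16.a -lSDL2", env, StaticLibs)
## StaticLibs now holds "png16"; -lSDL2 went to LIBS, -L/usr/lib to LINKFLAGS
```
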
Then an actual build looks more like (this is for my bespoke raw image editor, septanon):
##
## septanon - SConscript file
##

import os
import re
import subprocess
import sys

sys.path.append (os.getcwd () + "/../packages/python")
import build_helpers

## Sort out build type
BuildType = build_helpers.get_build_type ()
## BuildType = "linux64_profiling"
BuildExt = BuildType
StaticLibs = [] # '../packages/local/lib/libpng16.a', '../packages/local/lib/libraw.a']
if (BuildType == "macosx64"):
    print ("Building for macOS 64-bit")
    env = Environment (CXX="g++", CC="gcc")
    env.Append (CXXFLAGS="-std=c++11 -O3 -fvectorize -fslp-vectorize-aggressive -g -Wno-unused -Wno-deprecated")
    env.Append (CPPPATH=["../packages/local/include", "src/"])
    env.Append (LINKFLAGS="-g -L../packages/local/lib")
    env.Append (LIBS=['raw_r', 'png16', 'jpeg'])
    env.Append (LINKFLAGS="-framework SDL2")
elif (BuildType == "linux64"):
    print ("Building for linux 64-bit")
    env = Environment (CXX="g++", CC="gcc")
    env.Append (CXXFLAGS="-std=c++11 -O3 -g")
    env.Append (CPPPATH=["../packages/local/include", "src/"])
    env.Append (LINKFLAGS="-g -L../packages/local/lib")
    env.Append (LIBS=['z', 'gomp', 'SDL2'])
    StaticLibs = ['../packages/local/lib/libraw_r.a', '../packages/local/lib/libpng16.a', '../packages/local/lib/libjpeg.a']
elif (BuildType == "linux64_profiling"):
    print ("Building for linux 64-bit, profiling enabled")
    env = Environment (CXX="g++", CC="gcc")
    env.Append (CXXFLAGS="-std=c++11 -O3 -pg")
    env.Append (CPPPATH=["../packages/local/include", "src/"])
    env.Append (LINKFLAGS="-pg -L../packages/local/lib")
    env.Append (LIBS=['z', 'gomp', 'SDL2'])
    StaticLibs = ['../packages/local/lib/libraw_r.a', '../packages/local/lib/libpng16.a', '../packages/local/lib/libjpeg.a']
elif (None == BuildType):
    print ("Could not determine build type")
    exit (-1)
else:
    print ("Unknown build type: " + BuildType)
    exit (-1)
BuildVariant = BuildType

## Get the git hash so we can embed it
GitHash = subprocess.check_output (['git', 'rev-parse', '--short', 'HEAD']).decode ('ascii').strip ()
env.Append (CXXFLAGS="-DSEPTANON_VERSION=\\\"septanon-" + GitHash + "\\\"")

## Scan
print ("Scanning directories...")
BuildPrefix = "#build/septanon/" + BuildVariant
env.VariantDir (BuildPrefix, "src", duplicate=0)
SourceFiles = []
build_helpers.scan_directory (SourceFiles, BuildPrefix, "src")
build_helpers.scan_directory (SourceFiles, BuildPrefix + "/algorithm", "src/algorithm")
build_helpers.scan_directory (SourceFiles, BuildPrefix + "/controller_mode", "src/controller_mode")
build_helpers.scan_directory (SourceFiles, BuildPrefix + "/downscaler", "src/downscaler")
build_helpers.scan_directory (SourceFiles, BuildPrefix + "/test", "src/test")
for StaticLib in StaticLibs:
    SourceFiles.append (StaticLib)

## Set program rule
print ("Setting program rule...")
env.Program ("#septanon." + BuildExt, SourceFiles)
Finally, here is an example of how, for The Real Texas, I automatically build the .wav sound effects from their .sfxr source files:
if ("linux32" == BuildType and BuildWAVs):
    print ("Building .wav files from .sfxr files...")
    if (BuildWAVsProcessing):
        wav_bld = Builder (action = 'sfxr -c $SOURCE ; ./process_wav $TARGET')
    else:
        wav_bld = Builder (action = 'sfxr -c $SOURCE ; trimsound $TARGET')
    env.Append (BUILDERS = { 'Wav' : wav_bld })

    def scan_wavs_directory (env, DirectoryName, PreloadFile, PreloadDirectoryName=False):
        AllFiles = os.listdir (DirectoryName)
        if (not PreloadDirectoryName): PreloadDirectoryName = DirectoryName
        for CurFile in AllFiles:
            if re.search (r"\.sfxr", CurFile):
                CurWavFile = re.sub (r"\.sfxr", ".wav", CurFile)
                env.Wav (DirectoryName + "/" + CurWavFile, DirectoryName + "/" + CurFile)
                PreloadFile.write (PreloadDirectoryName + "/" + CurWavFile + "\n")

    ## NOTE: don't build main game wavs for now, until we compare with old builder
    #PreloadFile = open ("data/sound_files_preload.txt", "w")
    #scan_wavs_directory (env, "data/sound/step", PreloadFile)
    #scan_wavs_directory (env, "data/sound/enemy", PreloadFile)
    #scan_wavs_directory (env, "data/sound/object", PreloadFile)
    #scan_wavs_directory (env, "data/sound/ui", PreloadFile)
    #scan_wavs_directory (env, "data/sound/event", PreloadFile)
    #scan_wavs_directory (env, "data/sound/environment", PreloadFile)

    DLCPreloadFile = open ("cellpop_i/data/sound_files_preload.txt", "w")
    scan_wavs_directory (env, "cellpop_i/data/sound/enemy", DLCPreloadFile, "data/sound/enemy")
    scan_wavs_directory (env, "cellpop_i/data/sound/environment", DLCPreloadFile, "data/sound/environment")
    scan_wavs_directory (env, "cellpop_i/data/sound/event", DLCPreloadFile, "data/sound/event")
    scan_wavs_directory (env, "cellpop_i/data/sound/npc", DLCPreloadFile, "data/sound/npc")
    scan_wavs_directory (env, "cellpop_i/data/sound/object", DLCPreloadFile, "data/sound/object")
    scan_wavs_directory (env, "cellpop_i/data/sound/player", DLCPreloadFile, "data/sound/player")
    scan_wavs_directory (env, "cellpop_i/data/sound/ui", DLCPreloadFile, "data/sound/ui")
You'll notice it's sort of arbitrary and imperfect: for instance, I have the code to build the main game sounds commented out, the PreloadFile is a text file2 that the SCons script itself writes (i.e., it is not passed to SCons somehow), and so on. Well, here's the thing... so what? None of that hurts or confuses me. I have one file to edit, and— I cannot stress this enough— it's the whole thing, it's just Python code. I can easily change it or refine the rough parts if it ever matters, which it hasn't, which is why it's like that. (All this stuff is raw cut and paste from real projects.)
2 (FYI, this file is used by the game to load sound effects at startup, rather than on demand, so they always play immediately)
Anyways-- I'd say I'm done here!