Building with Make

Ah, the classic build method. Of course, first you will want to make sure you understand the basics of building unit tests with Unity. Got that covered? Great! Let's do this thing!

Make is one of the most widely used build systems out there. Its cryptic syntax and finicky behavior are "loved" by all, right? In any case, a large portion of the Unity community uses Make to manage their builds. I've put together a decent, scalable Makefile that I'd like to dissect with you today. Hopefully many of you can just use it as is… and for everyone else, at least it can serve as an example of how things go together, right?

Today's makefile is for those who are focused on using Unity and aren't going to use Ruby in their build process. This means they won't be using CMock (which is written in Ruby) and won't be using any of the provided helper scripts.

This makefile makes a few assumptions. These assumptions are fairly easy to meet if you're working with a brand new codebase, but if you're retrofitting an existing project with unit testing, it might be a little more challenging. Venturing away from these assumptions isn't going to break everything… it just means you are going to have more tweaking to do.

ASSUMPTION 1: DIRECTORIES

We're going to assume that you have a single directory which contains all your release source and a second directory which contains your tests. If you have any other configuration, you'll need to do a little tweaking. Our project's directory structure will look something like this:

  • build - where all temporary stuff goes
  • src - where we have all our source code for release (and to be tested)
  • test - where we have all our unit tests
  • unity - where we have copied the latest copy of the Unity project

ASSUMPTION 2: NAMING

We're assuming your tests are named things like TestBlah.c, where Blah matches the source file Blah.c. By sticking to a naming convention like this, it's easy for your tests to be found automatically. If you'd like something fancier, you're going to need to step up to Rake or Ceedling or something else that allows you to map tests to specific files.

ASSUMPTION 3: TOOLCHAIN

We're assuming you're targeting a toolchain that is at least similar to the GNU tools. It should work well on Linux, Mac OS X, or even Windows if you have Cygwin or MinGW installed. If you want to use your target compiler and execute in a simulator, some modifications will be required, but nothing too bad.

START THE DISSECTION ALREADY

Alright. I'm going to walk through the makefile in sections so we can talk about it a bit. If you'd like to grab the entire makefile, you can get it here.

If you copy/paste any of the lines from this article, keep in mind that GNU makefiles like tabs, not spaces. You'll also have to watch your line endings.

DETERMINING THE HOST

Our first step is to learn a little bit about our host operating system. If we're on a Linux / BSD / Unix / whatever type of OS, we're going to want to use commands like "rm", while if we're working with Windows (even with MinGW), we're going to want to use "del".

ifeq ($(OSTYPE),cygwin)
	CLEANUP=rm -f
	MKDIR=mkdir -p
	TARGET_EXTENSION=out
else ifeq ($(OS),Windows_NT)
	CLEANUP=del /F /Q
	MKDIR=mkdir
	TARGET_EXTENSION=exe
else
	CLEANUP=rm -f
	MKDIR=mkdir -p
	TARGET_EXTENSION=out
endif

MAKING PATHS

The next step is that we want to collect all our paths together in one convenient place. We've built this example in one of my favorite layouts… but you should be able to figure out how to redirect things if your layout looks different. We're using relative paths here. Whenever possible, you should stick with relative paths… otherwise you are inviting the wrath of anyone sharing your version control system with you. 

PATHU = unity/src/
PATHS = src/
PATHT = test/
PATHB = build/
PATHD = build/depends/
PATHO = build/objs/
PATHR = build/results/

BUILD_PATHS = $(PATHB) $(PATHD) $(PATHO) $(PATHR)

Note the last line. We don't need to worry about our build directory and its subdirectories yet; we're going to create them on the fly if they don't exist. Why all those subdirectories? We like to break our work up into subdirectories because it makes life so much easier when debugging is necessary.

THE SOURCE

At this point, we might be tempted to start locating all our source files. This isn't actually necessary. In a moment, we're going to tell our compiler where to look for source, but there's no reason to tell it about each individual file. The only files we actually need to collect are the test files, because those are what drive the rest of the build. This takes one line.

SRCT = $(wildcard $(PATHT)*.c)
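
Just to make that concrete: the file names below are hypothetical, but if your test directory contained them, the wildcard would expand as shown. If you ever want to see what Make actually found, GNU Make's $(info) function will print it while the makefile is being read.

# Hypothetical example: if test/ contains TestFoo.c and TestBar.c, then
#   SRCT = test/TestFoo.c test/TestBar.c
# To double-check, you could temporarily add this line after the assignment:
$(info SRCT is $(SRCT))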

THE TOOLCHAIN

The next step is to configure the toolchain we want to use. In our case, we're sticking with a very straightforward use of GCC. We could obviously get much fancier with this, and you should feel free to modify the arguments as needed for your application.

One thing worth noting: even though we are using gcc for compiling, linking, and dependency generation, we've broken them out into separate commands. This serves three purposes:

  • It's easier to change the options you pass to each command when you want to customize this.
  • It's easier to switch to a different toolchain, which will likely use different commands for these tasks.
  • It's self-documenting when we get to the rules below.

This section ends up looking something like this:

COMPILE=gcc -c
LINK=gcc
DEPEND=gcc -MM -MG -MF
CFLAGS=-I. -I$(PATHU) -I$(PATHS) -DTEST

The CFLAGS are particularly interesting. As you can see, we are adding our immediate directory, the Unity directory, and the source directory to the include path. If you have source in multiple directories, this is the place to add them. We are also defining the symbol TEST. This is very convenient to have during tests, just in case minor changes to your release code are necessary when it's built for testing (code wrapped in #ifdef TEST, for example). We recommend avoiding this in most cases, but there are going to be instances where it's your only option.
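
As a quick sketch of that first point: if your release code lived in a second directory as well, you might extend things like this. PATHS2 and src/drivers/ are made-up names for illustration, not part of the makefile we're dissecting.

# Hypothetical second source directory; adjust the name to your project:
PATHS2 = src/drivers/

CFLAGS = -I. -I$(PATHU) -I$(PATHS) -I$(PATHS2) -DTEST

You'd also want a matching object-file rule for the new directory when we get to Rule 4 below.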

RULE 1: SUMMARIZING RESULTS

Finally! We've reached the actual rules that do most of the heavy lifting. We start with the rule whose recipe actually runs last… it's our default target, and asking for it triggers the chain reaction of rules that causes everything to be built, linked, and tested. By the time all those things have happened, the only thing left for this rule to do is summarize what happened. It does this by peeking inside our results files with grep and dumping the ignored and failed test results for us.

test: $(BUILD_PATHS) $(RESULTS)
    @echo "-----------------------\nIGNORES:\n-----------------------"
    @echo `grep -s IGNORE $(PATHR)*.txt`
    @echo "-----------------------\nFAILURES:\n-----------------------"
    @echo `grep -s FAIL $(PATHR)*.txt`
    @echo "\nDONE"
    

RULE 2: CREATING RESULTS

Where do those results files come from? They are the result of running the test executables and piping the output to files. If you are running a simulator, this is where you would replace the direct execution with a call to your simulator.

RESULTS = $(patsubst $(PATHT)Test%.c,$(PATHR)Test%.txt,$(SRCT))

$(PATHR)%.txt: $(PATHB)%.$(TARGET_EXTENSION)
    -./$< > $@ 2>&1

There are a few cryptic things going on here. First, let's look at the first line. Here we take every TestSomething.c file in our test source list (SRCT) and invent a corresponding TestSomething.txt file in the results folder. This collection of results files is requested by our previous rule and will drive the rest of the process.
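
Concretely, sticking with the same hypothetical test files as before, the substitution works out like this:

# Suppose SRCT expanded to: test/TestFoo.c test/TestBar.c   (hypothetical names)
# Then the patsubst above gives us:
#   RESULTS = build/results/TestFoo.txt build/results/TestBar.txt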

Second, note that our target extension depends on the host platform we're running. A Windows machine likes .exe files while a Unix-like machine is going to be happy with .out files.

Next, what is up with that last line? It is quite possibly the most cryptic makefile line ever written, isn't it? Let's peel it apart (we'll put it back together with a concrete example after the list):

  • The ./$< means we should directly run the executable at the command line. Really. If you've played on a Unix-like system before, you're probably familiar with running applications like ./run_me. This is the same. The $< part is Make's automatic variable for the first prerequisite. Our first prerequisite is the executable, so we're telling Make to run our executable. The leading - in front of it tells Make not to abort if the command returns a non-zero exit code, which is exactly what happens whenever a test fails. So what's the rest of the line?

  • > means we should redirect the output of this command somewhere.

  • $@ tells us where. It's makefile shorthand for "the target of this rule". In this case, the target is the path and name of the results file we want to build.

  • 2>&1 finally tells the shell that we want stderr combined with stdout. This is a convenience so that everything Unity might want to tell us ends up in our results files.
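
Putting those pieces back together for one hypothetical test on a Unix-like host, the pattern rule behaves exactly as if we had written this by hand (and remember, that recipe indent needs to be a real tab in the file):

build/results/TestFoo.txt: build/TestFoo.out
    -./build/TestFoo.out > build/results/TestFoo.txt 2>&1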

RULE 3: CREATING EXECUTABLES

As you probably guessed, our executables are made up of object files. We could create a rule that derives our executables directly from C files, but by breaking the process up, we only rebuild the things that actually need rebuilding. The linking itself happens in a single rule.

$(PATHB)Test%.$(TARGET_EXTENSION): $(PATHO)Test%.o $(PATHO)%.o $(PATHO)unity.o $(PATHD)Test%.d
    $(LINK) -o $@ $^

If you kept up with the last couple of rules, you should understand most of this one. The only new piece is the ``$^`` at the end. This is makefile-speak for "all the prerequisites." So we're saying that our test executable is linked from the following: our test object file, the object built from the source file of the same name (minus the Test prefix), Unity itself, and the dependency file. We're going to pull in the additional dependencies generated by the dependency file in a moment.
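
Spelled out for a hypothetical module Foo on a Unix-like host, the pattern resolves like so:

# target ($@):        build/TestFoo.out
# prerequisites ($^): build/objs/TestFoo.o build/objs/Foo.o build/objs/unity.o build/depends/TestFoo.d
# so the recipe runs:
#   gcc -o build/TestFoo.out build/objs/TestFoo.o build/objs/Foo.o build/objs/unity.o build/depends/TestFoo.d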

RULE 4: CREATING OBJECT FILES

Next, we tell Make how to compile the C files into object files. We do this a few times because we want it to understand that there are multiple source directories it might want to pull from. This includes the test source, release source, and Unity source directories.

$(PATHO)%.o:: $(PATHT)%.c
    $(COMPILE) $(CFLAGS) $< -o $@

$(PATHO)%.o:: $(PATHS)%.c
    $(COMPILE) $(CFLAGS) $< -o $@

$(PATHO)%.o:: $(PATHU)%.c $(PATHU)%.h
    $(COMPILE) $(CFLAGS) $< -o $@ 

Notice that these rules are all double-colon rules "::" where all our previous rules were single-colon rules. What's that about?

The double colon tells Make that these rules are terminal. If Make can't find the C file one of them asks for, it won't try to chain other implicit rules together to generate it… it just moves on, and if no rule fits, it considers that a problem. We haven't included any rules for autogenerating these C files, so they should already be around somewhere.
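
Here's how Make ends up choosing among those three rules for a few hypothetical object files:

# build/objs/TestFoo.o -> test/TestFoo.c exists, so the first rule is used
# build/objs/Foo.o     -> there is no test/Foo.c, so Make falls through to the
#                         second rule and compiles src/Foo.c instead
# build/objs/unity.o   -> neither test/unity.c nor src/unity.c exists, so the
#                         third rule compiles unity/src/unity.c (and notes the
#                         dependency on unity/src/unity.h)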

RULE 5: DEPENDENCIES

Finally, the dependency files are easily created with a single rule as well.

$(PATHD)%.d:: $(PATHT)%.c
    $(DEPEND) $@ $<

Our dependency file is the first argument. The C file it's based on is passed in second. This is a good time to back up and talk about the options we passed to gcc for this call (a sample of the generated file follows the list):

  • -MM tells gcc to output header dependencies for the compiled file(s), but only for headers included with double quotes (it will exclude system headers pulled in with angle brackets).

  • -MG tells gcc that it's okay if it runs into headers that it can't find. This is going to be all of the headers, honestly, because we also (purposefully) haven't told gcc anything about our include paths. We're looking for very shallow dependency tracking: just the files that are included in the test file.

  • -MF tells gcc we want the header dependencies to be written to a file. The next argument should be the name of the file to write to, which is why we have placed the dependency file first in our rule above.
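
For a hypothetical test/TestFoo.c that includes unity.h and Foo.h, the generated build/depends/TestFoo.d would contain a single dependency line along these lines (the exact header list obviously depends on what your test file includes):

TestFoo.o: test/TestFoo.c unity.h Foo.h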

EXTRAS

At this point, we've covered all the parts that are necessary for building, running, and summarizing our results. But there are a few things we are going to want to add to make this all more useful:

DIRECTORIES

It's really handy to have our build directories created for us, so that we don't have a failed build just because we didn't set aside the place for it. This is done with this series of rules:

$(PATHB):
    $(MKDIR) $(PATHB)

$(PATHD):
    $(MKDIR) $(PATHD)

$(PATHO):
    $(MKDIR) $(PATHO)

$(PATHR):
    $(MKDIR) $(PATHR)

CLEAN

Sometimes something will go wrong with your dependency tracking… it might be related to version control. It might be an oversight in our system. In any case, it's handy to wipe things out and know you are working with a clean slate from time to time. For that, you want to be able to make clean.

clean:
    $(CLEANUP) $(PATHO)*.o
    $(CLEANUP) $(PATHB)*.$(TARGET_EXTENSION)
    $(CLEANUP) $(PATHR)*.txt

KEEPING THINGS AROUND

If you run our makefile at this point, you might notice that all the temporary files are deleted when it is finished. ALL of them. This includes our object files and dependency files. It includes our test executables. It even includes our results files! Every time we run our tests, Make is starting over on the entire process.

Clearly, this isn't going to be ideal as our codebase grows. It seems like we'd really prefer to only rebuild things that have changed… or at least only delete everything when we specify clean!

To do this, we tell Make that these intermediate files are important to us. We do that by using the built-in special target .PRECIOUS.

.PRECIOUS: $(PATHB)Test%.$(TARGET_EXTENSION)
.PRECIOUS: $(PATHD)%.d
.PRECIOUS: $(PATHO)%.o
.PRECIOUS: $(PATHR)%.txt

FINISHING TOUCH

Finally, a couple of the targets we've created don't actually produce files named after themselves, and we need to admit that to Make.

.PHONY: clean
.PHONY: test

Without this, Make assumes that ``test`` and ``clean`` are real files it's expected to build. That works right up until a file actually named ``test`` or ``clean`` shows up in your project, at which point Make decides the target is already up to date and quietly does nothing, right?

That's clearly not what we want… so we tell Make that they are phony tasks… which is sort of like an alias. It lets us name them as a starting point for what to build, without Make ever expecting files of those names to appear.

WRAPPING IT ALL UP

So this was a longer article than I intended. Hopefully it gives you enough information to build your own makefiles and get your tests running well. Best of luck!

You can get this makefile from [here].

 

This Article Originally Published on Mark's Website