"""For a start, fabricate won't say "* missing separator" if you use spaces instead of tabs.""" - No, but it will say "IndentationError: unindent does not match any outer indentation level" if you mess up Python's indentation :-)
You don't need to enter dependencies by hand with 'make' either (suffix rules or makedepend). Don't get me wrong, fabricate looks great, but let's not misrepresent the power of make in the right hands. From the docs:
files.o : files.c defs.h buffer.h command.h
cc -c files.c
I don't know a programmer beyond the introductory level that still writes this in their make file. :-p
Right. Make already knows how to convert .c to .o, all you need to say is
files.o: defs.h buffer.h command.h
and there are tools such as makedepend (for C and OCaml, off the top of my head) that generate those listings automatically, so you just add this to the Makefile:
include (autogenerated dependency listing file)
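Concretely, a hedged sketch of that setup, using makedepend's -f flag to keep the generated rules in a separate file (file names here are illustrative, not from any real project):

```makefile
# Sketch of the makedepend approach (illustrative file names).
# `makedepend -f .depend` appends the generated dependency rules to
# .depend instead of editing the Makefile in place; the file must
# already exist, hence the touch.
SRCS = files.c main.c

depend:
	touch .depend
	makedepend -f .depend -- $(CFLAGS) -- $(SRCS)

# GNU make's -include silently skips .depend until it's been generated
-include .depend
```

Run `make depend` once (and after adding #includes), and the header dependencies stay out of your hand-written rules.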
While the mandatory tab in Make syntax is a genuine wart, make has been around long enough that there are several other tools that know how to work with it.
(Now, adding that scan for file accesses to Make as a warning might not be a bad idea: "WARNING: You forgot this dependency")
> make has been around long enough that there are several other tools that know how to work with it.
I'd prefer an improved from-scratch 2nd-generation tool instead of using one that simply wraps around a complicated 1st-gen one that I'd rather not touch.
Make has more warts than just the tab characters. Lots of obscurely-named variables, for starters.
You might want to check out tup. It uses the dependency tree in a fundamentally different way from make, in order to improve build times. http://gittup.org/tup/
> [complicated tool ...] Lots of obscurely-named variables, for starters.
Are you talking about make or just GNU make? The $< $* $$@ stuff comes from gmake. Make itself is pretty simple. Here's a good tutorial: http://www.freebsd.org/doc/en/books/pmake/
It looks like they hardcoded gcc as the compiler in the example. What about porting a program to a platform that doesn't have gcc (or calls it differently), or that needs extra compilation flags, etc.?
A lot of the seemingly esoteric stuff in Makefiles is there to make porting feasible (and the rest is autogenerated by autoconf, automake, etc.): not just which files an operation depends on, but e.g. the options for building shared libraries correctly on a given platform, paths for libraries, and so on. While the build system will remember what it found as dependencies, it still needs to be told what to find the first time. (It will note header files as dependencies, though.)
I have a machine with two quad-core CPUs, I want to run 8 parallel jobs at a time!
Most of these make replacements start with design decisions (like using Python) that pretty much rule out ever matching what make does best, parallel builds, which are essential if I'm to cut my build times by at least a factor of 6.
Using a tool (like CMake) to generate your Makefiles is an excellent solution; heck, even a Python script that generated a Makefile and then ran make would be preferable to this tool.
Using Python does not prevent parallel execution of tasks. It's just the build task graph that gets created by a single process; the build system can then use all the available CPUs to actually execute it. See e.g. the -j flag to SCons: http://www.scons.org/doc/production/HTML/scons-user.html#AEN....
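To make the point concrete, here's a minimal sketch (not SCons's actual implementation) of executing a dependency graph on a thread pool: the graph is built by one process, but any task whose prerequisites are done can be dispatched to a free worker. The task names are made up for illustration.

```python
# Sketch: parallel execution of a build-task graph built by one process.
from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait

def run_parallel(deps, run_task, jobs=8):
    """deps maps task -> set of prerequisite tasks (assumed acyclic)."""
    remaining = {t: set(d) for t, d in deps.items()}
    done, futures, order = set(), {}, []
    with ThreadPoolExecutor(max_workers=jobs) as pool:
        while remaining or futures:
            # launch every task whose prerequisites have all finished
            ready = [t for t, d in remaining.items() if d <= done]
            for t in ready:
                del remaining[t]
                futures[pool.submit(run_task, t)] = t
            finished, _ = wait(futures, return_when=FIRST_COMPLETED)
            for f in finished:
                t = futures.pop(f)
                f.result()          # re-raise any build error
                done.add(t)
                order.append(t)
    return order

# toy graph: main.o and util.o can compile concurrently, the link waits
graph = {"main.o": set(), "util.o": set(), "prog": {"main.o", "util.o"}}
order = run_parallel(graph, lambda t: None, jobs=4)
```

The point is only that the graph, once known, parallelizes fine regardless of the language that built it.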
As I said, generating a Makefile (i.e. the dependency graph) from Python would be fine, which conceptually sounds like what SCons does (from your comment).
But look at the build file example on the fabricate page, it is a user supplied build file with a build function which calls a compile function which loops over sources. Having the fabricate system take input like that and do parallel execution is as hard as solving the halting problem.
Fair call. So far that hasn't been a problem for us -- but if you want to make it one, please donate us some cores! :-)
Seriously, as vr said, Python's not stopping us. And I imagine it wouldn't be too hard to add code to spawn off a few background tasks based on the dependency list.
We've been using memoize to build stuff for a while now, but we needed something that worked under Windows too. So here's our "fabricate" build tool. It's BSD licensed -- enjoy!
Does it scan your entire filesystem for atimes so as not to miss the .h files over in /usr/include, and if not, how does it know which directories need scanning?
Yes, via access times and strace: "When this build stage ran, it checked these files, so remember them as dependencies." Not magic at all, and should automatically work for languages that haven't even been invented yet.
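As I understand the fabricate page, the mechanism is roughly: run the command under strace, then pick the successfully opened files out of the trace. A hedged Python sketch of just the parsing half, run on a canned trace so it doesn't need strace installed (real strace output also has openat(AT_FDCWD, ...) forms, which this toy regex ignores):

```python
# Sketch: extract dependency file names from strace-style lines like
#   open("defs.h", O_RDONLY) = 3
# A return value of -1 means the open failed, so that file is skipped.
# The sample trace below is fabricated for illustration.
import re

OPEN_RE = re.compile(r'\bopen\("([^"]+)"[^)]*\)\s*=\s*(-?\d+)')

def deps_from_trace(trace_text):
    deps = set()
    for line in trace_text.splitlines():
        m = OPEN_RE.search(line)
        if m and int(m.group(2)) >= 0:   # fd >= 0: the open succeeded
            deps.add(m.group(1))
    return deps

sample = '''open("files.c", O_RDONLY) = 3
open("defs.h", O_RDONLY) = 4
open("missing.h", O_RDONLY) = -1 ENOENT (No such file or directory)'''
print(sorted(deps_from_trace(sample)))  # files.c and defs.h, not missing.h
```

Which is why it's language-agnostic: the kernel sees the opens no matter what compiler made them.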
I've hacked it to print the actual file names on CreateFile() calls, but it's failing because it doesn't recurse and trace sub-processes. It should be theoretically possible by hooking CreateProcess(), but it's probably hard.
It uses strace, a Linux-specific system call tracer, to see which files each command opens. It's not available on BSD or (IIRC) OS X, though ktrace is, so it could conceivably be ported. (The page says something vague about cygwin having a comparable hook to strace. I haven't developed for cygwin, though.)
It looks like it checks for files modified specifically by its subprocesses.
Am I missing something? That example is about three times longer than an equivalent Makefile would need to be, and I need to know Python, obviously. For a trivial project, I guess Make would be easier to use, and for something more complex, I think I would have to do quite a lot of programming with fabricate…
I don't get it either. This is the second build system written in Python I've come across (the other being SCons), and both are utter failures.
Make has its quirks, lots of them. But it wouldn't be the most popular build-tool in the world if it was all wrong. So, my hint to the next guy trying to reinvent it: Take a very close look at Make first and then improve on it. Don't start in a clean-room because then you're doomed to repeat build-system history, including all the past mistakes (edit: unless you're Linus Torvalds, perhaps).
Also the build-system is an area where a sane DSL makes perfect sense. Don't make me write procedural code by default. I don't want to see lines like the following in my build-scripts:
What's missing, in my opinion, is 'build by convention'. Most things that you want to build have recipes that are mostly the same. You have some sources that you want to compile and then link against some libraries, using some platform-specific compiler and linker flags.
So, instead of writing a program to do the build I would rather see a simple DSL that allows me to specify just the minimal amount of configuration to get things going.
For example building the example on the fabricate wiki page, should not be more complicated than:
:program programname : program.c utils.c
(This is actually an a-a-p recipe that will do the same. Throw this in a main.aap file and you get everything that fabricate does, build, dependency checks, clean, etc.)
I don't mind scripting the build when things get more complex. But only then, and only when there really is no simple built-in alternative.
Fabricate seems to be more like ant in the Java world, where you specify in detail what your build should do. Most people have agreed by now that this is a waste of time and instead use Maven, which works entirely on simple conventions and standard rules.
Well, our experience says that as soon as you step outside that "minimal amount" you have to write all the build steps explicitly anyway, so we're specifically avoiding implicit rules. And we end up being explicit pretty quickly in all our projects -- maybe more than some folks because we're embedded developers and tend to need full control, but even so.
To be fair, the equivalent Makefile with autodeps generation (using gcc's -M options) is about the same length.
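For the curious, a sketch of what that Makefile might look like (file names invented; -MMD asks gcc to write a .d dependency file per object as a side effect of compilation, and -MP adds phony header targets so a deleted header doesn't break the build):

```makefile
# Sketch: autodeps via gcc's -M family of options (illustrative names).
SRCS   := program.c util.c
OBJS   := $(SRCS:.c=.o)
CFLAGS += -MMD -MP

program: $(OBJS)
	$(CC) -o $@ $(OBJS)

# pull in the per-object .d files once they exist
-include $(OBJS:.o=.d)
```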
However, as soon as you start adding more complex stuff, fabricate gets much easier to work with and maintain. Partly it's Make's $< $@ stuff, but also that you're forced into a targets-are-files mould, and that you don't have a real programming language to work in.
I wouldn't claim it's the full story, but IMO part of what's fundamentally broken is the phenomenon of single-language programmers. This leads to the belief that your entire tool chain needs to be written in a single language. So now every language has its own build tool (or 2, or 3), and god help you if you have a multi-language codebase and need a unified build system.
I think it's just down to the popularity of make. People who are starting new projects use make because "everybody's using it, it must be the best!" Then, when make becomes a pain in the butt, they write their own because they've already decided that there's nothing better out there.
I wish someone would pick up http://a-a-p.sourceforge.net/ again. It is a great build tool written by the Vim author. I've used it to build complex projects in the past with a minimal amount of aap configuration. It is much less verbose than fabricate.
The problem with this and other Python build tools is that they are a huge pain to use on older systems that either don't have a recent enough version of Python installed or don't have Python installed at all. It's just not as portable as Make in that respect.
The central problem with these sorts of "make replacements" is they miss out on the whole thing that makes make useful. I care not a whit about implicit rules, and I would welcome a real language over GNU make's function apparatus (filenames with spaces in them tend to get split in the functions, due to the fundamentally macro style), but without pattern matching rules and a real dependency engine I won't even think once about it. I use both far, far too often, in almost every makefile I've ever written.
gchpaco, could you give an example of a situation where you'd use a real dependency engine, which could not be solved (at least not easily) by using fabricate.py? I know such cases exist in theory, but have never run across a need, so I'm interested.
Since fabricate makes Python, a full language, available, it's not that there are things that cannot be expressed; it's really about ease. So I'll tell you about something I did when I was a grad student.
I was using an up to date TeX install. My advisor wasn't. To placate him, I had to copy all of my packages, and all the packages they referred to, and so on down the line so that he could compile my papers. I did this entirely within make + some small shell scripts. The fact that I could write scanners for %.tex and %.sty and %.cls and have GNU make (obviously) automagically use them to build the dependency scripts I instructed it to include was immensely convenient. This is not impossible to do procedurally, but it was inconvenient, and the all-make solution had the attractive side effect that whenever there were bugfixes to the packages they got automatically included in what I was doing.
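For anyone who hasn't seen the trick, a sketch of what such a scanner pattern rule looks like (names are illustrative; `scan-tex-deps` is a hypothetical script that greps a .tex file for \usepackage and \input and prints a make dependency line):

```makefile
# Sketch of a pattern-rule dependency scanner for TeX (hypothetical
# scan-tex-deps script; real setups are more involved).
%.d: %.tex
	./scan-tex-deps $< > $@

# make rebuilds paper.d via the rule above whenever paper.tex changes
-include paper.d
```

The nice part is that make itself notices when a .d file is out of date, regenerates it with your scanner, and restarts with the fresh dependencies.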
A better example, come to think of it, is the typical LaTeX woe; you can't build the final .dvi until you have accurate page numbers, references, tables of contents, indexes. You get all of those by--building the .dvi. So there's a strange circular dependency here. Make doesn't deal with this especially well, although it can be done. Ant and SCons don't even try.
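One generic way to handle that circularity (a sketch under assumptions, not anything make or fabricate actually does) is to treat it as a fixed point: rerun the compile step until the auxiliary outputs stop changing. The `run_once` callable below is a hypothetical stand-in for one latex invocation returning its .aux/.toc state.

```python
# Sketch: rerun a build step until its auxiliary output reaches a
# fixed point, as LaTeX's page numbers and references require.
def build_to_fixpoint(run_once, max_runs=5):
    prev_aux = None
    for i in range(1, max_runs + 1):
        aux = run_once()        # returns the new auxiliary state
        if aux == prev_aux:     # references/page numbers stabilized
            return i            # number of runs it took
        prev_aux = aux
    raise RuntimeError("did not converge; references may be unstable")

# toy stand-in: the aux state stops changing after the third run
states = iter(["pass1", "pass2", "final", "final", "final"])
runs = build_to_fixpoint(lambda: next(states))
```

Tools like latexmk work on this principle; a build system with a real language underneath can express the loop directly, which a pure file-timestamp dependency engine cannot.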