
Sven Panne wrote:
On Saturday, 27 August 2005 at 15:02, Seth Kurtzberg wrote:
[...] I would suggest that, while configure does solve a problem, it isn't the best way to solve the problem. A properly abstracted and layered implementation of O/S specific calls, with each environment supported by an implementation file, is much closer to "doing the right thing."
Well, I don't want to start a Jihad regarding the usefulness of autoconf, but the autoconf documentation itself contains a rather good explanation of why testing features is far superior to assuming a fixed (and probably much too small) set of platforms in advance.
I only want to point out that the autotools solve problems which go *far* beyond anything that could be achieved by writing simple abstractions for platform features: they can find out whether your compiler/linker/library/header/... has a certain bug (the autoconf macros are full of examples for every category), which version of an API (which might have changed in a non-backwards-compatible way, see e.g. OpenAL) is actually contained in a library/header, which of the dozens of (often proprietary) linker options are needed to use a certain feature, how to create and use a dynamic library, etc. etc.

Simply writing an abstraction layer would solve none of the problems mentioned above. Of course all these problems are bad and should not exist at all, but simply ignoring them means closing one's eyes to the current "state of the art" in real-life computer science. And for a casual user, these are all *hard* problems! Trying to solve them without autotools, one usually ends up re-inventing the wheel (i.e. writing autotools-like code), but probably much, much worse (see e.g. qmake).
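[Editor's note: for readers who have never looked inside a generated configure script, the "testing features" approach boils down to compiling and linking tiny throwaway C programs and recording whether each one succeeds. The following is roughly what such a probe looks like for a single check; the function strlcpy and the macro HAVE_STRLCPY are only an illustrative example, not anything specific to this thread.]

    /* conftest.c -- the sort of throwaway program a configure script
     * compiles to decide whether the C library provides strlcpy().
     * Only linkability matters here, so the prototype is deliberately
     * bogus. */
    char strlcpy(void);

    int main(void)
    {
        return (int) strlcpy();
    }

    /* If compiling and linking this succeeds, configure records
     *
     *     #define HAVE_STRLCPY 1
     *
     * in config.h, and application code branches on that macro
     * instead of guessing from the platform name. */

The same compile-and-check pattern is what lets configure probe for compiler bugs, API versions, and linker flags, which is what makes it hard to replicate with a fixed abstraction layer alone.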
[...] I do realize that this position is more or less tilting at windmills.
I'd really be happy to learn how the problems mentioned above could be solved without autotools, or without basically re-inventing autotools, seriously. I hate writing obscure lines in M4 and sh probably as much as you do, but I can't see a viable alternative. Rewriting all this stuff (plus all the utilities used in the macros!) in Haskell doesn't look very attractive or realistic...
I'd have to turn the question around. In several major projects, I've never come across a situation where any of the autoconf hacks was necessary. I wouldn't reinvent autoconf; if I needed it, I would use it. I just have never needed it.

The problem with autoconf is that you have no idea, watching it run, which of the many things it tests are actually used. It runs all the same tests for every program. I did apply a tool that automatically generates autoconf support to two large projects. It worked fine, but since the code compiled just fine without it, that doesn't really show anything one way or the other.

The code without autoconf was built on Linux, FreeBSD, NetBSD, Solaris, SunOS, SGI's UNIX, and HP's UNIX, as well as, of course, win32. Of course there was no win32-autoconf issue, because there was no autoconf at all; just three files copied. The UNIX systems tested are a subset, but a fairly large subset. It is possible that there are issues that weren't exposed by that set of UNIX environments, but I haven't had any reports of this. The interface files were about 200 lines in total. The differences among them were minor, but there were differences.
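[Editor's note: to make the interface-file idea concrete, here is a minimal sketch of what such a layer might look like in C. The file and function names (os_iface.h, os_sleep_ms, and so on) are invented for illustration and are not taken from the projects described above.]

    /* os_iface.h -- the single interface every platform file implements.
     * Each supported system gets its own small implementation file
     * (os_posix.c, os_win32.c, ...), and the build simply compiles the
     * one matching the target; no configure step is involved. */
    #ifndef OS_IFACE_H
    #define OS_IFACE_H

    void os_sleep_ms(unsigned long ms);     /* millisecond sleep        */
    int  os_file_exists(const char *path);  /* 1 if path exists, else 0 */

    #endif /* OS_IFACE_H */

    /* os_posix.c -- the implementation used on the various UNIX systems. */
    #include <time.h>
    #include <unistd.h>
    #include "os_iface.h"

    void os_sleep_ms(unsigned long ms)
    {
        struct timespec ts = { (time_t)(ms / 1000),
                               (long)(ms % 1000) * 1000000L };
        nanosleep(&ts, NULL);
    }

    int os_file_exists(const char *path)
    {
        return access(path, F_OK) == 0;
    }

The attraction of this approach is that whatever differs between systems is confined to one small, readable file per platform; the cost, as Sven points out above, is that it only covers differences you already know about at the level of calls, not compiler or linker quirks discovered at build time.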
Cheers, S.