
Hi Malcolm,

On 2017-11-18 at 11:09:28 +0000, Malcolm Wallace wrote:

[...]

> But surely the timing for a full build from scratch is not the most important thing to compare? In my work environment, full builds are extremely rare; the common case is an incremental build after pulling changes from upstream. Is this something you can measure?
If Hadrian is more exact about its dependency tracking, I'd expect something close to a full rebuild when I `git pull` from GHC HEAD: the Git commit, and the snapshot version we infer from it, pervades most build artifacts of GHC, and in general you should boot & configure every time you `git pull` unless you're sure it won't matter. You'd have to suppress/mask this logic if you want to avoid full rebuilds.

Also, I'm not sure how Hadrian tracks itself as a dependency (NB: the GNU Make build system doesn't). When I used Shake myself, I remember that meta-depending on the rules themselves wasn't trivial; the simplest way out was to introduce very coarse global versioning over all rules (maintained either manually or by hashing), where bumping the version invalidates the entire build (see the first sketch at the end of this mail). But it's been some time since I did that, so I may be wrong here.

However, a benchmark that I think would better match the usual GHC developer workflow would be to see how well Hadrian manages to minimize the work needed to rebuild GHC after editing a source file in GHC's source tree without changing the Git commit, as that's what matters most to me when I'm actively working on a GHC patch. I'll try to measure/compare this with the next GHC patch I hack on (second sketch below).
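
For reference, here's roughly what I mean by coarse global versioning in Shake: ShakeOptions has a `shakeVersion` field, and changing it invalidates everything Shake has recorded, forcing a full rebuild. A minimal sketch (the rule below is made up; only `shakeVersion` matters here):

    import Development.Shake

    main :: IO ()
    main = shakeArgs shakeOptions{ shakeVersion = "42" } $ do
        -- Bumping "42" (manually, or by computing it from a hash of
        -- the build system's own sources) invalidates Shake's database
        -- and thereby forces a full rebuild.
        want ["out.txt"]
        "out.txt" %> \out -> copyFile' "in.txt" out

Deriving that string from a hash of the rules' sources gives you automatic, if very blunt, self-tracking; anything finer-grained required real work last time I looked.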
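
And the measurement I have in mind would look something like the following, run in two otherwise identical trees (the module I touch and the exact Hadrian invocation here are just examples, not gospel):

    # tree built with the GNU Make build system:
    $ touch compiler/main/DynFlags.hs    # simulate editing a module
    $ time make -j8

    # tree built with Hadrian (invocation may differ):
    $ touch compiler/main/DynFlags.hs
    $ time hadrian/build.sh -j8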