Based on your stated background, the best start would be the (longer)
paper on the Spineless Tagless G-machine [1]. It describes how graph
reduction is actually implemented efficiently. Since then there have
been two major changes to this basic implementation: the use of
eval/apply (a different calling convention) [2] and constructor
tagging [3], which drastically reduces the number of indirect branches
compared with the original STG approach.
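To make the first of those concrete: the point of all that STG-level
machinery is to make building, entering and updating thunks cheap.
Here's a tiny, self-contained Haskell example of the behaviour it
implements (my own illustration, not anything from the papers;
Debug.Trace is only there to make the single evaluation visible):

import Debug.Trace (trace)

main :: IO ()
main = do
  -- 'x' is one heap-allocated thunk shared by both uses below. Graph
  -- reduction updates it with its value after the first force, so
  -- "forcing x" is printed only once.
  let x = trace "forcing x" (sum [1 .. 1000000 :: Int])
  print (x + 1)
  print (x * 2)

How to get that sharing-and-update behaviour without paying much for
the bookkeeping is essentially what [1]-[3] are concerned with.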
In addition to this fairly low-level stuff, there are very powerful
optimisations performed at a higher level. For a taste, see stream
fusion [4].
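To give a flavour of what stream fusion looks like, here's a
stripped-down sketch of the technique from [4]. I've simplified the
names, left out the Skip constructor, and omitted the GHC rewrite
rules the real library depends on, so treat it as an illustration
rather than the actual implementation:

{-# LANGUAGE ExistentialQuantification #-}

-- One step of a stream: either finished, or an element plus new state.
data Step s a = Done | Yield a s

-- A stream is a step function together with a (hidden) state type.
data Stream a = forall s. Stream (s -> Step s a) s

-- Convert a list to a stream...
stream :: [a] -> Stream a
stream xs0 = Stream next xs0
  where
    next []       = Done
    next (x : xs) = Yield x xs

-- ...and back again.
unstream :: Stream a -> [a]
unstream (Stream next s0) = go s0
  where
    go s = case next s of
             Done       -> []
             Yield x s' -> x : go s'

-- Map over a stream. Crucially this is non-recursive: it just wraps
-- the step function, so composed maps become a single step function.
mapS :: (a -> b) -> Stream a -> Stream b
mapS f (Stream next s0) = Stream next' s0
  where
    next' s = case next s of
                Done       -> Done
                Yield x s' -> Yield (f x) s'

mapL :: (a -> b) -> [a] -> [b]
mapL f = unstream . mapS f . stream

main :: IO ()
main = print (mapL (+ 1) (mapL (* 2) [1 .. 10 :: Int]))

In the real library a rewrite rule removes the inner unstream/stream
round trip in pipelines like the one in main, so the whole thing
compiles down to a single loop over the stream state with no
intermediate list.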
If you're done with these, feel free to ask for more. :)
[1]: http://research.microsoft.com/apps/pubs/default.aspx?id=67083
[2]: http://research.microsoft.com/en-us/um/people/simonpj/papers/eval-apply/
[3]: http://research.microsoft.com/en-us/um/people/simonpj/papers/ptr-tag/index.h...
[4]: http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.104.7401
On 10 January 2012 17:25, Steve Horne wrote:
Although I'm far from being an expert Haskell programmer, I think I'm ready to look into some of the details of how it's compiled. I've a copy of Modern Compiler Design (Grune, Bal, Jacobs and Langendoen) - I first learned a lot of lexical and parsing stuff from it quite a few years ago. Today, I started looking through the functional languages section - I've read it before, but never absorbed much of it.
Graph reduction, lambda lifting, etc. - it seems pretty simple. Far too simple. It's hard to believe that decent performance is possible if all the work is done by a run-time graph reduction engine.
Simon Peyton Jones has written a couple of books on implementing functional languages, which are available for free download. At a glance, they seem to cover similar topics in much more detail. However, they're from 1987 and 1992. Considering SPJ's period "of despair" when he couldn't get practical performance for monadic I/O, these seem very dated.
Some time ago, I made a note to look up the book "Functional Programming and Parallel Graph Rewriting" (I forget why), but again that's from the early '90s. I've also got a note to look up Urban Boquist's thesis.
SPJ also has some papers on compilation - http://research.microsoft.com/en-us/um/people/simonpj/papers/papers.html#com... - and the papers on optimisation by program transformation have caught my eye.
Are there any current textbooks that describe the techniques used by compilers like GHC to generate efficient code from functional languages? It's OK to assume some knowledge of basic compiler theory - the important stuff is code generation and optimisation techniques for lazy functional languages in particular.
Also, what papers should I read? Am I on the right lines with the ones I've mentioned above?
-- Push the envelope. Watch it bend.