
Neil Mitchell wrote:
The problem is that something like GHC is very complex, with lots of transformations. When transformations fire other transformations, which in turn fire yet more transformations, it doesn't take a great deal to disrupt this flow of optimisation and end up with a result that doesn't accurately reflect the particular change you made. Better knowledge of a change's end effect on a program isn't necessarily going to translate into better knowledge of the optimisation's effect.
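To make the "transformations firing transformations" point concrete, here is a small, hypothetical sketch of a GHC RULES pragma (the module and function names are made up, and GHC's real list fusion in base is built on foldr/build rules rather than a toy map/map rule). The fusion rule only applies once inlining has exposed the pattern it matches, so a seemingly unrelated change to inlining behaviour can silently switch it off.

```haskell
module MapFusion where

-- A toy rewrite rule in the style GHC uses internally:
-- fuse two adjacent maps into a single traversal.
{-# RULES
"map/map" forall f g xs. map f (map g xs) = map (f . g) xs
  #-}

-- The rule can only fire on 'pipeline' if GHC first inlines 'step',
-- exposing the literal pattern "map f (map g xs)" in Core. Replace the
-- INLINE pragma with NOINLINE (or make 'step' too big to inline) and
-- the rule silently stops firing: one transformation gating another.
step :: [Int] -> [Int]
step = map (+ 1)
{-# INLINE step #-}

pipeline :: [Int] -> [Int]
pipeline xs = map (* 2) (step xs)
```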
Maybe if we had a greater number and variety of optimising compilers we'd be able to experiment more freely with optimisation techniques in different settings. At the moment GHC is all there is (with Jhc not yet ready for mainstream use).
Thanks
Neil
Although there may not be many optimizing Haskell compilers, there are compilers for languages similar to Haskell that consistently perform well. One could point to OCaml or others in the ML family, or, even more interesting, to Clean, whose syntax borrows heavily from Haskell. What do the Clean folks do that has made their compiler so consistently competitive? Is it the ABC machine? Frankly I'm amazed that a three-stack virtual machine can be translated into such efficient machine code on a register-centric CPU architecture. Can Haskell compiler writers learn something from this? Supposedly someone is working on a Haskell compiler that will use the Clean compiler back end. I can't believe that Clean is so fundamentally different, even with uniqueness types, that it has an edge in compiler optimization.
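To make the uniqueness-types point a bit more concrete: Clean can compile a pure array update into an in-place write because the type system guarantees the array has exactly one reference. The nearest Haskell analogue is opting into mutation explicitly via ST. The sketch below (using the standard array package, names chosen for illustration) shows that analogue; it is not a claim about how the Clean ABC back end actually generates code.

```haskell
import Data.Array.ST (STUArray, newArray, readArray, writeArray, runSTUArray)
import Data.Array.Unboxed (UArray)

-- Rough Haskell analogue of what Clean's uniqueness types permit:
-- because a uniquely-typed array provably has one reference, Clean can
-- turn a pure "update" into an in-place write. In Haskell we opt in to
-- the same machine behaviour explicitly with ST and a mutable unboxed array.
incrementAll :: Int -> UArray Int Int
incrementAll n = runSTUArray $ do
  arr <- newArray (0, n - 1) 0            -- allocate once
  let go i
        | i >= n    = return arr
        | otherwise = do
            x <- readArray arr i
            writeArray arr i (x + 1)      -- destructive update, no copy
            go (i + 1)
  go 0
```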