
Here is a short brainstormy introduction I wrote about lazy specialization to whet your appetite: http://lukepalmer.wordpress.com/2009/07/07/emphasizing-specialization/
Quoting from your linked page:

"...The key point about data structures is that they can be decomposed; you can peel off the root of a tree and leave yourself with only one of the branches, and the other branch will be garbage collected. A lazy specializer promotes full-blown functions to the level of data structures. Functions can now be decomposed (by composing!) in the same way, sharing important pieces and forgetting unimportant ones... ...removes the encouragement to fiddle with the details of a function for more speed... ...thus arbitrary functions separate us from the enclosed Behavior, about which we must selectively forget things. However, the lazy specializer has no trouble looking through those functions, and will modify the behavior anyway, ridding us of the dreaded space leak..."

Analyzing structure (for optimizing speed and for unconflating data) at the semantic layer can also eliminate space leaks, which is an area of work I suggested here: http://www.haskell.org/pipermail/haskell-cafe/2009-November/068638.html

"...Perhaps it is possible to categorize and generalize many of the types of structures that cause space leaks, then handle them at the semantic layer..."
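To make the data-structure analogy from the quote concrete, here is a minimal sketch (my own illustration, not code from either linked page): peeling off the root of a tree drops the reference to the other branch, which can then be garbage collected, and composition is the corresponding way of building up (and, for a specializer, taking apart) functions. The names `leftBranch` and `specialize` are hypothetical.

```haskell
-- Peeling off the root of a tree keeps one branch reachable; the
-- reference to the other branch is dropped, so it can be collected.
data Tree a = Leaf a | Branch (Tree a) (Tree a)
  deriving (Eq, Show)

leftBranch :: Tree a -> Tree a
leftBranch (Branch l _) = l   -- the right subtree becomes garbage
leftBranch t            = t

-- Functions are "decomposed by composing": a lazy specializer could
-- look through a composition like this and share or discard pieces of
-- it the same way subtrees are shared or discarded.
specialize :: (b -> c) -> (a -> b) -> (a -> c)
specialize f g = f . g

main :: IO ()
main = do
  case leftBranch (Branch (Leaf 1) (Branch (Leaf 2) (Leaf 3))) of
    Leaf n -> print n
    _      -> putStrLn "branch"
  print (specialize (+1) (*2) 5)
```

Without a specializer, `specialize f g` is an opaque closure; the point of the quoted post is that a lazy specializer can see inside such compositions rather than treating them as black boxes.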
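As one example of a "category" of space-leaking structure that could be recognized and handled at the semantic layer, here is the classic lazy-accumulator leak (my illustration, not from the linked thread): a lazy left fold builds a chain of unevaluated thunks proportional to the input, while the strict variant runs in constant space.

```haskell
import Data.List (foldl')

-- Leaky: the accumulator is a growing thunk (0 + 1) + 2 + ...,
-- forced only at the very end, so heap use is O(n).
lazySum :: [Int] -> Int
lazySum = foldl (+) 0

-- Fixed: foldl' forces the accumulator at each step, so heap
-- use stays O(1).  A semantic-layer analysis could, in principle,
-- detect this pattern and apply the strictness itself.
strictSum :: [Int] -> Int
strictSum = foldl' (+) 0

main :: IO ()
main = print (strictSum [1 .. 1000000])
```

Both functions compute the same result; only their space behavior differs, which is exactly the kind of distinction a structural analysis would need to capture.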