> Perhaps your instances will work correctly with this data declaration?
Perhaps it might. But that misses an important point. The biggest impediment to developing large, robust applications in Haskell is the opacity of its performance model. Haskell is fantastic in very many ways, but this is a really serious difficulty: I can make a seemingly slight change to my program and the performance changes dramatically. What's worse, the connection between the cause of the blowup and the place where it is observed can often be quite subtle[*].

There's a classic example of two one-line Haskell programs, one of which uses O(1) stack space and the other O(n) stack space, even though they compute the same result; they are so similar that you have to stare at them for five minutes before you can spot the difference.

Hughes' "Why Functional Programming Matters" argues [rightly] that lazy FP provides a better "glue", allowing greater abstraction at the semantic level. The flip side, which IIRC he doesn't mention, is the opacity of the performance model.

Here's a question for the experts: what generalizations can I make about the performance of lazy functions under composition? In particular, if all my individual functions are well behaved, will the program as a whole be well behaved?

cheers,
Tom

[*] Gosh, this is beginning to sound like a diatribe on the evils of pointers and manual memory management in C. Interesting....
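[A sketch of the kind of pair described above. The exact programs aren't specified in the post; the usual textbook instance of this phenomenon is foldl versus foldl', which differ by a single character. foldl builds a chain of unevaluated (+) thunks whose eventual evaluation needs space linear in the list length, while foldl' forces the accumulator at each step and runs in constant space.]

```haskell
import Data.List (foldl')

-- Lazy left fold: accumulates a thunk (((0+1)+2)+3)+...,
-- which needs O(n) space to evaluate at the end.
sumLazy :: [Integer] -> Integer
sumLazy = foldl (+) 0

-- Strict left fold: forces the running total at each step,
-- so it runs in O(1) space.
sumStrict :: [Integer] -> Integer
sumStrict = foldl' (+) 0
```

Both compute the same sum; on a large enough input, sumLazy can blow the stack (or, with default GHC settings, balloon the heap) while sumStrict chugs along happily.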