
On Thursday, 4 February 2010 at 16:32:24, Enrique Martín wrote:
Hello all,
A few days ago I ran some experiments with Haskell type classes. I wrote a small Haskell program for searching in sorted lists, defining my own type classes for equality (MyEq) and order (MyOrd) so that each of them has only one member function:
<snip code>
I ran some tests in GHC 6.8.2 and noticed that the original program using type classes runs noticeably faster than the translated program. For example, reducing the expression search (S Z) (replicate 1000000 Z) takes 2.07 seconds in the original program, whereas the translated expression search dictMyOrdNat (S Z) (replicate 1000000 Z) takes 3.10 seconds, about one second more.
Surprised by the results, I repeated the test in Hugs (Sept. 2006) and noticed that the difference was much smaller:

search (S Z) (replicate 100000 Z)
  --> (2100051 reductions, 2798068 cells, 2 garbage collections)
search dictMyOrdNat (S Z) (replicate 100000 Z)
  --> (2200051 reductions, 2898067 cells, 3 garbage collections)
My first idea was that type classes were implemented using the dictionary-passing approach, but the test suggested to me that this is not the case (at least not in GHC).
Dictionary passing is in fact the approach GHC uses (you can see it by looking at the Core you get with the flag -ddump-simpl). The point is that you ran the code interpreted. Dictionary passing for type classes is baked into the compiler, and GHC handles it rather well even in interpreted code, whereas if you implement it yourself you get ordinary function calls, probably looking up the comparison function on each iteration. The difference disappears if you compile the code, at least with optimisations.
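To make the comparison concrete, here is a minimal sketch of the two versions being discussed. The class and method names (MyEq, MyOrd, dictMyOrdNat) come from the original mail, but the member functions, the Nat type, and the search logic are assumptions, since the actual code was snipped; the original program may well differ.

```haskell
data Nat = Z | S Nat

-- One-method classes, as described in the mail.
class MyEq a where
  myeq :: a -> a -> Bool

class MyEq a => MyOrd a where
  myle :: a -> a -> Bool

instance MyEq Nat where
  myeq Z     Z     = True
  myeq (S x) (S y) = myeq x y
  myeq _     _     = False

instance MyOrd Nat where
  myle Z     _     = True
  myle (S x) (S y) = myle x y
  myle _     _     = False

-- Class-based version: GHC inserts the dictionary argument itself.
search :: MyOrd a => a -> [a] -> Bool
search _ []     = False
search x (y:ys)
  | myeq x y  = True
  | myle x y  = False      -- the list is sorted, so we can stop early
  | otherwise = search x ys

-- Hand-translated version: the dictionary becomes an explicit record
-- of the class methods, passed as an ordinary function argument.
data DictMyOrd a = DictMyOrd
  { dictEq :: a -> a -> Bool
  , dictLe :: a -> a -> Bool
  }

dictMyOrdNat :: DictMyOrd Nat
dictMyOrdNat = DictMyOrd myeq myle

searchD :: DictMyOrd a -> a -> [a] -> Bool
searchD _ _ []     = False
searchD d x (y:ys)
  | dictEq d x y = True
  | dictLe d x y = False
  | otherwise    = searchD d x ys
```

Both versions scan the whole list when searching for S Z in replicate n Z, which is what makes the benchmark measure dictionary overhead rather than the search itself.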
Then I discovered the paper "Implementing Haskell overloading" (Augustsson, 1993), in which he describes some ways to improve the speed of Haskell overloading.
So my questions are: 1) Is the speed-up obtained only by applying the optimisations from Augustsson's paper? 2) Could anyone tell me where I can find the translation of type classes that GHC and Hugs use?
Thank you very much,
Enrique M.