
11 Dec 2008, 10:18 a.m.
The program below computes (f 27) almost instantly, but if I replace the definition of f with (f n = f (n - 1) * f (n - 1)) (shown again after the program for comparison), it takes around 12 s to terminate. I realize this is because the original version shares the result of the recursive call, so it only has to calculate, for example, (f 25) once instead of (I guess) four times. There is probably a good reason why this isn't caught by the compiler, but I'm interested in what it is. Anyone care to explain?
main = print (f 27)
f 0 = 1
f n = let f' = f (n-1) in f' * f'
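For comparison, this is the variant without the let-binding that I mentioned above, the one that takes around 12 s:

f 0 = 1
f n = f (n - 1) * f (n - 1)  -- two independent recursive calls, so the call tree has roughly 2^n nodes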
(compiled with ghc --make -O2)

Mattias