
I'd like to add another note: when we talk about the complexity
of a function, we mean the time taken to evaluate the result completely.
Otherwise every expression in Haskell would be O(1), since it just
creates a thunk.
And then the user of the program is to blame for running it,
since that is what forced evaluation of those thunks :)
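For instance (a small sketch, the identifiers are mine): building a left-nested
chain of (++) is cheap, because it only allocates a thunk; the quadratic cost
shows up only when something demands the result.

```haskell
-- Building this expression is O(1): nothing is concatenated yet,
-- we only allocate a thunk describing the pending work.
chain :: String
chain = foldl (++) "" (replicate 1000 "x")  -- left-nested (++), quadratic to force

main :: IO ()
main = do
  -- Only now, when 'length' demands the whole list, do the
  -- reduction steps for the nested (++) actually run.
  print (length chain)
```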
Abhay
2008/5/31 Martin Geisler:

> Hi! (Cool, another guy from DAIMI on haskell-cafe!)
>
> Tillmann Rendel writes:
>> Another (n - 1) reduction steps for the second ++ to go away:
>>
>>   last ("o" ++ "l")  A) ~>  last ('o' : ("" ++ "l"))
>>                      L) ~>  last ("" ++ "l")
>>                      A) ~>  last "l"
>>                      L) ~>  'l'
>>
>> And the third and fourth ++ go away with (n - 2) and (n - 3) reduction
>> steps. Counting together, we had to use
>>
>>   n + (n - 1) + (n - 2) + ... = n!
>>
>> reduction steps to get rid of the n calls to ++, which lies in O(n^2).
>> That's what we expected of course, since we know that each of the ++
>> would need O(n) steps.
> I really liked the excellent and very clear analysis! But the last
> formula should be:
>
>   n + (n - 1) + (n - 2) + ... = n * (n + 1) / 2
>
> which is still of order n^2.
> -- Martin Geisler
>
> VIFF (Virtual Ideal Functionality Framework) brings easy and efficient
> SMPC (Secure Multi-Party Computation) to Python. See: http://viff.dk/.
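The corrected closed form is easy to sanity-check in Haskell (a sketch; the
identifiers are mine, and stepsBySum just sums the per-call costs directly
rather than counting real reduction steps):

```haskell
-- Direct sum of the per-call costs 1 + 2 + ... + n
stepsBySum :: Int -> Int
stepsBySum n = sum [1 .. n]

-- Martin's closed form: n * (n + 1) / 2
stepsClosed :: Int -> Int
stepsClosed n = n * (n + 1) `div` 2

main :: IO ()
main = print (all (\n -> stepsBySum n == stepsClosed n) [0 .. 100])
```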
_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe