
At 19:50 04/06/03 -0400, Derek Elkins wrote:
foldl (++) [] [ combinations n as | n <- intRange 1 (length as) ]
*cries out in pain and horror* fold_l_ (++) over combinatorially large lists! (++) has gotten a reputation for being slow. (++) isn't slow in and of itself, even using it a lot isn't slow, what -is- slow is using it left associatively. What happens then is that (a++b)++c builds a copy of a then tacks on b, then it builds a copy of a++b and tacks on c. In this case we've copied a twice when we should have only copied it once. Obviously for ((a++b)++c)++d it'll be copied three times and b twice and so forth. To add insult to injury, there is already a standard function that does what you want, concat, which is defined as foldr (++) [] in the report. In fact, you could rewrite the whole thing as concatMap (flip combinations as) [1..length as]. A list comprehension with only one source and no filters is the same as a map.
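For concreteness, here is a small sketch of that rewrite; the combinations definition below is only an assumed stand-in with the signature the original code implies, not the function from Graham's message:

  -- assumed stand-in: all n-element sublists of a list, in order
  combinations :: Int -> [a] -> [[a]]
  combinations 0 _      = [[]]
  combinations _ []     = []
  combinations n (x:xs) = map (x:) (combinations (n-1) xs) ++ combinations n xs

  -- left-associated (++): every step re-copies everything accumulated so far
  allSublistsSlow :: [a] -> [[a]]
  allSublistsSlow as = foldl (++) [] [ combinations n as | n <- [1 .. length as] ]

  -- right-associated via concatMap (concat is foldr (++) [] in the Report);
  -- a one-source, no-filter comprehension is just a map
  allSublists :: [a] -> [[a]]
  allSublists as = concatMap (flip combinations as) [1 .. length as]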
Thank you. I stand duly instructed... this commentary was the kind of feedback I was hoping to draw, though I did not mean to cause so much anguish :-)
You can also write [1..length as] rather than use the intRange function, which looks prettier. :-)
I agree with Liyang that [1..] is much prettier. That's one idiom I've yet to absorb. (I came to functional programming thinking that it was fundamentally simpler than conventional languages -- none of those complicated control structures to worry about, just expressions -- but the syntactic richness of Haskell seems to go well beyond the conventional languages that I have used.)
Indeed, I think I've used length all of 3 times.
This is an interesting comment. I've found myself using length quite often, even when it feels not-quite-right to me. Maybe it's that I'm not yet used to designing algorithms functional-style, or that there are alternative idioms I'm overlooking. I'm not sure what I'm missing here, so this is a vague probe for further insight.
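Is it the sort of thing below that I should be reaching for instead? (These helper names are just made up for illustration.)

  -- take (length xs) ys, without ever computing length
  takeMatching :: [b] -> [a] -> [a]
  takeMatching xs ys = zipWith (\_ y -> y) xs ys

  -- length xs > n, without walking the whole list
  longerThan :: Int -> [a] -> Bool
  longerThan n xs = not (null (drop n xs))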
You (Graham) also have some parentheses issues; e.g. in foo ++ (combinations 5 l) the parentheses are superfluous.
I'm tempted to argue that being superfluous doesn't mean they shouldn't be there. This isn't just a functional programming issue... I find that when there are many levels of operator precedence it's easier to be explicit than to try and remember what they all are -- as much for reading the code as writing it in the first place. But maybe I'm still reading functional code in the wrong way? (I still scratch my head over some of the prelude/library functions, though it's getting easier.)
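(The specific rule at work, as I understand it: function application binds more tightly than any infix operator, so the two definitions below parse identically -- replicate just stands in for combinations to keep the example self-contained.)

  withParens, withoutParens :: [String]
  withParens    = ["header"] ++ (replicate 3 "row")
  withoutParens = ["header"] ++  replicate 3 "row"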
(++) is slow though in that seemingly innocent uses can become n^2. A simple example is displaying a binary tree: a left-leaning tree, one whose nodes pile up down the left spine, will cause left-associative uses of (++). Hence the Prelude type ShowS = String -> String and shows :: Show a => a -> ShowS. The problem is we don't want left associativity, so what we do is make a context that says "do everything else to the right, then this", e.g. "Foo"++everythingelse. This is simple enough to do with ("Foo"++). (As a side note, using the Writer/Output monad with the list/string monoid is probably not the best of ideas; instead you can use the function monoid and tell ("foo"++).) You can see that this technique is more or less just adding an accumulating parameter, as you'll have to provide the 'initial' value.
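A minimal sketch of that technique, with an invented Tree type (not from the original mail):

  data Tree = Leaf | Node Tree Tree

  -- naive version: on a left-leaning tree each parent's (++) re-copies the
  -- ever-growing text of its left subtree, giving quadratic behaviour
  showTreeNaive :: Tree -> String
  showTreeNaive Leaf       = "."
  showTreeNaive (Node l r) = "(" ++ showTreeNaive l ++ showTreeNaive r ++ ")"

  -- ShowS version: each node builds a String -> String context meaning
  -- "everything else to the right goes after my text", so every character
  -- is consed exactly once
  showsTree :: Tree -> ShowS
  showsTree Leaf       rest = '.' : rest
  showsTree (Node l r) rest = '(' : showsTree l (showsTree r (')' : rest))

  showTree :: Tree -> String
  showTree t = showsTree t ""

These String -> String values also compose with (.) and have id as the do-nothing value, which is the function monoid the side note alludes to.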
I'd just about figured the ShowS idea, but I've yet to get a handle on this idea of [a] 'monoid'.
#g
-------------------
Graham Klyne