
On Oct 20, 2010, at 2:23, Daniel Fischer wrote:
On Wednesday 20 October 2010 01:23:49, Bastian Erdnüß wrote:
Suppose I would want to write a function rmap that acts like
rmap [f,g,h] x == [f x, g x, h x]
Do I gain any advantage or disadvantage (beside readability) from using
rmap = flip $ map . flip id
over
rmap [] _ = []
rmap (f:fs) x = f x : rmap fs x
?
I could imagine Haskell can optimize the first version better since it refers to the built in map function. But beside that, does Haskell struggle with the combinatory stuff in the first version? Or does it get optimized away?
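As a concrete example of the intended behaviour (a small sketch, assuming either definition of rmap):

ghci> rmap [(+1), (*2), negate] 3
[4,6,-3]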
Well, you can ask GHC what it does with the definitions. Let's compile
module RMap where
rmap :: [a -> b] -> a -> [b]
rmap = flip $ map . flip id
recmap :: [a -> b] -> a -> [b]
recmap [] _ = []
recmap (f:fs) x = f x : recmap fs x
and look at the generated Core (obtained via -ddump-simpl).
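A sketch of how to produce those dumps from the command line (the flags here are standard GHC options):

ghc -ddump-simpl RMap.hs          (Core without optimisations)
ghc -O2 -ddump-simpl RMap.hs      (Core with optimisations)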
Cool. Didn't know that.
The @ x_yz things are type annotations, otherwise Core is pretty close to Haskell. It takes a bit to get used to reading Core, but it's not that difficult (as long as the functions are short).
I see. Well, still better than my handwriting ;-)
First, without optimisations:
[some good explanations]
Thanks!
Now with optimisations. [...]
There, that looks much better than before (and for short lists, it performs much better, but for long lists, the difference should be negligible). flip, ($) and (.) have been inlined and eliminated, and what we get is
rmap fs x = map (\f -> f x) fs
, which is really nice. In fact, it's even nicer than what we got for recmap, because the compiler knows map: there are rewrite rules that can produce much better code when the function is used. For example, it's possible that the list of functions is completely eliminated and a use is rewritten to a direct loop instead of allocating a list cell for each function in the list. And it can profit from the rule
map f . map g = map (f . g)
which is much harder to detect for the direct recursion.
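As a small aside (my own sketch, not from the thread): the optimised rmap is the same as using an operator section, and the map/map rule can collapse two traversals into one.

rmap' :: [a -> b] -> a -> [b]
rmap' fs x = map ($ x) fs          -- identical to map (\f -> f x) fs

applyThenAdjust :: [Int -> Int] -> Int -> [Int]
applyThenAdjust fs x = map (subtract 1) (map ($ x) fs)
-- the rewrite rule lets GHC turn this into a single pass:
-- map (subtract 1 . ($ x)) fs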
Good news: it seems Haskell has no problem with combinators, just as I hoped.
And, on the other hand, how "expensive" are pattern matches, anyway?
Depends on the pattern, of course. Matching "oh" against [] is much cheaper than matching "This is an expensive car" against
'T':'h':'i':'s':' ':'i':'s':' ':'a':'n':' ':'e':'x':'p':'e':'n':'s':'i':'v':'e':' ':'p':'a':'t':'t':'e':'r':'n':' ':'m':'a':'t':'c':'h':_
But generally, pattern matching is quite cheap. The (well, one) advantage of higher order functions like map is that they're easier to optimise when they're used, given that somebody has taught the compiler how.
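A rough illustration of that cost difference (my own example, not from the thread): matching [] is a single check of the outermost constructor, while a long string pattern needs a (:) check plus a character comparison for each listed character.

isEmpty :: String -> Bool
isEmpty [] = True                   -- one constructor check
isEmpty _  = False

startsWithThis :: String -> Bool
startsWithThis ('T':'h':'i':'s':_) = True    -- four (:) checks plus four Char comparisons
startsWithThis _                   = False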
I understand. Many thanks again for pointing out how I can test such things myself with -ddump-simpl.

Cheers,
Bastian