
The example Henrik gave [...] models a composition, and is in general the strength of a combinator approach. But the strength of Applicative, in my opinion, is not composition but currying [...]
Well put, Paul.
I really do like the semantic model of Yampa. Signal transformers model
interactive behaviors, whereas the behaviors/signals of classic FRP model
non-interactive behaviors. (See
http://conal.net/blog/posts/why-classic-frp-does-not-fit-interactive-behavio....)
I also like currying.
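For concreteness, the two semantic models can be sketched as follows (a standard formulation from the FRP literature; the type synonyms and the `scaleSF` example are my own illustrative names, not from this thread):

```haskell
type Time = Double

-- Classic FRP: a behavior is (semantically) just a function of time.
-- It takes no input, hence "non-interactive".
type Behavior a = Time -> a

-- Yampa: a signal function maps a whole input signal to an output
-- signal, so its output can react to input over time.
type Signal a = Time -> a
type SF a b = Signal a -> Signal b

-- A trivial signal function: scale the input signal pointwise.
scaleSF :: Double -> SF Double Double
scaleSF k s = \t -> k * s t
```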
As long as we use not just the arrow abstraction but also *arrow notation*,
I don't see how we'll ever get an efficient implementation, one in which
portions of computed signals are recomputed only when necessary. And the
Arrow abstraction itself is probably a bit too restrictive, given that it
disallows any class constraints on its type arguments. So I've been noodling
some about formulations of signal functions that don't fit into the standard
arrow discipline.
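To illustrate the restriction (the `OptSF` type below is hypothetical, invented for this example, not Conal's actual formulation): an implementation that wants to skip recomputation might need, say, an `Eq` constraint on a sample type, but `arr :: (b -> c) -> a b c` must work at *all* types, unconstrained.

```haskell
{-# LANGUAGE GADTs #-}

-- Hypothetical: a signal-function-like type whose Cached constructor
-- requires Eq on the input type, e.g. to detect unchanged samples and
-- avoid recomputation.
data OptSF a b where
  Plain  :: (a -> b) -> OptSF a b
  Cached :: Eq a => (a -> b) -> OptSF a b  -- class constraint on a type argument

runOpt :: OptSF a b -> a -> b
runOpt (Plain f)  = f
runOpt (Cached f) = f

-- The Arrow class demands  arr :: (b -> c) -> OptSF b c  for all b and
-- c with no constraints, so arr could only ever produce Plain nodes;
-- if Cached is essential, OptSF cannot satisfy the standard Arrow
-- interface.
```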
Regards, - Conal
On Fri, Dec 19, 2008 at 6:31 AM, Paul L wrote:
Nice to see this discussion, and I just want to comment on the applicative vs. arrow style. The example Henrik gave is
z <- sf2 <<< sf1 -< x
which models a composition, and is in general the strength of a combinator approach. But the strength of Applicative, in my opinion, is not composition but currying:
f <*> x <*> y
where f can have the type Behavior a -> Behavior b -> Behavior c. I don't think there is an exact match in arrows. One could, however, require sf to be of type SF (a, b) c, and write
z <- sf -< (x, y)
The tupling may seem an extra burden, but it's an inherent design choice of arrows, which build data structures on top of products, and people can't run away from it when writing arrow programs.
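The contrast can be sketched with a toy `Behavior` (just a function of sample time; all the names below are illustrative, not from any particular FRP library):

```haskell
newtype Behavior a = Behavior { at :: Double -> a }

instance Functor Behavior where
  fmap f (Behavior g) = Behavior (f . g)

instance Applicative Behavior where
  pure x = Behavior (const x)
  Behavior f <*> Behavior x = Behavior (\t -> f t (x t))

-- Applicative style: lift a curried function directly; no tupling.
zApp :: Behavior Double -> Behavior Double -> Behavior Double
zApp x y = (+) <$> x <*> y

-- Arrow-like style: the "signal function" consumes a pair, so callers
-- must first tuple their inputs.
type SF a b = Behavior a -> Behavior b

sfPlus :: SF (Double, Double) Double
sfPlus xy = uncurry (+) <$> xy

pairB :: Behavior a -> Behavior b -> Behavior (a, b)
pairB x y = (,) <$> x <*> y
```

Both compute the same signal; the arrow-style version just routes everything through a product type.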
-- Regards, Paul Liu
Yale Haskell Group
http://www.haskell.org/yale
_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe