Proposal: refactor Arrow class

I propose to refactor the Arrow class, so that GHC's arrow notation can be a bit more general. (The module Control.Arrow itself would remain standard Haskell.) Please comment by 12th August.

In detail, the proposal is to change the Arrow class from

    class Category a => Arrow a where
        arr :: (b -> c) -> a b c
        first :: a b c -> a (b,d) (c,d)
        -- various functions made methods to allow efficient specializations

by introducing a new class

    -- | A binary type constructor that is contravariant in its first
    -- argument, that is,
    --
    -- * @'premap' id a = a@
    --
    -- * @'premap' (f . g) a = 'premap' g ('premap' f a)@
    --
    class PreArrow a where
        premap :: (b -> b') -> a b' c -> a b c

with instances

    instance PreArrow (->) where
        premap f g = g . f

    instance PreArrow (Kleisli m) where
        premap f (Kleisli g) = Kleisli (g . f)

and redefining the Arrow class as

    class (Category a, PreArrow a) => Arrow a where
        arr :: (b -> c) -> a b c
        arr f = premap f id

        first :: a b c -> a (b,d) (c,d)

        -- rest unchanged

The proposed PreArrow class has more instances than Arrow; in particular, composing a PreArrow with a Functor yields another PreArrow.

The principal client for the new class would be the arrow notation, as implemented in GHC. With this class, there would be a simple rule for determining which instances are needed, based on the keywords used:

* all commands ("proc" and operator arguments) need PreArrow
* "do" needs Arrow
* "rec" needs ArrowLoop
* "case" or "if" need ArrowChoice

One might object that the second argument of the PreArrow class is superfluous, and that this is just a complicated Contravariant class. That is true, and in semantic terms a contravariant functor would be sufficient, but Haskell's type system makes it impossible to have commonality between a Contravariant class of unary type constructors and the Arrow class used by arrow notation.
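[Editorial aside, not part of the proposal: one plausible reading of the claim that composing a PreArrow with a Functor yields another PreArrow is wrapping the result type in a functor. The wrapper name ComposedPre is made up for illustration; note that the instance does not even need the Functor constraint, since premap only touches the first argument.]

    -- Assumes the PreArrow class proposed above.
    newtype ComposedPre a f b c = ComposedPre (a b (f c))

    instance PreArrow a => PreArrow (ComposedPre a f) where
        premap h (ComposedPre g) = ComposedPre (premap h g)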

On Sun, Jul 15, 2012 at 1:22 PM, Ross Paterson wrote:
I propose to refactor the Arrow class, so that GHC's arrow notation can be a bit more general. [...] In detail, the proposal is to change the Arrow class from [...] by introducing a new class
class PreArrow a where premap :: (b -> b') -> a b' c -> a b c
There is a slightly more principled notion that offers up this operation, paired with an fmap-like operation on the second argument that is satisfied by all arrows -- a profunctor.

http://en.wikipedia.org/wiki/Profunctor
http://ncatlab.org/nlab/show/profunctor

which you can find implemented here:

http://hackage.haskell.org/packages/archive/profunctors/3.0/doc/html/Data-Pr...

with misc. extras here:

http://hackage.haskell.org/package/profunctor-extras
http://hackage.haskell.org/package/representable-profunctors

Dan Piponi wrote on the connection between arrows and profunctors at the end of:

http://blog.sigfpe.com/2011/07/profunctors-in-haskell.html

If we are going to introduce another superclass, I would much rather introduce that one, as it has a better theoretical motivation and the additional laws regarding dinatural transformations and the extra structure that follows. For instance, profunctor composition is both well defined and useful (as are profunctor collages and traces, and there is also a useful notion of representability for profunctors).

http://hackage.haskell.org/packages/archive/profunctor-extras/3.0/doc/html/D...

-Edward Kmett
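[For reference, the class Edward points to has roughly the following shape; the exact method set and defaults have varied across versions of the profunctors package, so treat this as a sketch rather than its definitive API.]

    class Profunctor p where
        -- Minimal complete definition: dimap, or both lmap and rmap.
        dimap :: (a -> b) -> (c -> d) -> p b c -> p a d
        dimap f g = lmap f . rmap g
        lmap  :: (a -> b) -> p b c -> p a c
        lmap f = dimap f id
        rmap  :: (c -> d) -> p b c -> p b d
        rmap = dimap id

    instance Profunctor (->) where
        dimap ab cd bc = cd . bc . ab

[Every Arrow gives one of these via lmap f g = arr f >>> g and rmap f g = g >>> arr f, which is the sense in which the fmap-like operation on the second argument is satisfied by all arrows.]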

On Mon, Jul 16, 2012 at 06:10:25AM +0100, Edward Kmett wrote:
There is a slightly more principled notion that offers up this operation, paired with an fmap-like operation on the second argument that is satisfied by all arrows -- a profunctor.
http://en.wikipedia.org/wiki/Profunctor http://ncatlab.org/nlab/show/profunctor
which you can find implemented here:
http://hackage.haskell.org/packages/archive/profunctors/3.0/doc/html/Data-Profunctor.html

[..]

If we are going to introduce another superclass, I would much rather introduce that one, as it has a better theoretical motivation and the additional laws regarding dinatural transformations and the extra structure that follows.
My problem is that the application I have in mind (bare arrow notation) only needs contravariance, so requiring covariance of the other argument seems unnecessarily restrictive.

On 7/16/12 1:10 AM, Edward Kmett wrote:
If we are going to introduce another superclass, I would much rather introduce that one, as it has a better theoretical motivation and the additional laws regarding dinatural transformations and the extra structure that follows.
+1. -- Live well, ~wren

On 15/07/12 19:22, Ross Paterson wrote:
I propose to refactor the Arrow class, so that GHC's arrow notation can be a bit more general. (The module Control.Arrow itself would remain standard Haskell.)
When we start refactoring the Arrow class, might I humbly request that the `first` function and its friends be moved to a separate class that doesn't need `arr` or `premap`, but rather `arrIso`:

    class (Category a) => IsoArrow a where
        arrIso :: Iso (->) b c -> a b c
        -- Laws:
        --   arrIso id = id
        --   arrIso f . arrIso g = arrIso (f . g)

    data Iso a b c = Iso { fw :: a b c, bw :: a c b }

    instance Category a => Category (Iso a) where ...

Then the Arrow classes would be:

    class (ProductCategory a, PreArrow a) => Arrow a where ...

    -- a symmetric monoidal category,
    -- with (,) as product and () as identity
    class IsoArrow a => ProductCategory a where
        (***)  :: a b c -> a d e -> a (b,d) (c,e)
        first  :: a b c -> a (b,d) (c,d)
        second :: a c d -> a (b,c) (b,d)
        swap   :: a (b,c) (c,b)
        swap = arrIso (Iso swap swap)
        -- Laws as usual:
        --   first f = f *** id
        --   second f = id *** f
        --   second f = first f . swap
        --   id *** id = id

The reason for wanting this class separate is that bijections (i.e. Iso) and lenses can be instances of IsoArrow and ProductCategory, but not of Arrow or PreArrow.

Similarly, we could have a class before ArrowChoice:

    class IsoArrow a => SumCategory a where
        (+++) :: a b c -> a d e -> a (Either b d) (Either c e)
        left  :: a b c -> a (Either b d) (Either c d)
        right :: a c d -> a (Either b c) (Either b d)
        left f  = f +++ id
        right f = id +++ f

Is either of these classes enough for the do/if sugar? Or do you also need (&&&) and (|||) for those?

    class ProductCategory a => DupCategory a where
        dup   :: a b (b,b)
        (&&&) :: a b c -> a b d -> a b (c,d)
        f &&& g = (f *** g) . dup
        dup = id &&& id

    class SumCategory a => MergeCategory a where
        mergeEither :: a (Either b b) b
        (|||) :: a b d -> a c d -> a (Either b c) d
        f ||| g = mergeEither . (f +++ g)
        mergeEither = id ||| id

Or the injections/projections?

    class ProductCategory a => ProjectCategory a where
        fst :: a (b,c) b
        snd :: a (b,c) c
        snd = fst . swap

    class SumCategory a => InjectCategory a where
        left  :: a b (Either b c)
        right :: a c (Either b c)

It seems that MergeCategory is enough for if statements:

    if x then y else z = (z ||| y) . arrIso boolIso . x
      where
        boolIso = Iso boolIn boolOut
        boolIn b = if b then Right () else Left ()
        boolOut = either (const False) (const True)

And I suspect that either ProductCategory, DupCategory or ProjectCategory is enough for do notation. And for recursive bindings:

    class ProductCategory a => TracedMonoidalCategory a where
        loop :: a (b,u) (c,u) -> a b c
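[To make the lens use case concrete, here is a hedged sketch, using a bare get/update formulation of lenses rather than any particular package's type, of how lenses support Category, arrIso and first-style plumbing while having no sensible arr or premap.]

    import Prelude hiding (id, (.))
    import Control.Category

    data Lens s a = Lens { get :: s -> a, put :: a -> s -> s }

    instance Category Lens where
        id = Lens id (\a _ -> a)
        Lens g2 p2 . Lens g1 p1 = Lens (g2 . g1) (\a s -> p1 (p2 a (g1 s)) s)

    -- arrIso of the IsoArrow class above, specialised to Lens
    arrIsoLens :: Iso (->) s a -> Lens s a
    arrIsoLens (Iso f b) = Lens f (\a _ -> b a)

    -- first of ProductCategory: focus on the left component of a pair
    firstLens :: Lens s a -> Lens (s, d) (a, d)
    firstLens (Lens g p) =
        Lens (\(s, d) -> (g s, d)) (\(a, d) (s, _) -> (p a s, d))

[There is no reasonable arr :: (s -> a) -> Lens s a, because a lens also needs a way to put the focus back, which is exactly the motivation for splitting first off from arr.]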
in particular, composing a PreArrow with a Functor yields another PreArrow
What do you mean by 'composition'?

If you want to go the way of PreArrow, then IsoPreArrow would be:

    class IsoPreArrow a where
        isoPremap :: Iso (->) b c -> a b d -> a c d

which is still slightly weaker than either IsoArrow or PreArrow. Or perhaps

    class IsoProfunctor h where
        ilmap :: Iso (->) a b -> h b c -> h a c
        irmap :: Iso (->) b c -> h a b -> h a c

Of course IsoProfunctor is just another profunctor, categorically speaking. Just like SumCategory and ProductCategory are both monoidal categories. But you can't express that in Haskell 98/2010. That requires MPTCs:

    class Category g => Profunctor g h where
        lmap :: g a b -> h b c -> h a c
        rmap :: g b c -> h a b -> h a c

    class (Category h, Profunctor g h) => ArrowLift g h where
        arr :: g a b -> h a b
        arr f = lmap f id

There are some interesting relations between the classes (read (==>) as implication):

    SumCategory     && PreArrow  ==> MergeCategory
    ProductCategory && PostArrow ==> DupCategory

where PostArrow is the other half of a Profunctor. Similarly:

    ProductCategory && PreArrow  ==> ProjectCategory
    SumCategory     && PostArrow ==> InjectCategory

I also think that now

    (IsoPreArrow || IsoPostArrow) && Category        ==> IsoArrow
    (PreArrow    || PostArrow)    && ProductCategory ==> Arrow

There is now a diamond in the superclasses of Arrow: one path goes through Profunctor/PreArrow, the other through IsoArrow.

Finally, it might be sensible to disconnect SumCategory from MergeCategory, since the latter has an instance for lenses, while the former does not. And are there any types that are an instance of DupCategory but not of ProductCategory?

But maybe I am wanting too much at once. :)

Twan
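[A small hedged sketch of how the multi-parameter Profunctor and ArrowLift above could be instantiated at ordinary functions; it needs MultiParamTypeClasses and assumes the class declarations exactly as written in the message above.]

    {-# LANGUAGE MultiParamTypeClasses #-}

    instance Profunctor (->) (->) where
        lmap g h = h . g
        rmap g h = g . h

    instance ArrowLift (->) (->) where
        arr = id   -- the default arr f = lmap f id would also do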

On Mon, Jul 16, 2012 at 05:14:05PM +0100, Twan van Laarhoven wrote:
[...] Is either of these classes enough for the do/if sugar? Or do you also need (&&&) and (|||) for those?
For if or case you need premap and (|||). For do you need premap and (&&&). But if you have Category and premap, you can make (|||) from left, and (&&&) from first.
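[A hedged sketch of those two derivations, written against the PreArrow class from the proposal and Twan's ProductCategory/SumCategory sketch; the primed names are only there to avoid clashing with Control.Arrow. Note that the (&&&) derivation uses premap, first and (>>>) only, while the (|||) derivation goes through arr, i.e. also needs id.]

    arr' :: (Category a, PreArrow a) => (b -> c) -> a b c
    arr' f = premap f id

    -- (&&&) from first, premap and (>>>)
    (&&&.) :: (PreArrow a, ProductCategory a) => a b c -> a b d -> a b (c, d)
    f &&&. g = premap (\b -> (b, b)) (first g) >>> premap swapPair (first f)
      where swapPair (x, y) = (y, x)

    -- (|||) from (+++) (equivalently left/right), premap and id
    (|||.) :: (Category a, PreArrow a, SumCategory a)
           => a b d -> a c d -> a (Either b c) d
    f |||. g = (f +++ g) >>> arr' (either id id)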
But maybe I am wanting too much at once. :)
That occurred to me too.

Is either of these classes enough for the do/if sugar? Or do you also need (&&&) and (|||) for those?
For if or case you need premap and (|||). For do you need premap and (&&&).
But if you have Category and premap, you can make (|||) from left, and (&&&) from first.
I showed that the weaker isoPremap is enough for `if`, and it also suffices for getting `(|||)` from `left` and vice versa.

Since you are considering changing the class hierarchy, this is the time to also make other changes. Even if you don't use isoPremap or some of the other more general superclasses, it would still be good to have a nice hierarchy.

And I do have two use cases for these classes, namely isomorphisms and lenses. I am sure there are other examples of categories for which `arr` is too strong, but which do, for example, support `first`. The current `Arrow` class is too heavyweight. We already split off Category, but there are more things in between it and full arrows.

Twan

On Mon, Jul 16, 2012 at 9:44 PM, Twan van Laarhoven wrote:
Is either of these classes enough for the do/if sugar? Or do you also need (&&&) and (|||) for those?
For if or case you need premap and (|||). For do you need premap and (&&&).
But if you have Category and premap, you can make (|||) from left, and (&&&) from first.
I showed that the weaker isoPremap is enough for `if`, and it also suffices for getting `(|||)` from `left` and vice versa.
Since you are considering changing the class hierarchy, this is the time to also make other changes. Even if you don't use isoPremap or some of the other more general superclasses, it would still be good to have a nice hierarchy.
And I do have two use cases for these classes, namely isomorphisms and lenses. I am sure there are other examples of categories for which `arr` is too strong, but which do, for example, support `first`. The current `Arrow` class is too heavyweight. We already split off Category, but there are more things in between it and full arrows.
DSLs where you want to inspect the whole AST come to mind. Function arguments cannot be inspected, so 'arr' is often problematic, while allowing some specific functions is fine, as is manipulating tuples with 'first' and 'second'. Erik
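[A hedged sketch of the kind of inspectable DSL Erik describes (and wren echoes later in the thread): a tiny first-order circuit language whose syntax can be examined, which supports Category and first-style plumbing, but which cannot embed an arbitrary Haskell function, so neither arr nor premap is definable for it. All constructor names are illustrative.]

    {-# LANGUAGE GADTs #-}
    import Prelude hiding (id, (.))
    import Control.Category

    data Circuit b c where
        Id      :: Circuit b b
        Compose :: Circuit c d -> Circuit b c -> Circuit b d
        First   :: Circuit b c -> Circuit (b, d) (c, d)
        Swap    :: Circuit (b, c) (c, b)
        Add     :: Circuit (Int, Int) Int

    instance Category Circuit where
        id  = Id
        (.) = Compose

    -- The whole AST can be inspected, e.g. to count primitive gates.
    size :: Circuit b c -> Int
    size (Compose f g) = size f + size g
    size (First f)     = 1 + size f
    size _             = 1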

On Mon, Jul 16, 2012 at 08:44:03PM +0100, Twan van Laarhoven wrote:
I showed that the weaker isoPremap is enough for `if`, and it also suffices for getting `(|||)` from `left` and vice versa.
Recalling that premap f g = arr f >>> g, the translation of if is

    proc e -> if b then c1 else c2
        =>
    premap (\ e -> if b then Left e else Right e) (c1 ||| c2)

The point is that b isn't a Bool value, it's a Bool-valued expression in the variables in the environment e.

Another sort of command one often uses is -<, with translation

    proc e -> a -< v
        =>
    premap (\ e -> v) a

Again, v is an expression in the variables in the environment e. We need premap to evaluate expressions with respect to the tuple of variables in the environment, and the whole point of arrow notation is to plumb that environment through the various constructs so that we can evaluate expressions against it (using premap). What use would it be otherwise?
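[As a hedged, concrete illustration of the two translation rules just quoted, here is a hand-desugaring of a small proc expression. f and g stand for arbitrary commands; the PreArrow constraint refers to the proposed class, with (|||) still supplied by ArrowChoice.]

    --   proc (x, y) -> if x > y then f -< x else g -< y
    -- desugars, by the if rule and then the -< rule in each branch, to:
    ifExample :: (PreArrow a, ArrowChoice a) => a Int r -> a Int r -> a (Int, Int) r
    ifExample f g =
        premap (\e@(x, y) -> if x > y then Left e else Right e)
               (premap (\(x, _) -> x) f ||| premap (\(_, y) -> y) g)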
And I do have two use cases for these classes, namely isomorphisms and lenses. I am sure there are other examples of categories for which `arr` is too strong, but which do, for example, support `first`. The current `Arrow` class is too heavyweight. We already split off Category, but there are more things in between it and full arrows.
You're focussing on instances, but in addition to those, a useful interface also needs client functions.

On 7/16/12 3:44 PM, Twan van Laarhoven wrote:
I am sure there are other examples of categories for which `arr` is too strong, but which do, for example, support `first`.
My major complaint with arrows has always been that `arr` is way too strong. By requiring that we can embed every Haskell function into a given arrow we eliminate countless "arrows" which obey all the rest of the interface but which are specifically limited in not representing all Haskell functions. I'm thinking for example of DSLs. Ofttimes the whole point of a given DSL is that by restricting to a well-behaved class of functions we can get performance or other reasoning benefits. -- Live well, ~wren

On Sun, Jul 15, 2012 at 06:22:01PM +0100, Ross Paterson wrote:
I propose to refactor the Arrow class, so that GHC's arrow notation can be a bit more general. (The module Control.Arrow itself would remain standard Haskell.)
There was resistance to this on two grounds:

- a Profunctor class would be more principled
- if we're going to change Arrow, we should make other changes too

I think the second set of changes is orthogonal, and should be considered separately.

Regarding the first, it's certainly true that the original proposal was unsatisfyingly asymmetrical, but the PreArrow class is what I want to generalize most of arrow notation to, and Profunctor would be too much. So here's a revised proposal, which amounts to splitting the Profunctor class into two independent classes:

    -- | A binary type constructor that is contravariant in its first argument
    class PreArrow a where
        premap :: (b -> b') -> a b' c -> a b c

    -- | A binary type constructor that is covariant in its second argument
    class PostArrow a where
        postmap :: (c -> c') -> a b c -> a b c'

and changing the Arrow class from

    class Category a => Arrow a where
        arr :: (b -> c) -> a b c
        first :: a b c -> a (b,d) (c,d)
        -- various functions made methods to allow efficient specializations

to

    class (Category a, PreArrow a, PostArrow a) => Arrow a where
        arr :: (b -> c) -> a b c
        arr f = premap f id

        first :: a b c -> a (b,d) (c,d)

        -- rest unchanged

with instances

    instance PreArrow (->)
    instance PostArrow (->)
    instance Arrow (->)
    instance PreArrow (Kleisli m)
    instance Monad m => PostArrow (Kleisli m)
    instance Monad m => Arrow (Kleisli m)

The default implementation of arr could alternatively be postmap f id, but one has to choose one; perfect symmetry is unattainable.

There's also the question of whether it's worth interposing a method-less class

    class (PreArrow a, PostArrow a) => Profunctor a
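[The instance bodies are not spelled out above; a hedged sketch of the obvious ones for the new PostArrow class (the PreArrow bodies are as in the original proposal) would be:]

    import Control.Arrow (Kleisli (..))
    import Control.Monad (liftM)

    instance PostArrow (->) where
        postmap g f = g . f

    instance Monad m => PostArrow (Kleisli m) where
        postmap g (Kleisli f) = Kleisli (liftM g . f)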

As it stands this proposal would mean a real performance hit for multiple
performance-sensitive packages of mine, for two reasons:
1.)
Due to the fact that GHC can't optimize strict function composition
http://hackage.haskell.org/trac/ghc/ticket/7542
and that (Foo . f) expands to (\x -> Foo (f x)), we use unsafeCoerce in many
places to generate ideal core.
In lens 3.7, this happened entirely on function arrows and so we could contain
the noise entirely in the lens package.
In order to generate the correct core when faced with the same issue in
profunctors, we were forced to add an {-# LANGUAGE Unsafe #-} module
Data.Profunctor.Unsafe to the profunctors package that let the user invoke
an operator like the `lmap` and `rmap` of a profunctor, but with strict
semantics and an implementation that can simply be unsafeCoerce. Used correctly this
allows us to maintain SafeHaskell guarantees and still generate the ideal
core.
This eta expansion isn't a trivial issue. From a constant perspective it
means a practical speed difference of 50% in many lens combinators vs. just
dealing with the eta expansion and bad semantics that result from just
using (.) or in the profunctor case, rmap or lmap.
But worse, accumulation of eta expansion wrappers can result in asymptotic
slowdowns. This is witnessed by another ghc bug that we found during work
on lens, where the derived Functor, Foldable and Traversable instances
provided by GHC run in O(n^2) time:
http://hackage.haskell.org/trac/ghc/ticket/7436
And the slowdown can be a factor of 10 even before these asymptotic factors
kick in since to get the correct semantics you need the strict composition
from the first ticket and GHC is terrible at optimizing it.
I'm pretty strongly against losing that much performance.
2.)
Profunctor as it exists provides an efficient dimap that lets you map over
both sides.
This doesn't matter for some 'single arrow' like Profunctors, but it does
matter for the profunctors for Mealy and Moore machines, where now any
dimap requires two passes. That said, I have not measured the slowdown on
my machines package.
Putting in a separate Profunctor class for the union of functionality
between PreArrow and PostArrow (PostFunctor? as it has little to do with
Arrow) is problematic for a couple of reasons.
If we simply make it an alias for PreArrow + PostArrow:

    class (PreArrow p, PostArrow p) => Profunctor p
then either
a.) users have to specify an instance for a class with no laws (or which
just contains dimap with a default definition in terms of lmap and rmap),
which is troubling and inconvenient for users.
b.) we have to use appropriately Flexible and Undecidable instances, which
means it probably can't find its way into base and a part of the community
will rebel against it on moral grounds, and we can't get an efficient dimap
in that case regardless.
One compromise would be to drop the PostArrow notion despite 'Profunctor'
being 'too big', and just have:
class Prefunctor p => Profunctor p
with Profunctor p and Category p as superclasses to Arrow.
This loses some of the minimalism of your proposal but can be implemented
in a way that provides for the above concerns.
But if moving Profunctor into the standard library means losing
Data.Profunctor.Unsafe, then I still have an objection.
Losing that module forces me to choose between a number of bad scenarios:

- a 50% constant factor hit with bad semantics around _|_'s, and asymptotic slowdowns
- a 10-fold speed hit and good semantics
- exposing a user-visible unsafeCoerce
- requiring some otherwise completely unnecessary class to recover the existing semantics, and dealing with user confusion
I'm not a fan of any of these scenarios.
-Edward Kmett
On Mon, Jan 7, 2013 at 7:05 AM, Ross Paterson wrote:
On Sun, Jul 15, 2012 at 06:22:01PM +0100, Ross Paterson wrote:
I propose to refactor the Arrow class, so that GHC's arrow notation can be a bit more general. (The module Control.Arrow itself would remain standard Haskell.)
There was resistance to this on two grounds: - a Profunctor class would be more principled - if we're going to change Arrow, we should make other changes too
I think the second set of changes is orthogonal, and should be considered separately.
Regarding the first, it's certainly true that the original proposal was unsatisfyingly asymmetrical, but the PreArrow class is what I want to generalize most of arrow notation to, and Profunctor would be too much. So here's a revised proposal, which amounts to splitting the Profunctor class into two independent classes:
    -- | A binary type constructor that is contravariant in its first argument
    class PreArrow a where
        premap :: (b -> b') -> a b' c -> a b c

    -- | A binary type constructor that is covariant in its second argument
    class PostArrow a where
        postmap :: (c -> c') -> a b c -> a b c'

and changing the Arrow class from

    class Category a => Arrow a where
        arr :: (b -> c) -> a b c
        first :: a b c -> a (b,d) (c,d)
        -- various functions made methods to allow efficient specializations

to

    class (Category a, PreArrow a, PostArrow a) => Arrow a where
        arr :: (b -> c) -> a b c
        arr f = premap f id

        first :: a b c -> a (b,d) (c,d)

        -- rest unchanged

with instances

    instance PreArrow (->)
    instance PostArrow (->)
    instance Arrow (->)
    instance PreArrow (Kleisli m)
    instance Monad m => PostArrow (Kleisli m)
    instance Monad m => Arrow (Kleisli m)

The default implementation of arr could alternatively be postmap f id, but one has to choose one; perfect symmetry is unattainable.

There's also the question of whether it's worth interposing a method-less class

    class (PreArrow a, PostArrow a) => Profunctor a

Is the Unsafe module all about getting efficient conversions between types that have the same representations due to embedded newtypes? (If so, the issue is not about functoriality.)

It is about functoriality in the sense that by providing a custom
implementation for the unsafe method that uses unsafeCoerce you are
asserting that your Functor really is one and doesn't do any GADT-like
tricks on the functor argument and structurally complies with the laws.
It lets you lift core `cast`'s out over the functorial argument, which
isn't something I can do from outside of the class. If I tried to write
something where the end user hands me an arbitrary Functor (or Profunctor)
and I unsafeCoerce to cast, this would expose unsafeCoerce to the end user.
The implementation trick is to place these extra methods in the class but
hidden in an explicitly Unsafe module and with default definitions that are
correct but slow.
Then the provider of the functor-like class can explicitly import that
module, and implement the methods, and mark his module Trustworthy. He
hasn't exposed unsafeCoerce to the end user, they have to import an
explicitly Unsafe module to get access to it, incurring the obligation
themselves to provide something that is operationally id or a cast.
This enables you to have the efficient implementation but guarded by an
explicitly Unsafe module so the end user has to import that to get the
efficient functionality, but you can discharge your obligations locally.
Similarly you can discharge the obligation about the representation of the
operation you are passing at the use site. This means that you can reason
about these separately.
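[A minimal sketch of the technique Edward describes, transplanted to a hypothetical Functor-like class; every module, class and method name below is made up, but Data.Profunctor.Unsafe follows the same pattern with its own operators. In practice the class, its safe re-export and the instances live in separate modules, as described above.]

    {-# LANGUAGE Unsafe #-}
    module Data.MyFunctor.Unsafe (MyFunctor (..), Tagged (..)) where
    -- The companion safe module would re-export MyFunctor (mymap) only.

    import Unsafe.Coerce (unsafeCoerce)

    class MyFunctor f where
        mymap :: (a -> b) -> f a -> f b

        -- Callers may only pass a function that is operationally id or a
        -- cast (e.g. a newtype constructor); the default is always correct,
        -- just slower, because it builds the closure the override avoids.
        unsafeMymap :: (a -> b) -> f a -> f b
        unsafeMymap = mymap

    newtype Tagged a = Tagged a

    -- The instance author discharges the coercion obligation locally; end
    -- users never see unsafeCoerce unless they import this Unsafe module.
    instance MyFunctor Tagged where
        mymap f (Tagged a) = Tagged (f a)
        unsafeMymap _ = unsafeCoerce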
In theory a similar operation could be exposed for Functor in a similar
Unsafe module, permitting vacuous to be implemented soundly and more
efficiently in Data.Void, preserving sharing and changing the asymptotics
of the cast from f Void, for data types where the end user is willing to
incur the reasoning obligation; but frankly, I'm not willing to fight that
battle today.
-Edward
On Tue, Jan 8, 2013 at 8:16 PM, Ross Paterson wrote:
Is the Unsafe module all about getting efficient conversions between types that have the same representations due to embedded newtypes? (If so, the issue is not about functoriality.)

On Wed, Jan 09, 2013 at 07:57:02AM +0000, Edward Kmett wrote:
It lets you lift core `cast`'s out over the functorial argument, which isn't something I can do from outside of the class. If I tried to write something where the end user hands me an arbitrary Functor (or Profunctor) and I unsafeCoerce to cast, this would expose unsafeCoerce to the end user.
The implementation trick is to place these extra methods in the class but hidden in an explicitly Unsafe module and with default definitions that are correct but slow.
Then the provider of the functor-like class can explicitly import that module, and implement the methods, and mark his module Trustworthy. He hasn't exposed unsafeCoerce to the end user, they have to import an explicitly Unsafe module to get access to it, incurring the obligation themselves to provide something that is operationally id or a cast.
This enables you to have the efficient implementation but guarded by an explicitly Unsafe module so the end user has to import that to get the efficient functionality, but you can discharge your obligations locally.
Similarly you can discharge the obligation about the representation of the operation you are passing at the use site. This means that you can reason about these separately.
You could achieve the same effects by splitting off (.#) and (#.) into a subclass, couldn't you? It would just mean more instance declarations.

This is the moral equivalent of saying that Data.Functor doesn't need (<$) and Control.Applicative doesn't need (<*) and (*>) after all; it's just instance declarations.
On Jan 9, 2013, at 11:10 AM, Ross Paterson wrote:
On Wed, Jan 09, 2013 at 07:57:02AM +0000, Edward Kmett wrote:
It lets you lift core `cast`'s out over the functorial argument, which isn't something I can do from outside of the class. If I tried to write something where the end user hands me an arbitrary Functor (or Profunctor) and I unsafeCoerce to cast, this would expose unsafeCoerce to the end user.
The implementation trick is to place these extra methods in the class but hidden in an explicitly Unsafe module and with default definitions that are correct but slow.
Then the provider of the functor-like class can explicitly import that module, and implement the methods, and mark his module Trustworthy. He hasn't exposed unsafeCoerce to the end user, they have to import an explicitly Unsafe module to get access to it, incurring the obligation themselves to provide something that is operationally id or a cast.
This enables you to have the efficient implementation but guarded by an explicitly Unsafe module so the end user has to import that to get the efficient functionality, but you can discharge your obligations locally.
Similarly you can discharge the obligation about the representation of the operation you are passing at the use site. This means that you can reason about these separately.
You could achieve the same effects by splitting off (.#) and (#.) into a subclass, couldn't you? It would just mean more instance declarations.

The situation with Functor involves three parties: me writing lens combinators that are agnostic to the Functor chosen, the user, and the libraries they use. Nobody is in the proper position to create unorphaned instances of these new classes. I routinely use lens to augment functionality from libraries that do not depend on lens.

With Profunctor, I currently provide the package, and so I am both the combinator author and the library author. If you decompose Profunctor and move it into base, all of a sudden I'm in the same boat as with Functor.
Spontaneously users are able to use any Arrow at all for

    type Iso s t a b = forall p f. (Profunctor p, Functor f) => p a (f b) -> p s (f t)

which is great! But it pretty much would force me to give up the current efficiency gains I get in practice, or draw silly distinctions between FastIso and Iso.

The extra instance quickly becomes 200 combinators that differ solely in this one detail, when the alternative costs me nothing.
I'm currently strongly -1 on PostArrow or a lobotomized Profunctor.
PreArrow doesn't really affect me. I'm happily +1 on that.
On Jan 9, 2013, at 11:10 AM, Ross Paterson wrote:
On Wed, Jan 09, 2013 at 07:57:02AM +0000, Edward Kmett wrote:
It lets you lift core `cast`'s out over the functorial argument, which isn't something I can do from outside of the class. If I tried to write something where the end user hands me an arbitrary Functor (or Profunctor) and I unsafeCoerce to cast, this would expose unsafeCoerce to the end user.
The implementation trick is to place these extra methods in the class but hidden in an explicitly Unsafe module and with default definitions that are correct but slow.
Then the provider of the functor-like class can explicitly import that module, and implement the methods, and mark his module Trustworthy. He hasn't exposed unsafeCoerce to the end user, they have to import an explicitly Unsafe module to get access to it, incurring the obligation themselves to provide something that is operationally id or a cast.
This enables you to have the efficient implementation but guarded by an explicitly Unsafe module so the end user has to import that to get the efficient functionality, but you can discharge your obligations locally.
Similarly you can discharge the obligation about the representation of the operation you are passing at the use site. This means that you can reason about these separately.
You could achieve the same effects by splitting off (.#) and (#.) into a subclass, couldn't you? It would just mean more instance declarations.
participants (6)

- Edward A Kmett
- Edward Kmett
- Erik Hesselink
- Ross Paterson
- Twan van Laarhoven
- wren ng thornton