
Sorry for the late reply.
I think the reason I proposed to reuse the algorithm for Functor was
that (A) as Pedro says, it was the class most closely resembling the
classes we wanted to write at the time, and (B) I almost certainly was
not at all aware that there is special magic in the code for
contravariant arguments. In general, I'm not overly eager to try to be
too clever in inferring the right instance constraints. I think
standalone deriving should be used for anything but the most
straightforward scenarios. If there is a simple and easy-to-specify
way to infer the simple cases properly though, I am certainly not
opposed to it.
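
[For illustration, here is a minimal sketch of the standalone-deriving route Andres mentions, where the user writes out the instance context instead of having GHC infer it. The class Pretty and the data type Pair are made up for this example and do not appear in the thread.]

{-# LANGUAGE DeriveAnyClass, StandaloneDeriving #-}

-- A made-up class, just to show the shape of the declaration.
class Show a => Pretty a where
  pretty :: a -> String
  pretty = show          -- ordinary default method

data Pair a = Pair a a
  deriving Show

-- The instance context is stated by hand, so nothing has to be inferred;
-- the derived instance is simply an empty one that picks up the default.
deriving instance Show a => Pretty (Pair a)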
Cheers,
Andres
On Sat, Jun 18, 2016 at 1:55 PM, José Pedro Magalhães wrote:

On Sat, Jun 18, 2016 at 12:51 PM, Simon Peyton Jones wrote:
| But no need to look at the data type’s constructors, as deriving (Functor) does.
Yes, that's right.
I believe we've used the "derive Functor" strategy for inferring constraints simply because all generic functions (over Generic1) that we had in mind at the time were Functor-like, so that was an appropriate first solution. But I totally agree that it can be improved!
Best regards, Pedro
Simon
From: josepedromagalhaes@gmail.com [mailto:josepedromagalhaes@gmail.com] On Behalf Of José Pedro Magalhães
Sent: 18 June 2016 09:16
To: Simon Peyton Jones
Cc: Ryan Scott; Andres Löh; GHC developers
Subject: Re: Inferring instance constraints with DeriveAnyClass

I still don't think you can do it just from the default method's type. A typical case is the following:
class C a where
  op :: a -> Int
  default op :: (Generic a, GC (Rep a)) => a -> Int
When writing an instance C [a], you might well find that you need a C a constraint, but this is not
something you can see in the type of the default method; it follows only after expanding Rep [a] and
resolving the GC constraint a number of times.
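
[To make Pedro's point concrete, here is a small self-contained sketch. The body of GC and its instances are assumptions for illustration, not the actual code Pedro had in mind; only the class C and its default signature come from his message.]

{-# LANGUAGE DefaultSignatures, TypeOperators #-}

import GHC.Generics

-- The class from the example above, with an assumed default body.
class C a where
  op :: a -> Int
  default op :: (Generic a, GC (Rep a)) => a -> Int
  op = gop . from

-- A hypothetical generic worker class standing in for Pedro's GC.
class GC f where
  gop :: f p -> Int

instance GC U1 where
  gop _ = 0

instance (GC f, GC g) => GC (f :*: g) where
  gop (x :*: y) = gop x + gop y

instance (GC f, GC g) => GC (f :+: g) where
  gop (L1 x) = gop x
  gop (R1 y) = gop y

instance GC f => GC (M1 i c f) where
  gop (M1 x) = gop x

-- The leaf case is where the class constraint re-enters:
instance C a => GC (K1 i a) where
  gop (K1 x) = op x

-- Rep [a] expands (roughly) to
--   D1 _ (C1 _ U1 :+: C1 _ (S1 _ (Rec0 a) :*: S1 _ (Rec0 [a])))
-- so discharging GC (Rep [a]) eventually needs C a (and C [a] itself),
-- which is not visible in the default method's type alone.
instance C a => C [a]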
Best regards,
Pedro
On Fri, Jun 17, 2016 at 12:43 PM, Simon Peyton Jones wrote:

| My question is then: why does DeriveAnyClass take the bizarre approach
| of co-opting the DeriveFunctor algorithm? Andres, you originally
| proposed this in #7346 [2], but I don't quite understand why you
| wanted to do it this way. Couldn't we infer the context simply from
| the contexts of the default method type signatures?
That last suggestion makes perfect sense to me. After all, we are going to generate an instance looking like
instance .. => C (T a) where
  op1 = <default-op1>
  op2 = <default-op2>
so all we need in ".." is enough context to satisfy the needs of <default-op1> etc.
Well, you need to take account of the class op type sig too:
class C a where
  op :: Eq a => a -> a
  default op :: (Eq a, Show a) => a -> a
We effectively define
  default_op :: (Eq a, Show a) => a -> a
Now with DeriveAnyClass for lists, we effectively get
instance ... => C [a] where
  op = default_op
What is ..? Well, we need (Eq [a], Show [a]); but we are given Eq [a] (because that's op's instantiated type). So Show a is all we need in the end.
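
[A minimal sketch of this worked example; the default body is an assumption added so the code compiles, and the hand-written instance shows the context that falls out.]

{-# LANGUAGE DefaultSignatures #-}

class C a where
  op :: Eq a => a -> a
  default op :: (Eq a, Show a) => a -> a
  op x = if x == x then x else error (show x)   -- assumed default body

-- Written out by hand, the instance DeriveAnyClass ought to produce:
-- Eq [a] is a given from op's own signature, and Show [a] reduces to
-- Show a, so Show a is the whole context.
instance Show a => C [a]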
Simon