
On 02/23/2015 03:46 PM, Stuart Hungerford wrote:
> instance (AddSemigroup a, AddSemigroup b) => AddSemigroup (a, b)
>
> GHC says it "Could not deduce (Num (a, b))" in this situation, which seems fair enough, so I tried:
>
> instance (AddSemigroup a, Num a, AddSemigroup b, Num b) => AddSemigroup (a, b)

Since you've defined the "AddSemigroup plus" to be the "Num plus", when you attempt to make (a, b) an instance of AddSemigroup, GHC tries to use the "Num plus" on the pair, i.e.

    (x1, y1) + (x2, y2) = ???

Since (a, b) isn't an instance of Num, it doesn't know what to do here. It's obvious that you want

    (x1, y1) + (x2, y2) = (x1 + x2, y1 + y2)

(which is fine, since both 'a' and 'b' are instances of Num), but you could just as well declare

    (x1, y1) + (x2, y2) = (x1 * x2, y1 * y2)

The compiler can't choose between them, so declaring 'a' and 'b' as instances of Num isn't enough. You really have to tell it how to add the pair.
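
One way to do that is to give the pair instance an explicit, componentwise definition of plus instead of relying on a Num-based default at all. This is only a sketch; the shape of the AddSemigroup class below is my guess, since its definition isn't quoted in this thread:

    -- Class shape assumed; the original definition isn't quoted here.
    class AddSemigroup a where
      plus :: a -> a -> a

    -- Explicit componentwise addition for pairs: no Num (a, b)
    -- instance is needed, because we never fall back on (+).
    instance (AddSemigroup a, AddSemigroup b) => AddSemigroup (a, b) where
      plus (x1, y1) (x2, y2) = (plus x1 x2, plus y1 y2)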

> With the same result. Separate instances seem to work though:
>
> instance (Num a, Num b) => Num (a, b)
>
> instance (AddSemigroup a, AddSemigroup b) => AddSemigroup (a, b)

Now it knows how to add (x1, y1) and (x2, y2), since (a, b) is an instance of Num. Note that the compiler won't infer the pair instance from the individual ones, though: you have to declare Num (a, b) yourself.
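
To make that concrete, here is a sketch of how the two-instance version fits together. The class definition is again an assumption on my part (the thread doesn't quote it); I've given it a Num superclass with plus = (+) as the default, which is consistent with the "Could not deduce (Num (a, b))" error you saw:

    -- Class shape assumed: Num superclass, plus defaults to Num's (+).
    class Num a => AddSemigroup a where
      plus :: a -> a -> a
      plus = (+)

    -- Componentwise Num instance for pairs; fromInteger puts the same
    -- value in both slots.
    instance (Num a, Num b) => Num (a, b) where
      (x1, y1) + (x2, y2) = (x1 + x2, y1 + y2)
      (x1, y1) * (x2, y2) = (x1 * x2, y1 * y2)
      negate (x, y)       = (negate x, negate y)
      abs (x, y)          = (abs x, abs y)
      signum (x, y)       = (signum x, signum y)
      fromInteger n       = (fromInteger n, fromInteger n)

    -- Now the pair instance needs no body: its default plus = (+)
    -- resolves to the Num (a, b) instance above.
    instance (AddSemigroup a, AddSemigroup b) => AddSemigroup (a, b)

    instance AddSemigroup Int

    -- ghci> plus (1, 2) (3, 4) :: (Int, Int)
    -- (4,6)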