
Hal Daume III
So I love the fact that I can derive anything I want on newtypes. However, there seem to be problems with it. If I write:
newtype Foo = Foo Int deriving (Show)
x = show (Foo 5)
Then x is "Foo 5"
However, if I do
newtype Foo = Foo Int deriving (Num)
x = show (Foo 5)
Then the definition of 'x' is rejected with an error saying there is no Show instance for Foo.
However (this is where it gets weird). If I do:
newtype Foo = Foo Int deriving (Num)
x = show ((Foo 5) + 4)
then x is *valid* and has value "9", not "Foo 9".
Did you check the type of x? (I'd do it myself, but I just found out that I no longer have GHCi after my 5.04 upgrade.) Did you try different compilers? My guess is that there is an automatically derived instance of Show (which, annoyingly, is a superclass of Num) after all, and that it is constructed as (show . toInteger) or something like that.
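For reference, here is a sketch of how the example behaves when Show is listed explicitly alongside Num. The GeneralizedNewtypeDeriving pragma is assumed, since Haskell 98 does not allow deriving Num on a newtype at all; with it, the stock-derived Show prints the constructor, so there is no ambiguity about what x should be.

```haskell
{-# LANGUAGE GeneralizedNewtypeDeriving #-}

-- Show is stock-derived (prints the constructor);
-- Num is lifted from Int by GeneralizedNewtypeDeriving.
newtype Foo = Foo Int deriving (Show, Num)

x :: String
x = show (Foo 5 + 4)   -- the literal 4 is fromInteger 4 :: Foo

main :: IO ()
main = putStrLn x      -- prints "Foo 9", not "9"
```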
Moreover, I can even do:
x = show ((5::Foo) + 4)
Same thing, isn't it? 5 :: Foo = fromInteger 5 :: Foo = Foo 5
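That chain of equalities can be checked directly. A small sketch, again assuming GeneralizedNewtypeDeriving (the deriving clause and the check function are mine, not from the thread):

```haskell
{-# LANGUAGE GeneralizedNewtypeDeriving #-}

newtype Foo = Foo Int deriving (Show, Num, Eq)

-- An integer literal at type Foo is desugared through fromInteger,
-- which (via the lifted Num instance) is the same as applying Foo.
check :: Bool
check = (5 :: Foo) == fromInteger 5
     && fromInteger 5 == Foo 5

main :: IO ()
main = print check
```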
IMO, the correct value of "show (Foo 5)" in every one of the cases above is "Foo 5". I think the deriving mechanism should be changed in two ways.
I agree.
(1) If you derive X on a newtype and X is a subclass of Y, you must have an instance of Y on that newtype. This is how it works on data types, and I think makes sense.
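The analogy with data types can be made concrete with a standard superclass pair: deriving Ord (stock, plain Haskell 98) requires an Eq instance, so Eq must appear in the deriving clause too. A minimal sketch (Bar is a made-up example type):

```haskell
-- Ord is a subclass of Eq, so Eq must be derived as well;
-- writing "deriving (Ord)" alone is rejected, exactly as it
-- would be for an ordinary data declaration.
newtype Bar = Bar Int deriving (Eq, Ord, Show)

y :: Bool
y = Bar 3 < Bar 5   -- comparison uses the derived Ord instance

main :: IO ()
main = print y
```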
It could lead to rather long 'deriving' lists with a deep class hierarchy, but apart from that, yeah.

-kzm
--
If I haven't seen further, it is by standing in the footprints of giants