deriving weirdness on newtypes

So I love the fact that I can derive anything I want on newtypes. However, there seem to be problems with it. If I write:

    newtype Foo = Foo Int deriving (Show)
    x = show (Foo 5)

then x is "Foo 5". However, if I do

    newtype Foo = Foo Int deriving (Num)
    x = show (Foo 5)

then the definition of 'x' claims that there is no Show instance for Foo. However (this is where it gets weird), if I do:

    newtype Foo = Foo Int deriving (Num)
    x = show ((Foo 5) + 4)

then x is *valid* and has value "9", not "Foo 9". Moreover, I can even do:

    x = show ((5::Foo) + 4)

and this x has value "9". Furthermore, if I do:

    newtype Foo = Foo Int deriving (Show,Num)

then "show (Foo 5)" has value "5", not "Foo 5".

IMO, the correct value of "show (Foo 5)", no matter which of the above we have, is "Foo 5". I think the deriving mechanism should be changed in two ways.

(1) If you derive X on a newtype and X is a subclass of Y, you must have an instance of Y on that newtype. This is how it works on data types, and I think it makes sense. For instance,

    newtype Foo = Foo Int deriving (Ord)

should be just as illegal as

    data Foo = Foo Int deriving (Ord)

(2) The derived Show instances for newtypes in the presence of derived Num instances should be fixed so that the constructor is also printed. Rationale: adding something to the list of derived classes should *not* change previously derived instances.

There is yet another thing. If we have:

    newtype Foo = Foo Int deriving (Num)
    instance Show Foo where { show = undefined }

then the value of 'show (Foo 5)' is undefined, but the value of 'show (5::Foo)' is "5". Definitely *WRONG*.

 - Hal

--
Hal Daume III
 "Computer science is no more about computers | hdaume@isi.edu
  than astronomy is about telescopes." -Dijkstra | www.isi.edu/~hdaume
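[A minimal file reproducing that last experiment (a sketch: the pragma is the old spelling of GHC's extended newtype deriving, written GeneralizedNewtypeDeriving in later GHCs, and the result shown is the behaviour reported here under GHC 5.04; other versions may differ):

    {-# OPTIONS -fglasgow-exts #-}
    module Main where

    newtype Foo = Foo Int deriving (Num)

    -- The only Show instance in the source is deliberately undefined.
    instance Show Foo where { show = undefined }

    main :: IO ()
    main = putStrLn (show (5 :: Foo))
    -- Reported result: prints "5", bypassing the instance above,
    -- while show (Foo 5) instead hits the undefined.
]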

Hal Daume III wrote:
> So I love the fact that I can derive anything I want on newtypes.
> However, there seem to be problems with it. If I write:
>
>     newtype Foo = Foo Int deriving (Show)
>     x = show (Foo 5)
>
> then x is "Foo 5". However, if I do
>
>     newtype Foo = Foo Int deriving (Num)
>     x = show (Foo 5)
>
> then the definition of 'x' claims that there is no Show instance for Foo.
> However (this is where it gets weird), if I do:
>
>     newtype Foo = Foo Int deriving (Num)
>     x = show ((Foo 5) + 4)
>
> then x is *valid* and has value "9", not "Foo 9".
Did you check the type of x? (I'd do it myself, but I just found out that I no longer have GHCi after my 5.04 upgrade.) Did you try different compilers? My guess is that there is an automatically derived Show instance after all (Show annoyingly being a superclass of Num), and that it is constructed as (show . toInteger) or something like that.
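Concretely, a constructed instance along those lines, one that just reuses the underlying Int's Show, would reproduce the "5" and "9" outputs (a sketch, not necessarily what GHC actually generates):

    newtype Foo = Foo Int

    -- Show by coercion to the representation type:
    -- the constructor is never printed.
    instance Show Foo where
      showsPrec p (Foo n) = showsPrec p n

    -- show (Foo 5) == "5"
    -- show (Foo 9) == "9"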
> Moreover, I can even do:
>
>     x = show ((5::Foo) + 4)
Same thing, isn't it? 5::Foo is fromInteger 5 :: Foo, which is Foo 5.
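Spelled out, since an integer literal is shorthand for an application of fromInteger (Haskell 98 report):

    -- At type Foo, with Num derived by coercion from Int:
    five :: Foo
    five = fromInteger 5    -- builds Foo 5
    -- hence (5::Foo) + 4 and (Foo 5) + 4 denote the same value, Foo 9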
> IMO, the correct value of "show (Foo 5)", no matter which of the above we
> have, is "Foo 5". I think the deriving mechanism should be changed in two
> ways.
I agree.
> (1) If you derive X on a newtype and X is a subclass of Y, you must have
> an instance of Y on that newtype. This is how it works on data types, and
> I think it makes sense.
It could lead to rather long 'deriving' lists with a deep hierarchy, but apart from that, yeah.

-kzm

--
If I haven't seen further, it is by standing in the footprints of giants
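[To see the data-type rule point (1) appeals to: under Haskell 98, a derived Ord instance requires an Eq instance for the same type (Bar and Baz are made-up names):

    data Bar = Bar Int deriving (Ord)       -- rejected: derived Ord
                                            -- requires an Eq Bar instance
    data Baz = Baz Int deriving (Eq, Ord)   -- accepted: Eq derived alongside
]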
participants (2):
 - Hal Daume III
 - ketil@ii.uib.no