
So I love the fact that I can derive anything I want on newtypes.
However, there seem to be problems with it. If I write:

    newtype Foo = Foo Int deriving (Show)
    x = show (Foo 5)

then x is "Foo 5". However, if I do

    newtype Foo = Foo Int deriving (Num)
    x = show (Foo 5)

then the definition of 'x' is rejected, claiming that there is no Show
instance for Foo. However (this is where it gets weird), if I do:

    newtype Foo = Foo Int deriving (Num)
    x = show ((Foo 5) + 4)

then x is *valid* and has value "9", not "Foo 9". Moreover, I can
even do:

    x = show ((5::Foo) + 4)

and this x has value "9". Furthermore, if I do:

    newtype Foo = Foo Int deriving (Show,Num)

then "show (Foo 5)" has value "5", not "Foo 5". IMO, the correct
value of "show (Foo 5)", no matter which of the above we have, is
"Foo 5".

I think the deriving mechanism should be changed in two ways:

(1) If you derive X on a newtype and X is a subclass of Y, you must
    have an instance of Y on that newtype. This is how it works on
    data types, and I think it makes sense. For instance,

        newtype Foo = Foo Int deriving (Ord)

    should be just as illegal as

        data Foo = Foo Int deriving (Ord)

(2) The derived Show instances for newtypes in the presence of
    derived Num instances should be fixed so that the constructor is
    also printed. Rationale: adding something to the list of derived
    classes should *not* change previously derived instances.

There is yet another thing. If we have:

    newtype Foo = Foo Int deriving (Num)
    instance Show Foo where { show = undefined }

then the value of 'show (Foo 5)' is undefined, but the value of
'show (5::Foo)' is "5". Definitely *WRONG*.

 - Hal

--
Hal Daume III                                     | hdaume@isi.edu
 "Computer science is no more about computers     |
  than astronomy is about telescopes." -Dijkstra  | www.isi.edu/~hdaume
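[For reference, the behaviour argued for above can be obtained by writing the
Show instance by hand instead of deriving it. A minimal sketch, assuming GHC
with the GeneralizedNewtypeDeriving extension for the derived Num part:]

```haskell
{-# LANGUAGE GeneralizedNewtypeDeriving #-}

newtype Foo = Foo Int deriving (Num)

-- Hand-written Show instance: always print the constructor, adding
-- parentheses when the precedence context demands them (d > 10, the
-- precedence of function application).
instance Show Foo where
  showsPrec d (Foo n) =
    showParen (d > 10) $ showString "Foo " . showsPrec 11 n

main :: IO ()
main = do
  print (Foo 5)       -- prints "Foo 5", not "5"
  print (Foo 5 + 4)   -- the literal 4 becomes Foo 4 via the derived
                      -- Num's fromInteger, so this prints "Foo 9"
```

With this instance, show agrees with what a Haskell-98-style derived Show
would produce, regardless of whether Num is also in the deriving list.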