A few answers back, Matthew Coolbeth wrote, 

(show v), where v is of type Test, will evaluate to (showsPrec v), and (showsPrec v) will in turn evaluate to (show v).  It should be clear that this is a cyclic evaluation that will not terminate.
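
For context, the default method definitions in base's Show class are
mutually recursive (roughly):

    class Show a where
      showsPrec :: Int -> a -> ShowS
      show      :: a -> String
      -- Each default is defined in terms of the other method:
      showsPrec _ x s = show x ++ s
      show x          = showsPrec 0 x ""

An instance that overrides at least one of the two breaks the cycle; an
empty instance keeps both defaults, so either call loops.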

Presumably that was intentional, but why was it necessary?  It seems like strange coding.  Furthermore, why isn't

instance Show Test

interpreted to mean

data Test = Test deriving (Show)
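
To illustrate the contrast (a sketch):

    -- Derived instance: GHC generates showsPrec, so this terminates:
    data Test = Test deriving (Show)
    -- show Test  ==>  "Test"

    -- Empty instance: both methods keep their mutually recursive
    -- defaults, so (show Test2) never terminates:
    data Test2 = Test2
    instance Show Test2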

-- Russ 



On Tue, Nov 16, 2010 at 10:27 AM, Tobias Brandt <tob.brandt@googlemail.com> wrote:
On 16 November 2010 19:17, Edward Z. Yang <ezyang@mit.edu> wrote:
> It seems to me that there is a possibility we could reify some information
> that is traditionally specified in the documentation: that is, what
> functions must be defined by a minimal instance, which could then give
> GHC enough information to give meaningful warnings if not all functions
> for a minimal instance are provided.

One could use a compiler pragma that defines possible sets of minimal
definitions, e.g.
{-# MINIMAL_DEF Num ((+), (*), abs, signum, fromInteger, (-)),
                    ((+), (*), abs, signum, fromInteger, negate) #-}

One could even add logical notation, like:
{-# MINIMAL_DEF Num ((+), (*), abs, signum, fromInteger, (-) || negate) #-}
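
The (-) || negate disjunction mirrors the actual defaults in base's Num
class, where each of the two is defined in terms of the other, so an
instance only needs to supply one of them.  A sketch of how the proposed
(hypothetical) pragma might sit alongside the class:

    class Num a where
      (+), (-), (*) :: a -> a -> a
      negate        :: a -> a
      abs, signum   :: a -> a
      fromInteger   :: Integer -> a
      -- Mutually recursive defaults: an instance must override
      -- at least one of (-) and negate to break the cycle.
      x - y    = x + negate y
      negate x = fromInteger 0 - x

    {-# MINIMAL_DEF Num ((+), (*), abs, signum, fromInteger, (-) || negate) #-}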