
Hi,

Is there some way to tell GHC how to interpret numeric literals? I would
like it to interpret 1 as 1 :: Integer, not as fromInteger (1 :: Integer).

I have been playing with the following (rather ugly) code:

{-# OPTIONS -XFunctionalDependencies -XMultiParamTypeClasses -XTypeSynonymInstances #-}
module Test where

import Prelude (Integer, Double)
import qualified Prelude as P

default (Integer, Double)

type Z = Integer
type R = Double

class Plus a b c | a b -> c where
  (+) :: a -> b -> c

instance Plus Z Z Z where
  j + k = j P.+ k

instance Plus R R R where
  x + y = x P.+ y

instance Plus R Z R where
  x + j = x P.+ P.fromInteger j

instance Plus Z R R where
  j + x = P.fromInteger j P.+ x

x = (1 :: Z) + (1.0 :: R)
--y = 1 + 1.0

The commented-out line does not compile (because of ambiguous types, I think).

- Lauri
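
P.S. One direction that might get closer to what the first paragraph asks
for is GHC's rebindable syntax (the -XRebindableSyntax flag in newer GHCs;
older GHCs got the same rebinding behaviour from -XNoImplicitPrelude).
The sketch below is untested and independent of the Plus class above, and
the module name LitSketch and its definitions are just for illustration.
With rebindable syntax a literal 1 elaborates to fromInteger 1 using
whatever fromInteger is in scope, so a monomorphic fromInteger forces the
literal to be an Integer rather than a polymorphic Num value.

{-# LANGUAGE RebindableSyntax #-}
module LitSketch where

-- RebindableSyntax implies NoImplicitPrelude, so imports are explicit.
import Prelude (Integer, Double, Rational, id)
import qualified Prelude as P

-- Integer literals are desugared with the fromInteger in scope here,
-- so 1 is pinned to Integer instead of (P.Num a => a).
fromInteger :: Integer -> Integer
fromInteger = id

-- Fractional literals likewise go through the in-scope fromRational.
fromRational :: Rational -> Double
fromRational = P.fromRational

one :: Integer
one = 1          -- just 1 :: Integer, no class constraint

half :: Double
half = 0.5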