
http://www.haskell.org/onlinereport/decls.html#default-decls
http://www.haskell.org/tutorial/numbers.html#sect10.4
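In short, those pages describe Haskell's defaulting rule: an ambiguous type variable whose constraints all come from standard classes, at least one of them numeric, is resolved by trying each type in the module's default list (Integer, then Double, unless a default declaration says otherwise). Below is a minimal sketch of that behaviour; the module name is made up, and the explicit default declaration is only for illustration, since it is already the implicit default in every module.

module DefaultingSketch where

default (Integer, Double)   -- the implicit default list, written out explicitly

-- Here "1" has the ambiguous type (Num a, Show a) => a.  Both
-- constraints are standard classes and Num is numeric, so the
-- compiler tries the default list and settles on Integer:
-- this is show (1 :: Integer), producing "1".
ambiguousShown :: String
ambiguousShown = show 1

-- (1+2) defaults to Integer the same way before show is applied.
sumShown :: String
sumShown = show (1 + 2)

main :: IO ()
main = putStrLn (ambiguousShown ++ ", " ++ sumShown)
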
On 1/12/06, Jeff.Harper@handheld.com wrote:
What does the Haskell type system do with expressions such as these?

    show 1
    show (1+2)
The types of the subexpressions "1" and "1+2" are "ambiguous", since they have type "(Num a) => a". I'm under the assumption that before "1+2" is evaluated, the "1" and "2" must be coerced into a "concrete" type such as Int, Integer, Double, etc., and that before "show 1" is evaluated, the "1" must be coerced into a "concrete" type. Is my assumption correct? If so, how does Haskell know which type to coerce the subexpressions into?
If I try to write a new function, "my_show", which converts an expression into a string representation that includes type information, I run into errors with expressions like "my_show 1" and "my_show (1+2)" because of the type ambiguity.
class (Show a) => My_show a where
    my_show :: a -> String

instance My_show Int where
    my_show a = show a ++ " :: Int"

instance My_show Integer where
    my_show a = show a ++ " :: Integer"
I can avoid the errors if I change it to "my_show (1::Int)" or "my_show ((1+2)::Int)". I'm wondering what the difference is between my_show and Haskell's built-in show that causes my_show to produce an error message when it is used with ambiguous types, while Haskell's show works fine with them.
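For what it's worth, the difference is exactly the defaulting rule above: in Haskell 98 an ambiguous type variable is only defaulted when every class constraining it is defined in the Prelude or a standard library, so the user-defined My_show blocks defaulting and "my_show 1" stays ambiguous, while "show 1" defaults to Integer. (GHC's ExtendedDefaultRules extension relaxes this somewhat.) Below is a hedged, self-contained sketch of the workaround mentioned in the quoted message; the module name and main are made up for illustration.

module MyShowSketch where

class (Show a) => My_show a where
    my_show :: a -> String

instance My_show Int where
    my_show a = show a ++ " :: Int"

instance My_show Integer where
    my_show a = show a ++ " :: Integer"

main :: IO ()
main = do
    -- putStrLn (my_show 1)                  -- rejected: My_show is not a
                                             -- standard class, so the ambiguous
                                             -- literal is never defaulted
    putStrLn (my_show (1 :: Int))            -- prints 1 :: Int
    putStrLn (my_show ((1 + 2) :: Integer))  -- prints 3 :: Integer
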
--
jupdike@gmail.com
http://www.updike.org/~jared/
reverse ")-:"