
On Thu, Oct 29, 2009 at 8:37 AM, Philip K.F.
Dear GHCers,
On Wed, 2009-10-28 at 12:14 +0000, John Lato wrote:
This had me confused for a while, but I think I've worked out what's happening. (+) is polymorphic, ...
Oh darn, it sounds like you're right. And polymorphism is so common. I just came up with that example randomly as the first nontrivial type-error-with-a-lambda I could think of...
I think this is a great example of why the current type errors are not as helpful as they could be. The code where the type checker determines 'x' has type [a] is several steps removed from where the error arises. This is how I understand this process (I've probably left out some details):
I am a little ambivalent on this issue; I usually find GHC's type errors make me realize what I did wrong very quickly, at least until I start messing with combinations of GADTs, type classes and type families. When I've looked at an error for too long without understanding what's happening, I usually look for ways to express the same problem with simpler constructs.
This case has me stumped, though.
1. checker infers x :: [a] from 'length x'
2. checker infers (3 + x) :: [a] from (+) and step 1
3. checker infers the second (+) :: [a] -> [a] -> [a] from step 2
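
The original lambda isn't quoted in this message, so purely as a reconstruction of my own, an expression that goes wrong in exactly the way those three steps describe would be something like this (the name 'broken' is mine, and the definition is of course rejected, which is the point):

    broken = \x -> 3 + x + length x
    --  length x   forces x :: [a]                         (step 1)
    --  3 + x      must then be [a], under a Num [a] constraint  (step 2)
    --  the outer (+) therefore gets type [a] -> [a] -> [a]      (step 3)
    --  ...so the Int produced by 'length x' fails to match [a], and the
    --  complaint ends up pointing somewhere around 'length x', well away
    --  from the real mistake of using x both as a number and as a list.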
Pardon? Judging from the error GHC gives, you must be right, but isn't *this* the point where things should go wrong? I'm not too intimately familiar with the type checker's internals, so this example leads me to speculate that "normal" types are inferred and checked *before* type class constraints are evaluated.
This "(+) :: [a] -> [a] -> [a]" seems wrong from an intuitive sense, but the type checker doesn't know that. We know that (+) :: Num t => t -> t -> t It's completely legal (though maybe ill-advised) to construct a Num instance for [a]. Even if that instance isn't in scope when this code is compiled, it could be added later. I should probably wait for a GHC guru to respond to this one, because I'm in the realm of speculation here, but I expect that "normal" types need to be inferred, unified, and checked before type class constraints can be applied. In the case where a constraint is necessary, either the type is a concrete type, e.g. Int, Char, for which the class instance must be in scope, or the type is polymorphic, e.g. [a], in which case the constraint must be passed up to the context of where it's used. The compiler doesn't know which is the case (or exactly what context is necessary) until it's finished checking the "normal" types.
However, I would have wanted this error:
Prelude> [1] + [2]
<interactive>:1:0:
    No instance for (Num [t])
      arising from a use of `+' at <interactive>:1:0-8
    Possible fix: add an instance declaration for (Num [t])
    In the expression: [1] + [2]
    In the definition of `it': it = [1] + [2]
In other words: isn't the problem in this case that the type checker does not gather all the information (here, that there is no Num instance for lists) needed to give the proper error? Is it even possible to gather type class information after the "normal" types have already conflicted?
Just because a type class instance isn't in scope doesn't mean it will never be in scope. In order for this to work, you'd need to distinguish between code that will never be linked to from elsewhere and code that will. I don't think this is possible without completely changing the compilation chain. GHCi can do it because, as an interpreter, it doesn't produce object code to be linked to anyway.

John
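
To sketch that open-world point (my own illustration, not anything from the original discussion): GHC will happily compile a use of (+) at a list type with no such instance anywhere in sight, because the missing instance simply becomes a constraint for some future caller to discharge. FlexibleContexts is only needed to let the non-Haskell-98 constraint appear in the signature:

    {-# LANGUAGE FlexibleContexts #-}
    module Defer where

    -- No Num [a] instance exists here, yet this compiles: the missing
    -- instance turns into a constraint on the caller, possibly in a
    -- module that hasn't even been written yet.
    doubleAll :: Num [a] => [a] -> [a]
    doubleAll xs = xs + xs

Only a call site that fixes the element type, say doubleAll [1, 2 :: Int], forces GHC to go looking for a Num [Int] instance and report the "No instance" error if none is in scope there.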