
On Mon, Jul 30, 2007 at 11:47:46AM +0100, Jon Fairbairn wrote:
> ChrisK writes:
> > And the readability is destroyed because you cannot do any type
> > inference in your head.  If you see
> >
> >     { Matrix m = ....; Matrix x = m * y; ...; }
> >
> > then you know very little about the possible types of y, since you
> > can only conclude that:
> [snippage]
>
> This is all very horrid, but as far as I can tell what I was proposing
> wouldn't lead to such a mess, except possibly via defaulting, which, as
> the least important aspect of the idea, could easily be abandoned.
What your suggestion would do is make the type inferred for every
pattern-matched function polymorphic, which means that in order to
determine the correctness of a function you'd need to examine all the
other modules.  Similarly, if you fail to include a type signature on
some simple pattern-matched function in a where clause, adding an import
of another module could make that function fail to compile, with a type
error that's hard to pin down.

This isn't as horrid as C++, but it also isn't nearly as beautiful as
Haskell.  Admittedly, adding a type signature will make a function
verifiably correct and avoid any of these ambiguities, but we really
like type inference, and it'd be a shame to introduce code that makes
type inference less powerful.  True, one could always forbid people from
using the View class, but that rather defeats the purpose, and starts
sounding once more like C++, where there are language features that
"shouldn't" be used...

But just imagine what would happen to your type checking if someone
decided it'd be clever to use [a] as a view for Integer with a Peano
representation.  Yikes!  (Or Integer as a view for [a], describing its
length?)  Admittedly, havoc would also be wreaked if someone declared
[a] to be an instance of Num, and that's the risk one takes when using
type classes... but that's why it's nice that there is a convenient way
to write code that *doesn't* use type classes.
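To make the worry concrete, here's a minimal sketch of the kind of
instance I have in mind.  I'm only guessing at the shape of the View
class (a two-parameter class with a single 'view' method); the module,
instance, and function names are made up for illustration and aren't
part of anyone's actual proposal.

{-# LANGUAGE MultiParamTypeClasses, FlexibleInstances, FlexibleContexts #-}
module ViewHavoc where

-- A stand-in for the View class being discussed; I'm assuming it looks
-- roughly like this, with 'view' converting a value into the type whose
-- constructors we then pattern-match against.
class View a b where
  view :: a -> b

-- The "[a] as a view for Integer" example, in a Peano-ish encoding:
-- 3 is viewed as [(), (), ()] and 0 as [].
instance View Integer [()] where
  view n = replicate (fromInteger n) ()

-- Under the proposal, an innocent-looking definition like
--   isEmpty []    = True
--   isEmpty (_:_) = False
-- would presumably be inferred at some polymorphic type along these
-- lines, so what it means depends on which View instances are in scope.
-- (I've pinned the element type to () just so this file compiles; the
-- fully general type, View t [a] => t -> Bool, is ambiguous in 'a',
-- which is part of the problem.)
isEmpty :: View t [()] => t -> Bool
isEmpty x = case view x of
              []       -> True
              (() : _) -> False

-- With the instance above in scope, this typechecks, and it is now a
-- question about an Integer rather than a list:
zeroIsEmpty :: Bool
zeroIsEmpty = isEmpty (0 :: Integer)   -- evaluates to True

-- 
David Roundy
Department of Physics
Oregon State University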