
Apart from the recent Monoid problems, I'm a bit concerned about the non-portable ReadP creeping into some modules, making them non-H98. This affects e.g. Cabal and even something as simple as System.Info (which incorrectly claims that it is portable). IMHO we should really keep at least those portable, even if this would involve a little bit more work. Is local universal quantification really needed for those simple parsing tasks? It looks a bit like overkill here... Cheers, S.

On Fri, Feb 25, 2005 at 11:46:42PM +0100, Sven Panne wrote:
Apart from the recent Monoid problems, I'm a bit concerned about the non-portable ReadP creeping into some modules, making them non-H98. This affects e.g. Cabal and even something as simple as System.Info (which incorrectly claims that it is portable). IMHO we should really keep at least those portable, even if this would involve a little bit more work. Is local universal quantification really needed for those simple parsing tasks? It looks a bit like overkill here...
Cabal has lots of compatibility gunk to preserve portability. It uses Distribution.Compat.ReadP, which presents a Haskell 98 interface, implemented using Text.ParserCombinators.ReadP for those that have it and re-implemented for the rest. So even though it's forcing a -98 in Hugs, it is portable, i.e. offered by every system with the same interface. Data.Version, on the other hand, has a compiler-dependent interface:

#if __GLASGOW_HASKELL__ >= 603 || __HUGS__
parseVersion :: ReadP Version
#elif __NHC__
parseVersion :: ReadPN r Version
#else
parseVersion :: ReadP r Version
#endif

System.Info imports that, but also presents a Haskell 98 interface.

ross@soi.city.ac.uk wrote:
Cabal has lots of compatibility gunk to preserve portability. It uses Distribution.Compat.ReadP, which presents a Haskell 98 interface, implemented using Text.ParserCombinators.ReadP for those that have it and re-implemented for the rest. [...]
OK, then let's reformulate my question: Why do we have a non-H98 ReadP at all? If I see it correctly, it is only to save one type parameter, but this looks like a bad trade-off. We could deprecate the current non-H98 ReadP then (keeping it for backwards compatibility) and offer a new pure H98 version. This would avoid the current #ifdef horror and keep the interfaces compiler-independent, both are worthy goals IMHO... Cheers, S.

On Feb 26, 2005, at 11:02 AM, Sven Panne wrote:
ross@soi.city.ac.uk wrote:
Cabal has lots of compatibility gunk to preserve portability. It uses Distribution.Compat.ReadP, which presents a Haskell 98 interface, implemented using Text.ParserCombinators.ReadP for those that have it and re-implemented for the rest. [...]
OK, then let's reformulate my question: Why do we have a non-H98 ReadP at all?
My view: The existential types ensure that the continuation is used correctly. That's a pretty good reason to use them (rather than an extra type parameter) during development. Once there's a working library, why make changes? This is true of an awful lot of libraries which use existential types, by the way. We could add extra witness type parameters, and the code would work without modification. Some libraries even use a closed set of types existentially, permitting them to be eliminated completely. All those libraries then become H98 as if by magic. The problem? Clunkier types, fewer guarantees, lots of opportunities to introduce bugs. That said, I'd love to see more H98 versions of non-H98 libraries. Often a bit of thought can eliminate gratuitous use of a language feature which seemed important during development. -Jan-Willem Maessen
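To make the trade-off concrete, here is a minimal sketch contrasting the two styles. The names and the heavily simplified parser machine are illustrative only, not the actual Text.ParserCombinators.ReadP or Distribution.Compat.ReadP source:

-- A minimal sketch, not library code: a toy parser machine.
data P a = Get (Char -> P a)   -- demand one character of input
         | Result a (P a)      -- report a result and keep going
         | Fail                -- dead end

-- The non-H98 ReadP hides the continuation's answer type behind local
-- universal quantification (a rank-2 type):
--
--   newtype ReadP a = R (forall b. (a -> P b) -> P b)
--
-- The Haskell 98 alternative threads the answer type as an extra parameter:
newtype ReadP r a = R ((a -> P r) -> P r)

returnP :: a -> ReadP r a
returnP a = R (\k -> k a)

bindP :: ReadP r a -> (a -> ReadP r b) -> ReadP r b
bindP (R m) f = R (\k -> m (\a -> case f a of R m' -> m' k))

get :: ReadP r Char
get = R Get

The combinators are written identically in both versions; only the type of ReadP changes, which is the extra type parameter being discussed.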

Isn't H98 a bit old now... Surely some of the common extensions are well known and useful enough to be standardised... I read somewhere that the Haskell standards committee has been disbanded, and that the consensus of implementers would replace it. Surely all that has to happen is that ghc, hugs, nhc (and others) get together and recognise some extensions (say multi-parameter types, fundeps) as compliant with Haskell-2005 and we have an updated standard? Keean. Sven Panne wrote:
Apart from the recent Monoid problems, I'm a bit concerned about the non-portable ReadP creeping into some modules, making them non-H98. This affects e.g. Cabal and even something as simple as System.Info (which incorrectly claims that it is portable). IMHO we should really keep at least those portable, even if this would involve a little bit more work. Is local universal quantification really needed for those simple parsing tasks? It looks a bit like overkill here...
Cheers, S.

Keean Schupke wrote:
Isn't H98 a bit old now... Surely some of the common extensions are well known and useful enough to be standardised... I read somewhere that the Haskell standards committee has been disbanded, and that the consensus of implementers would replace it. Surely all that has to happen is that ghc, hugs, nhc (and others) get together and recognise some extensions (say multi-parameter types, fundeps) as compliant with Haskell-2005 and we have an updated standard?
From the discussions in recent years it is far from clear to me what would be an accepted extension. The current type system extensions interfere with each other and it looks like some can be emulated by others. Furthermore, nhc98 implements almost none of these extensions, but I guess Malcolm would happily accept any volunteers. :-) Furthermore, there's still hbc, tools which operate on Haskell sources, etc. etc. Keeping the non-H98 part of the standard libraries low is still a worthy goal, even in 2005... Note that I'm not against any extensions, I'd be more than happy if a new revision of H98 would emerge, but I seriously doubt that simply declaring what 2 implementations provide as a standard is the right way to proceed. Perhaps we should make a few more carefully selected addenda to the H98 report. And while I'm at this topic: Shouldn't hierarchical modules be promoted to an official, finalised addendum? There hasn't been much dispute about this topic in the last few years... :-) Cheers, S.

On Sat, Feb 26, 2005 at 08:19:06PM +0100, Sven Panne wrote:
From the discussions in recent years it is far from clear to me what would be an accepted extension. The current type system extensions interfere with each other and it looks like some can be emulated by others. Furthermore, nhc98 implements almost none of these extensions, but I guess Malcolm would happily accept any volunteers. :-) Furthermore, there's still hbc, tools which operate on Haskell sources, etc. etc. Keeping the non-H98 part of the standard libraries low is still a worthy goal, even in 2005...
Note that I'm not against any extensions, I'd be more than happy if a new revision of H98 would emerge, but I seriously doubt that simply declaring what 2 implementations provide as a standard is the right way to proceed. Perhaps we should make a few more carefully selected addenda to the H98 report.
Indeed. Of all the extensions implemented by both GHC and Hugs, the only ones that seem ready are
- rank 2 type signatures, and
- polymorphic components for data constructors (giving them rank 2 types).
The extensions to the type system are well understood, the operational semantics is unaffected, and they get you useful things like runST, Church encodings and polymorphic continuation monads (like ReadP). Though polymorphic components do complicate the denotational semantics.
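For readers who haven't met these two, here are small illustrative examples; the names are invented for the example and the LANGUAGE pragma is just the modern spelling of what -fglasgow-exts enabled at the time:

{-# LANGUAGE RankNTypes #-}

-- Rank-2 type signature: the argument is itself polymorphic, so it can be
-- used at two different types in the body.
applyBoth :: (forall a. a -> a) -> (Int, Bool) -> (Int, Bool)
applyBoth f (i, b) = (f i, f b)

-- Polymorphic component: a data constructor whose field has a rank-2 type.
newtype NatTrans f g = NatTrans (forall a. f a -> g a)

headMaybe :: NatTrans [] Maybe
headMaybe = NatTrans (\xs -> case xs of { [] -> Nothing; (x:_) -> Just x })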

G'day all. Quoting ross@soi.city.ac.uk:
Indeed. Of all the extensions implemented by both GHC and Hugs, the only ones that seem ready are
- rank 2 type signatures, and
- polymorphic components for data constructors (giving them rank 2 types).
Off the top of my head:
- multi-parameter type classes
- pattern guards
- scoped type variables
- recursive "do"
- data declarations with no constructors
- constraints on typeclass methods
- instances on type synonyms
Even multi-parameter type classes are pretty well understood. Cheers, Andrew Bromage

On Sat, Feb 26, 2005 at 08:00:22PM -0500, ajb@spamcop.net wrote:
Quoting ross@soi.city.ac.uk:
Indeed. Of all the extensions implemented by both GHC and Hugs, the only ones that seem ready are
- rank 2 type signatures, and
- polymorphic components for data constructors (giving them rank 2 types).
Off the top of my head:
- multi-parameter type classes
Reasonable in themselves, but limited in usefulness without some scheme to deal with overlapping instances, which doesn't seem settled at this time.
- pattern guards
GHC only
- scoped type variables
Different treatment in GHC and Hugs. Can you point at a formal calculus corresponding to the version in GHC?
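For reference, a minimal sketch of scoped type variables in the style GHC later settled on, where an explicit forall brings the variable into scope over the where-clause; this is illustrative only, and the GHC and Hugs treatments of the day differed in exactly how the variable gets bound:

{-# LANGUAGE ScopedTypeVariables #-}

prefixAll :: forall a. a -> [[a]] -> [[a]]
prefixAll x = map cons
  where
    cons :: [a] -> [a]     -- this 'a' is the same as the outer one
    cons ys = x : ys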
- recursive "do"
The treatment of recursion isn't obviously ideal: conflicts with ordinary do, and the semantics depends on the dependency analysis.
- data declarations with no constructors
Minor, but harmless.
- constraints on typeclass methods
- instances on type synonyms
Yes, these two and the relaxation of polymorphic recursion should have been in H98.

It appears to me that there are important questions that need to be settled, unless ghc is to become a language of its own, rather than an implementation. There's nothing wrong with having ghc specific features, but only if this is done on purpose. Lots of things can result from such a discussion. Maybe there isn't a consensus because there are really two necessary constructs rather than one. Perhaps a consensus can be reached that allows one language extension to handle the requirements that have been exposed during the process of developing the libraries. Would it not be a good idea to CC these messages onto whichever list is directly concerned with language extensions and changes? The issues may be exposed by library development, but they are clearly of interest to both the language developers and the library developers. ross@soi.city.ac.uk wrote:
On Sat, Feb 26, 2005 at 08:00:22PM -0500, ajb@spamcop.net wrote:
Quoting ross@soi.city.ac.uk:
Indeed. Of all the extensions implemented by both GHC and Hugs, the only ones that seem ready are
- rank 2 type signatures, and
- polymorphic components for data constructors (giving them rank 2 types).
Off the top of my head:
- multi-parameter type classes
Reasonable in themselves, but limited in usefulness without some scheme to deal with overlapping instances, which doesn't seem settled at this time.
- pattern guards
GHC only
True, it is GHC only. That is one of the advantages of developing additions to the standard that allow this usage and define it in a precise way. If pattern guards are important (I personally think they are), then they should be formally designated either as a GHC-specific extension to the language or as a valuable extension that deserves to become part of a standard on which compiler developers and library developers (both GHC and non-GHC) can reach a consensus. This would, in this particular case, result in a very interesting discussion. Do they truly make the language richer, or are there other equally effective ways to do the same thing? It is evident that the GHC developers added this feature because they believe that it is an important extension to the language.
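As a concrete point of comparison (the function and its name are made up for illustration): the same definition written with a pattern guard and in plain Haskell 98:

-- With pattern guards (GHC extension):
--
--   lookupDefault k d env
--     | Just v <- lookup k env = v
--     | otherwise              = d
--
-- One Haskell 98 rendering, via an explicit case expression:
lookupDefault :: Eq k => k -> v -> [(k, v)] -> v
lookupDefault k d env =
  case lookup k env of
    Just v  -> v
    Nothing -> d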
- scoped type variables
Different treatment in GHC and Hugs. Can you point at a formal calculus corresponding to the version in GHC?
- recursive "do"
The treatment of recursion isn't obviously ideal: conflicts with ordinary do, and the semantics depends on the dependency analysis.
Then this is a good opportunity to define the syntax and semantics of this facility so that it can be uniformly implemented in an unambiguous way that does not break the H98 do syntax. Since do is basically a shorthand for more complex and unwieldy code, it might be logical to map a typical recursive do onto syntax that does not rely on do. (In H98 everything that "do" can do can be done without "do". (That has to be a candidate for worst sentence of the year. :) ) If the semantics is not deterministic, that is an obvious case where a consensus on how recursive do works is certainly needed. I agree with Ross that this isn't a settled question, but IMHO it needs to be settled. One core principle of functional programming in general and Haskell in particular is that you _know_ what your code will do. If recursive "do" is an important feature (I believe that most Haskell people would agree that it is), then the behavior has to be consistent, or it has to be disallowed in situations where it introduces non-determinism. I'm not knowledgeable enough to know whether that can be discovered at compile time. Clearly it is a complex issue and not trivial to implement, but, also clearly, it is currently being used and the results are acceptable to the library developers. I'd be very interested to know specifically why and under what circumstances the semantics differ: not from the compiler perspective (Ross states that the semantics depend on the dependency analysis) but from the perspective of the coder. What are the possible semantics? Is the ambiguity common or rare? What happens if the library developer assumes one interpretation and the compiler generates a different one?
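For readers following along, a rough sketch of what recursive "do" provides and how it relates to mfix. GHC's mdo groups mutually dependent bindings and wraps each group in mfix, which is what "the semantics depends on the dependency analysis" refers to; the code below is illustrative, not the exact specification, and the Node type is invented for the example:

import Control.Monad.Fix (mfix)
import Data.IORef

data Node = Node Int (IORef Node)   -- a node that points back at its own cell

-- With recursive do (GHC's mdo) one could write:
--
--   mkCycle n = mdo r <- newIORef (Node n r)
--                   return r
--
-- which corresponds roughly to the explicit mfix version:
mkCycle :: Int -> IO (IORef Node)
mkCycle n = mfix (\r -> newIORef (Node n r))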
- data declarations with no constructors
Minor, but harmless.
- constraints on typeclass methods
- instances on type synonyms
Yes, these two and the relaxation of polymorphic recursion should have been in H98.

OK, let's try to summarize what seems to be the current consensus about more or less conservative & settled extensions to Haskell98:
a) rank 2 type signatures
b) polymorphic components for data constructors
c) pattern guards
d) data declarations with no constructors
e) constraints on typeclass methods
f) instances on type synonyms
g) relaxation of polymorphic recursion
Probably a) and b) should be considered together(?). Furthermore, currently only GHC implements c), but a simple desugaring transformation should be enough to implement it in the other Haskell systems and I don't see any hard semantic issues here. It would be nice if some volunteers would grab an extension and write a proposal for a Haskell98 addendum. Implementing one of these for Hugs/nhc98 might be a good task for a student project or a (very) long weekend. Ross/Malcolm/Simon^2: How much effort (e.g. in man weeks) would be needed to implement each of these extensions? My main point here is that it would be very handy if we had independent, precise and small descriptions of the extensions currently in use. This makes the chance that they get implemented much higher and would help avoid the fragmentation of the language. Cheers, S. P.S.: I still don't have an answer why we really need a non-H98 ReadP in the base libraries... :-)

On Sun, Feb 27, 2005 at 11:47:21AM +0100, Sven Panne wrote:
OK, let's try to summarize what seems to be the current consensus about more or less conservative & settled extensions to Haskell98:
a) rank 2 type signatures
b) polymorphic components for data constructors
c) pattern guards
d) data declarations with no constructors
e) constraints on typeclass methods
f) instances on type synonyms
g) relaxation of polymorphic recursion
I take back what I said about instances on type synonyms: they are tied up with generalized instance heads, which are certainly a major change from H98, as they break the termination guarantee.
Probably a) and b) should be considered together(?). Furthermore, currently only GHC implements c), but a simple desugaring transformation should be enough to implement it in the other Haskell systems and I don't see any hard semantic issues here.
All these but pattern guards are implemented in Hugs. Is such a desugaring transformation written down anywhere? That's essential for describing the extension, but wouldn't be usable as an implementation: users will be confused by type errors that relate to desugared forms. (I'm also not sure about the use of monadic binding notation for the non-monadic matches). e) is in GHC and Hugs without special flags. g) is in GHC, Hugs and nhc98, all without any special flags. I think we should be biased in favour of extensions that give extra expressiveness, like a), b) and e), over those that simplify things you could have written anyway.

Sven Panne writes:
OK, let's try to summarize what seems to be the current consensus about more or less conservative & settled extensions to Haskell98:
There is a larger point to consider. We should remember that revising the Haskell language is not only about extending it, but also about removing dead/ugly features, and changing features that are wrong or inconsistent. Also, revision does not just concern compilers/interpreters; there is a host of other tools which also process Haskell, e.g. Haddock, Hat, DrIFT, hsc2hs, greencard, to name but a few. Some only require syntax, but others, particularly in the future, might depend on type information too. As examples of language features that should be removed or revised, how about these:
(a) n+k patterns
(b) the defaulting mechanism
(c) the monomorphism restriction
and in addition, a new version of the language should probably adopt a much smaller mandatory Prelude, moving existing items into optional libraries.
My main point here is that it would be very handy if we had independent, precise and small descriptions of the extensions currently in use.
Agreed. Regards, Malcolm

On Mon, 28 Feb 2005, Malcolm Wallace wrote:
Sven Panne writes:
OK, let's try to summarize what seems to be the current consensus about more or less conservative & settled extensions to Haskell98:
There is a larger point to consider. We should remember that revising the Haskell language is not only about extending it, but also about removing dead/ugly features, and changing features that are wrong or inconsistent.
Yes, fight against diabetes! :-] Modula-3 was designed to let the language definition fit in 50 pages - that's a serious goal. Since Haskell doesn't need many features of imperative languages like variables, constants, loops, etc. there is more space for type issues, and this goal still seems to be realizable for Haskell, too.

Andrew Pimlott writes:
On Mon, Feb 28, 2005 at 01:45:25PM +0000, Malcolm Wallace wrote:
As examples of language features that should be removed or revised, how about these:
Perhaps
(d) the restriction of list comprehension to lists (instead of MonadPlus)
You do realise that we used to have monad comprehensions in Haskell 1.4, and they were deliberately removed for Haskell'98? The reason was that error messages for beginners' code were rather baffling. (And pretty baffling for some experts too!) In any case, the do notation is exactly analogous to list comprehensions with the restriction to a single monad lifted. Regards, Malcolm

On Mon, Feb 28, 2005 at 06:47:05PM +0000, Malcolm Wallace wrote:
Andrew Pimlott writes:
On Mon, Feb 28, 2005 at 01:45:25PM +0000, Malcolm Wallace wrote:
As examples of language features that should be removed or revised, how about these:
Perhaps
(d) the restriction of list comprehension to lists (instead of MonadPlus)
You do realise that we used to have monad comprehensions in Haskell 1.4, and they were deliberately removed for Haskell'98?
Yes, I heard this.
The reason was that error messages for beginners' code were rather baffling. (And pretty baffling for some experts too!)
I don't understand how they could be more baffling than any other type errors involving monads. Can you give an example? (I could not find any, only mentions of them.) Maybe error messages were less clear then? Or maybe understanding of monads was less pervasive? The only reason I can think of is psychological: List comprehensions look like lists, so one might not expect any other type to be considered. This is a weak reason in my opinion. For my part, I was initially baffled (and then sharply disappointed) when my Maybe comprehension wouldn't compile.
In any case, the do notation is exactly analogous to list comprehensions with the restriction to a single monad lifted.
Sure, but if list comprehension syntax is sometimes better for lists, it is sometimes better for other MonadPlus instances. Having two "exactly analogous" constructs in the language, but putting an arbitrary restriction on one, seems in poor taste. Andrew
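To make the comparison concrete, here is the kind of Maybe computation being discussed, written with Haskell 98 do notation; safeDiv is a made-up helper for the example:

safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

ratio :: Int -> Int -> Int -> Maybe Int
ratio a b c = do
  q1 <- safeDiv a b
  q2 <- safeDiv a c
  return (q1 + q2)

-- With monad comprehensions (Haskell 1.4 style, not H98) this could read:
--
--   ratio a b c = [ q1 + q2 | q1 <- safeDiv a b, q2 <- safeDiv a c ]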

The reason was that error messages for beginners' code were rather baffling. (And pretty baffling for some experts too!)
I don't understand how they could be more baffling than any other type errors involving monads. Can you give an example? (I could not find any, only mentions of them.) Maybe error messages were less clear then? Or maybe understanding of monads was less pervasive?
Beginners are taught lists and list comprehensions before they are taught monads. The errors they saw involved monads, which they didn't yet know about. Confusion reigned. --KW 8-)

On 2005-02-28 at 20:22GMT Keith Wansbrough wrote:
The reason was that error messages for beginners' code were rather baffling. (And pretty baffling for some experts too!)
I don't understand how they could be more baffling than any other type errors involving monads. Can you give an example? (I could not find any, only mentions of them.) Maybe error messages were less clear then? Or maybe understanding of monads was less pervasive?
Beginners are taught lists and list comprehensions before they are taught monads. The errors they saw involved monads, which they didn't yet know about. Confusion reigned.
True, but it was still the wrong solution to the problem; something involving a "beginners prelude" -- which would fit well with the idea of less in the standard prelude -- would avoid the annoyance that mplus isn't (++) and so on. I don't think it's too onerous to tell beginners that "you have to put import Didactic {- or whatever -} at the front of all Haskell programmes" ... followed several lectures later by "OK, I lied". Jón -- Jón Fairbairn Jon.Fairbairn at cl.cam.ac.uk

On Mon, Feb 28, 2005 at 08:22:07PM +0000, Keith Wansbrough wrote:
The reason was that error messages for beginners' code were rather baffling. (And pretty baffling for some experts too!)
I don't understand how they could be more baffling than any other type errors involving monads. Can you give an example? (I could not find any, only mentions of them.) Maybe error messages were less clear then? Or maybe understanding of monads was less pervasive?
Beginners are taught lists and list comprehensions before they are taught monads. The errors they saw involved monads, which they didn't yet know about. Confusion reigned.
Well, the original poster referred to baffled experts as well. Anyway, I agree that beginners are a concern, though I am wary of making significant language decisions to accommodate them. Some thoughts:
- Was main an IO action at that time? Now that it is, beginners have to use monads pretty early on anyway. (Ok, they probably use the repl first.)
- There could be a restricted form of the language in which comprehensions are always lists. (And perhaps numeric literals get a base type? And do is restricted to IO?)
- On the flip side, monad comprehensions could be a good way to introduce the power of Haskell. Beginners can understand Maybe soon after understanding lists, so they could be shown Maybe comprehensions.
I've never tried to teach anyone Haskell, so I don't know how well this would work, but I'm not convinced that the right trade-off has been made. This decision was made a long time ago, and maybe we can come up with better solutions now. Andrew

On Mon, Feb 28, 2005 at 06:47:05PM +0000, Malcolm Wallace wrote:
Andrew Pimlott writes:
On Mon, Feb 28, 2005 at 01:45:25PM +0000, Malcolm Wallace wrote:
As examples of language features that should be removed or revised, how about these:
Perhaps
(d) the restriction of list comprehension to lists (instead of MonadPlus)
You do realise that we used to have monad comprehensions in Haskell 1.4, and they were deliberately removed for Haskell'98? The reason was that error messages for beginners' code were rather baffling. (And pretty baffling for some experts too!) In any case, the do notation is exactly analogous to list comprehensions with the restriction to a single monad lifted.
Does the incredibly useful parallel list comprehension extension in ghc generalize to arbitrary monads? or MonadPluses even? John -- John Meacham - ⑆repetae.net⑆john⑈

On Mon, 28 Feb 2005, John Meacham wrote:
Does the incredibly useful parallel list comprehension extension in ghc generalize to arbitrary monads? or MonadPluses even?
Better ask: Is it possible to generalize zipWith to other data structures? Adding syntactic sugar for each and everything will quickly turn Haskell into Perl 7 or C++++.
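For the record, a small illustration of the equivalence being alluded to; the zip-based version is plain Haskell 98, and the pragma is the modern spelling of the GHC extension:

{-# LANGUAGE ParallelListComp #-}

pairsParallel :: [(Int, Char)]
pairsParallel = [ (n, c) | n <- [1, 2, 3] | c <- "abc" ]

pairsZipped :: [(Int, Char)]
pairsZipped = [ (n, c) | (n, c) <- zip [1, 2, 3] "abc" ]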

As examples of language features that should be removed or revised, how about these:
(n+k, monomorphism, defaulting, monad comprehensions) All of this has been debated (endlessly!) once before. Not that it shouldn't be debated again, but the wrapup from long ago looks like this:

n+k (the issue that won't die). Removing n+k would probably be fairly pointless, given that it doesn't simplify much (except syntax) and is in wide use. Points:
* Some people would like to see views introduced in a way that makes current n+k a special case (views would be nice!)
* Everyone agreed that n+k really should be used for some sort of special "Natural number" type. But we didn't want to add a new type.

Monomorphism (another issue that won't die!). Without scoped type variables, monomorphism is essential to avoid ambiguity. If scoped type variables appear, progress could be made.

Defaulting. Again, lots of old debate on this. Nobody is particularly happy with the current design but something is needed. A more general mechanism would be welcomed but hasn't been implemented as far as I know.

List comprehension / MonadPlus. This was definitely a mistake (in my humble opinion!) but the idea was that we wanted to create a less confusing environment for beginners. This could be changed back with relatively little effort but the bigger issue of how to insulate naive users from language complexity is an important one.

Any change to the Haskell standard is going to require a lot of effort. Haskell is what it is because so many people spent so much time getting things right. I would love to see a Haskell 2.0 someday - I hope that the process continues to be as rigorous as the one that led to Haskell 98. John

John Peterson wrote:
[...] Any change to the Haskell standard is going to require a lot of effort. Haskell is what it is because so many people spent so much time getting things right. I would love to see a Haskell 2.0 someday - I hope that the process continues to be as rigorous as the one that led to Haskell 98.
Perhaps I should make my motivation clear again: I don't want to start a "Haskell 2.0, guerrilla edition" by a more or less uncoordinated effort. I just want precise, separate definitions of the common extensions in use to avoid fragmentation of the language. It would be very nice to say e.g. "Hugs, March 2005 edition, supports Haskell 98 + addenda Foo, Bar & Baz". Currently some systems have similar extensions, but not identical ones, e.g. fundeps in Hugs & GHC, IIRC. This really has to be avoided. Cheers, S.

On Mon, Feb 28, 2005 at 01:45:25PM +0000, Malcolm Wallace wrote:
As examples of language features that should be removed or revised, how about these:
(a) n+k patterns
(b) the defaulting mechanism
(c) the monomorphism restriction
(b) is annoying, but I don't see an alternative now. Removing the defaulting mechanism would be very backwards-incompatible, and require a lot of type signatures in annoying places. For instance, any time you exponentiate using ^ or ^^ and a literal exponent, like f x = x^2 you have to specify the type of '2' if you don't have defaulting. I encountered this with my proposed replacements for the numeric classes, where defaulting is necessarily broken. Usually it's not so much of an issue and only requires a small number of explicit types; this is the one instance where it really interferes. I'd be happy to have a better solution. Peace, Dylan
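To spell the example out (square is a made-up name): since

    (^) :: (Num a, Integral b) => a -> b -> a

the exponent's type never appears in the result, so without defaulting it is ambiguous and needs an annotation:

square :: Double -> Double
square x = x ^ (2 :: Integer)   -- the annotation that defaulting normally supplies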

dpt@lotus.bostoncoop.net (Dylan Thurston) writes:
On Mon, Feb 28, 2005 at 01:45:25PM +0000, Malcolm Wallace wrote:
As examples of language features that should be removed or revised, how about these:
(a) n+k patterns
(b) the defaulting mechanism
(c) the monomorphism restriction
(b) is annoying, but I don't see an alternative now. Removing the defaulting mechanism would be very backwards-incompatible, and require a lot of type signatures in annoying places.
Yes, I was thinking more of revision than removal here. The main problem with the current defaulting mechanism is that it is arbitrarily limited to certain classes. A better design would be more orthogonal, for instance allowing the programmer to define a lattice of class/type pairs to use as defaults, e.g.

    default Num Integer
    default Floating Double
    default Monad Maybe
    default Codec.Compressable Codec.Compress.Bzip

Where the usage of a class method is ambiguous, the compiler would choose the least member of the lattice that satisfies all the superclass constraints. To make the design truly orthogonal however, it would probably be necessary to remove the "default" default of (Integer,Double), and instead perhaps allow default clauses to be exported/imported explicitly. I think something like this could be pretty-much backward compatible, as the semantics are very close to the current semantics (Report section 4.3.3), but without the restriction that Num be a superclass of the defaulted type. The revised Prelude would simply export the Num=Integer, Floating=Double default bindings, and the minor change of syntax elsewhere would be manageable. Regards, Malcolm

ross@soi.city.ac.uk wrote:
On Sat, Feb 26, 2005 at 08:00:22PM -0500, ajb@spamcop.net wrote:
Quoting ross@soi.city.ac.uk:
Indeed. Of all the extensions implemented by both GHC and Hugs, the only ones that seem ready are
- rank 2 type signatures, and
- polymorphic components for data constructors (giving them rank 2 types).
Off the top of my head:
- multi-parameter type classes
Reasonable in themselves, but limited in usefulness without some scheme to deal with overlapping instances, which doesn't seem settled at this time.
In the HList paper we show that any overlapping instance can be eliminated with the use of a single constraint (that would have to be provided by the compiler: TypeEq). Further, I think functional dependencies (without overlapping instances) and undecidable instances are perfectly well understood (incoherent instances would join overlapping instances and not be included). So I would add:
- Multi-parameter type classes with functional dependencies (with no overlapping instances allowed)
Keean.
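A small, hedged sketch of that combination; the Collection class is invented for illustration and is not from the HList paper:

{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies, FlexibleInstances #-}

-- A multi-parameter class with a functional dependency: the collection
-- type determines its element type, with no overlapping instances.
class Collection c e | c -> e where
  empty  :: c
  insert :: e -> c -> c
  toList :: c -> [e]

instance Collection [a] a where
  empty  = []
  insert = (:)
  toList = id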

On Mon, Feb 28, 2005 at 11:25:26AM +0000, Keean Schupke wrote:
Further, I think functional dependencies (without overlapping instances) and undecidable instances are perfectly well understood (incoherent instances would join overlapping instances and not be included).
Where can I read this understanding? The GHC docs on FDs are terse, referring to the original paper, but that paper is somewhat informal, and describes a weaker system than what is implemented. For example (taken from s7 of the paper), given

    class U a b | a -> b where u :: a -> b
    class U a b => V a b where v :: a -> b

the paper says that the principal type of \x -> (u x, v x) is

    (U a b, V a c) => a -> (b,c)

According to GHC and Hugs, it is

    V a b => a -> (b,b)

I think that's sensible, but where are the rules that give it? I suspect that writing this addendum may take a while.

Ross Paterson wrote:
Where can I read this understanding? The GHC docs on FDs are terse, referring to the original paper, but that paper is somewhat informal, and describes a weaker system than what is implemented. For example, (taken from s7 of the paper), given
class U a b | a -> b where u :: a -> b
class U a b => V a b where v :: a -> b
the paper says that the principal type of \x -> (u x, v x) is
(U a b, V a c) => a -> (b,c)
According to GHC and Hugs, it is
V a b => a -> (b,b)
I think that's sensible, but where are the rules that give it? I suspect that writing this addendum may take a while.
That's just a type simplification. In the source, we have

    \x -> (u x, v x)

The type checker knows both x's are the same, and it knows that (V a b) is the same as (U a b)... So in this case it looks like the second type is just a simplification of the first... so they really agree on the type, not disagree. Keean.

On Mon, Feb 28, 2005 at 12:40:42PM +0000, Keean Schupke wrote:
Ross Paterson wrote:
Where can I read this understanding? The GHC docs on FDs are terse, referring to the original paper, but that paper is somewhat informal, and describes a weaker system than what is implemented. For example, (taken from s7 of the paper), given
class U a b | a -> b where u :: a -> b
class U a b => V a b where v :: a -> b
the paper says that the principal type of \x -> (u x, v x) is
(U a b, V a c) => a -> (b,c)
According to GHC and Hugs, it is
V a b => a -> (b,b)
I think that's sensible, but where are the rules that give it? I suspect that writing this addendum may take a while.
That's just a type simplification. In the source, we have
\x -> (u x, v x)
the type checker knows both x's are the same, and it knows that (V a b) is the same as (U a b)...
So in this case it looks like the second type is just a simplification of the first... so they really agree on the type, not disagree.
No, the paper is clear on this point: "For example, given two predicates U a b and V a c, nothing in the rules from Section 6 will allow us to infer that b = c." I agree that they should be identified, but the type system that does it isn't written down anywhere (outside of the GHC and Hugs sources, and the Hugs version has a number of bugs).
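For anyone who wants to check this, the example written out as a compilable module; the signature on test is the type GHC and Hugs report, and whether it is the principal type is exactly the point in question:

{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies #-}

class U a b | a -> b where
  u :: a -> b

class U a b => V a b where
  v :: a -> b

test :: V a b => a -> (b, b)   -- the improved type, via the superclass and the fundep
test x = (u x, v x)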

Ross Paterson wrote:
On Mon, Feb 28, 2005 at 12:40:42PM +0000, Keean Schupke wrote:
Ross Paterson wrote:
Where can I read this understanding? The GHC docs on FDs are terse, referring to the original paper, but that paper is somewhat informal, and describes a weaker system than what is implemented. For example, (taken from s7 of the paper), given
class U a b | a -> b where u :: a -> b
class U a b => V a b where v :: a -> b
the paper says that the principal type of \x -> (u x, v x) is
(U a b, V a c) => a -> (b,c)
According to GHC and Hugs, it is
V a b => a -> (b,b)
I think that's sensible, but where are the rules that give it? I suspect that writing this addendum may take a while.
That's just a type simplification. In the source, we have
\x -> (u x, v x)
the type checker knows both x's are the same, and it knows that (V a b) is the same as (U a b)...
So in this case it looks like the second type is just a simplification of the first... so they really agree on the type, not disagree.
No, the paper is clear on this point: "For example, given two predicates U a b and V a c, nothing in the rules from Section 6 will allow us to infer that b = c."
I agree that they should be identified, but the type system that does it isn't written down anywhere (outside of the GHC and Hugs sources, and the Hugs version has a number of bugs).
Sounds like someone ought to write a new paper? Keean.