
i would categorize myself as a purely practical programmer. i enjoy using haskell for various practical tasks and it has served me reliably. one issue i have with the library support for practical problem domains is the half-finished state of many fundamental codebases, such as networking and database support. in the perl world, support for these domains is provided through cpan, and this model is viable due to the (once) massive number of perl coders out there. in the java, c# etc. world, a "batteries included" approach implies a narrowing of options, but also an immediate delivery of functionality. so far the haskell community has taken the cpan route for most practical libs, but i wonder if a "batteries included" approach might help get some key libraries to a more complete state. in particular, i would like to see support for basic internet protocols, database connectivity, and potentially xml parser support rolled into the ghc standard libs. there is always a strong debate on where the line is drawn, but this functionality at least shows up in a plurality of practical projects. the "batteries included" approach does imply choosing preferred solutions when more than one library is available, which can also be difficult. that said, i think haskell would pick up a lot of new coders if it were obvious that the functionality they were looking for came out of the base libs. i know that people will say they don't use a database or xml, but there will always be parts of a standard library that any particular coder will never touch, yet can still see the value of including for others. comments?

On Nov 19, 2007 10:25 AM, brad clawsie
so far the haskell community has taken the cpan route for most practical libs but i wonder if a "batteries included" approach might help get some key libraries to a more complete state. in particular, i would like to see support for basic internet protocols, database connectivity, and potentially xml parser support rolled into the ghc standard libs. there is always a strong debate on where the line is
I agree strongly. I particularly miss a "standard" HTTP library. Justin

Batteries included, I could take it or leave it.
Where I think hackage could really benefit from copying the perl strategy is automated testing of *all* packages under hackage darcs, not just blessed packages.

If this could be integrated into the buildbot of whatever ghc is under development, that would be great. You would then have status reports on ghc itself, on ghc plus extralibs (which the haskell core maintainers feel some degree of responsibility for), and on the "hackage universe", which the core people aren't responsible for but which the haskell community benefits from getting feedback on. Better yet would be feedback per ghc version and per platform (ubuntu, red hat, windows, on and on, whoever volunteers a test box for the buildbot).

For example, I just found out that HDBC-ODBC seems to be broken on windows for ghc-6.8. It would be great to know this in advance before trying to use it. Package maintainers could get automated emails too if they want.

Testing could be as basic as checking that cabal install runs without errors, but could also include additional quickcheck tests or other types of test harness. That would be nice for the community and the core devs, I think.
thomas.
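A minimal sketch of the kind of smoke test described above - just "does cabal install succeed" - written in Haskell. The package list below is invented for illustration; in practice it would come from the hackage index, and a real buildbot would also capture the build log, the ghc version and the platform:

    -- Run "cabal install" for each package and report success/failure.
    import System.Exit    (ExitCode (..))
    import System.Process (system)

    -- Invented package list; in practice read from the hackage index.
    packages :: [String]
    packages = ["tagsoup", "HDBC-odbc", "xmonad"]

    smokeTest :: String -> IO (String, Bool)
    smokeTest pkg = do
      code <- system ("cabal install " ++ pkg)
      return (pkg, code == ExitSuccess)

    main :: IO ()
    main = do
      results <- mapM smokeTest packages
      mapM_ report results
      where
        report (pkg, ok) =
          putStrLn (pkg ++ ": " ++ (if ok then "OK" else "FAILED"))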
"Justin Bailey"
so far the haskell community has taken the cpan route for most practical libs but i wonder if a "batteries included" approach might help get some key libraries to a more complete state. in particular, i would like to see support for basic internet protocols, database connectivity, and potentially xml parser support rolled into the ghc standard libs. there is always a strong debate on where the line is
I agree strongly. I particularly miss a "standard" HTTP library. Justin _______________________________________________ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe --- This e-mail may contain confidential and/or privileged information. If you are not the intended recipient (or have received this e-mail in error) please notify the sender immediately and destroy this e-mail. Any unauthorized copying, disclosure or distribution of the material in this e-mail is strictly forbidden.

Justin Bailey wrote:
On Nov 19, 2007 10:25 AM, brad clawsie
wrote: so far the haskell community has taken the cpan route for most practical libs but i wonder if a "batteries included" approach might help get some key libraries to a more complete state. in particular, i would like to see support for basic internet protocols, database connectivity, and potentially xml parser support rolled into the ghc standard libs. there is always a strong debate on where the line is
I agree strongly. I particularly miss a "standard" HTTP library.
Personally, I miss clean binary I/O, configurable character encodings, and an "easy" API for working with bitmapped images (loading them, saving them, displaying them, etc.) I could also reel off a whole bunch of other stuff I'd like to have - but that doesn't write the code, does it?

Hackage seems like a nice idea in principle. However,

- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
- Almost all packages seem to require a long list of dependencies.
- There seems to be an awful lot of packages that do the same thing but with incompatible interfaces (and varying limitations). It seems we're not very coordinated here.
- (And, since I'm on Windows, I can't seem to get anything to install with Cabal...)

Unfortunately, while it's very easy to point out failings in a given system, it's much harder to propose viable ways to fix things... :-(

Hi
- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
The problem is that only one person gets to comment on the quality of a library, the author, who is about the least objective person.
- Almost all packages seem to require a long list of dependencies.
Cabal-install will turn this from being a negative to being a positive, if it ever delivers on its promise.
- There seems to be an awful lot of packages that do the same thing but with incompatible interfaces (and varying limitations). It seems we're not very coordinated here.
Variety is good. Hopefully at some point people will start to standardise. For example, there are at least 4 libraries for working with HTML (TagSoup, HaXml, HXT, ...brain freeze...) - eventually someone will write a nice summary tutorial on when to use which one. In Haskell the interface is usually the most important bit, so making different libraries use the same interface eliminates their advantages.
- (And, since I'm on Windows, I can't seem to get anything to install with Cabal...)
Windows, the Operating System no one in the Haskell community loves... Make sure you point out all the bugs and even little annoyances that you encounter, and hopefully things will head in the right direction. Thanks Neil

ndmitchell:
Hi
- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
The problem is that only one person gets to comment on the quality of a library, the author, who is about the least objective person.
- Almost all packages seem to require a long list of dependencies.
Cabal-install will turn this from being a negative to being a positive, if it ever delivers on its promise.
Works for me here:

    cabal install xmonad

or

    cabal install stream-fusion

It's beta: needs darcs Cabal (1.3.x), and darcs cabal-install. We're using it at Galois already though, so I would encourage more testing/users. -- Don

The problem is that only one person gets to comment on the quality of a library, the author, who is about the least objective person.
i would just like to add that i have had a great deal of success with hackage and find that most libraries support what they say they will support, but often there is missing functionality that the original authors have not attended to for some reason. bugs haven't impacted me as much as missing/incomplete features. by rolling certain libraries into a base distribution, i was implying that there would be more eyeballs focusing on making them feature-complete. furthermore, by closely associating these libraries with a base distribution, there would be a sense of urgency associated with closing major bugs. in any case, batteries included or not, ghc seems to have reached a point of stability, high performance, and lots of neat fundamental features where it can be left alone for a short time. i would love to see 2008 be the year we direct time and effort to filling holes in the libraries. perhaps an online tool for voting on missing libraries or features would help us assess where to direct efforts.

brad clawsie wrote:
in any case, batteries included or not, ghc seems to have reached a point of stability, high performance, and lots of neat fundamental features that it can be left alone for a short time. i would love to see 2008 be the year we direct time and effort to solve filling holes in the libraries.
I would love to be skilled enough to actually help fix these "holes" rather than just complaining about them... Sadly, I'm not. I've got an MD5 library I'm working on, and I've got a thin layer over Gtk2hs that should make writing programs that render bitmaps easier. (E.g., ray tracers, fractal generators... all the kinds of things I like writing!) If I ever manage to mangle those into a working state, I'll happily hand 'em over. But that's not going to make a huge difference to the HackageDB as a whole...

2007/11/19, brad clawsie
The problem is that only one person gets to comment on the quality of a library, the author, who is about the least objective person.
by rolling certain libraries into a base distribution, i was implying that there would be more eyeballs focusing on making them feature-complete. furthermore, by closely associating these libraries into a base distribution, there will be a sense of urgency associated with closing major bugs.
If you look at the stability tag of ghc libraries you will see that a lot of them are marked as "provisional" (Network.URI for example) or "experimental" (Control.Monad.Trans). Although I would love to see some other standard libraries (MaybeT !), I think that current base should be solid first. Just My $0.02 , Radek. -- Codeside: http://codeside.org/ Przedszkole Miejskie nr 86 w Lodzi: http://www.pm86.pl/

On Mon, 2007-11-19 at 21:47 +0100, Radosław Grzanka wrote:
2007/11/19, brad clawsie
: The problem is that only one person gets to comment on the quality of a library, the author, who is about the least objective person.
by rolling certain libraries into a base distribution, i was implying that there would be more eyeballs focusing on making them feature-complete. furthermore, by closely associating these libraries into a base distribution, there will be a sense of urgency associated with closing major bugs.
If you look at the stability tag of ghc libraries you will see that a lot of them are marked as "provisional" (Network.URI for example) or "experimental" (Control.Monad.Trans). Although I would love to see some other standard libraries (MaybeT !), I think that current base should be solid first.
On the other hand, some of these (Control.Monad.Trans) have been "experimental" for several years despite being widely used that whole time...

On Nov 19, 2007, at 15:47 , Radosław Grzanka wrote:
If you look at the stability tag of ghc libraries you will see that a lot of them are marked as "provisional" (Network.URI for example) or "experimental" (Control.Monad.Trans).
This may not refer to what most people care about; the "experimental" stability of Control.Monad.Trans is related to its use of fundeps and undecidable instances, and the possibility (likelihood?) of its being switched to type families (which "shouldn't" change its user-visible interface, as I understand it). -- brandon s. allbery [solaris,freebsd,perl,pugs,haskell] allbery@kf8nh.com system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu electrical and computer engineering, carnegie mellon university KF8NH

On Mon, 19 Nov 2007, Brandon S. Allbery KF8NH wrote:
On Nov 19, 2007, at 15:47 , Radosław Grzanka wrote:
If you look at the stability tag of ghc libraries you will see that a lot of them are marked as "provisional" (Network.URI for example) or "experimental" (Control.Monad.Trans).
This may not refer to what most people care about; the "experimental" stability of Control.Monad.Trans is related to its use of fundeps and undecidable instances, and the possibility (likelihood?) of its being switched to type families (which "shouldn't" change its user-visible interface, as I understand it).
I'd like to see MTL split into a Haskell98 part and an advanced part. I mostly use functionality which would nicely fit into a Haskell98 interface and find it annoying that by importing MTL my code becomes less portable.

On Nov 19, 2007, at 23:13 , Henning Thielemann wrote:
On Mon, 19 Nov 2007, Brandon S. Allbery KF8NH wrote:
On Nov 19, 2007, at 15:47 , Radosław Grzanka wrote:
If you look at the stability tag of ghc libraries you will see that a lot of them are marked as "provisional" (Network.URI for example) or "experimental" (Control.Monad.Trans).
This may not refer to what most people care about; the "experimental" stability of Control.Monad.Trans is related to its use of fundeps and undecidable instances, and the possibility (likelihood?) of its being switched to type families (which "shouldn't" change its user-visible interface, as I understand it).
I like to see MTL split into a Haskell98 part and an advanced part. I mostly use functionality which would nicely fit into a Haskell98 interface and find it annoying that by importing MTL my code becomes less portable.
Yes! Please! /Björn

Neil Mitchell wrote:
Hi
- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
The problem is that only one person gets to comment on the quality of a library, the author, who is about the least objective person.
Yes, perhaps so...
- Almost all packages seem to require a long list of dependencies.
Cabal-install will turn this from being a negative to being a positive, if it ever delivers on its promise.
Here's to hoping. ;-)
- There seems to be an awful lot of packages that do the same thing but with incompatible interfaces (and varying limitations). It seems we're not very coordinated here.
Variety is good. Hopefully at some point people will start to standardise. For example, there are at least 4 libraries for working with HTML (TagSoup, HaXml, HXT, ...brain freeze...) - eventually someone will write a nice summary tutorial on when to use which one. In Haskell the interface is usually the most important bit, so making different libraries use the same interface eliminates their advantages.
Variety is good. Standardisation is also good (for different reasons). Having half a dozen database access libraries (each of which only talks to certain databases) is just confusing. I suppose the key is to find a balance between having lots of choice and knowing which thing to choose. (Also, when somebody writes a library, its dependencies are going to be the author's choice. Not much fun trying to use two libraries that both depend on different, incompatible "binary" packages...)
Windows, the Operating System no one in the Haskell community loves... Make sure you point all the bugs and even little annoyances that you encounter, and hopefully things will head in the right direction.
Well, I've already filed 4 bugs against GHC. One was already fixed by GHC 6.8.1 (yays!), one is trivial and will be fixed in 6.8.2, and the other two it seems nobody is keen to work on. (In fairness, one of them is fairly nontrivial.) I get the impression that I'd probably be regarded as a pest if I just spent all day filing endless bug reports... It would be quite nice if, rather than just filing reports, I could do something useful to help *fix* these bugs. But, unfortunately, that is beyond my skill. (On the other hand, even things that I should theoretically be able to do I haven't managed to. You might remember a while back I offered to try to spruce up the Haddock documentation for Parsec. It has a great user manual, but the Haddock reference is Spartan. Well anyway, in the end I couldn't figure out how to do that, so nothing got done...)

| Well, I've already filed 4 bugs against GHC. One was already fixed by
| GHC 6.8.1 (yays!), one is trivial and will be fixed in 6.8.2, and the
| other two it seems nobody is keen to work on. (In fairness, one of them
| is fairly nontrivial.) I get the impression that I'd probably be
| regarded as a pest if I just spent all day filing endless bug reports...

No, you would emphatically not be regarded as a pest. We _really like_ well-characterised bug reports, most especially if they come with reproducible test cases. Please go ahead and submit them. (Caveat: pls try to check that there isn't an already-open report on the bug. Granted, you can't do a 100% perfect job on this.)

| (On the other hand, even things that I should theoretically be able to
| do I haven't managed to. You might remember a while back I offered to
| try to spruce up the Haddoc documentation for Parsec. It has a great
| user manual, but the Haddoc reference is Spartan. Well anyway, in the
| end I couldn't figure out how to do that, so nothing got done...)

Did you ask Haskell-Cafe? I bet you'd get help with whatever you are stuck on.

Simon

Simon Peyton-Jones wrote:
| Well, I've already filed 4 bugs against GHC. One was already fixed by
| GHC 6.8.1 (yays!), one is trivial and will be fixed in 6.8.2, and the
| other two it seems nobody is keen to work on. (In fairness, one of them
| is fairly nontrivial.) I get the impression that I'd probably be
| regarded as a pest if I just spent all day filing endless bug reports...
No, you would emphatically not be regarded as a pest. We _really like_ well-characterised bug reports, most especially if they come with reproducible test cases. Please go ahead and submit them. (Caveat: pls try to check that there isn't an already-open report on the bug. Granted, you can't do a 100% perfect job on this.)
Well, let me see. So far I've filed 4 bugs:

http://hackage.haskell.org/trac/ghc/ticket/1869 [It's already fixed. Yays!]
http://hackage.haskell.org/trac/ghc/ticket/1891 [Yeah, we missed that one, we'll fix it.]
http://hackage.haskell.org/trac/ghc/ticket/1874 [We'll maybe fix it; we're not keen.]
http://hackage.haskell.org/trac/ghc/ticket/1868 [We're not sure how to fix this.] [In fairness, neither am I now!]

It seems you guys like simple easy things that are obviously bugs. (I thought 1874 was "clearly" a bug and should be fixed, but hey.) And it seems you are much less keen on more fuzzy things where it's not clear what The Right Thing(tm) actually is. (I guess that's understandable.) I guess *every* bugfix is easy for the person who doesn't have to write it. ;-) (BTW, does anybody other than igloo look at these? He must be one *really* busy guy!)

Lest it seem like all I do is sit here and complain... I did actually make a serious attempt to fix 1874 myself. But I couldn't figure out where to look. In the end, I basically just gave up because I'm not sure what I'm doing.
| (On the other hand, even things that I should theoretically be able to
| do I haven't managed to. You might remember a while back I offered to
| try to spruce up the Haddoc documentation for Parsec. It has a great
| user manual, but the Haddoc reference is Spartan. Well anyway, in the
| end I couldn't figure out how to do that, so nothing got done...)
Did you ask Haskell-Cafe? I bet you'd get help with whatever you are stuck on.
I believe I did, and I don't recall getting an answer. I suppose the only way to know is to have a wade through the list archives. (It was way too long ago for the message to be in my mailbox any more!)

On Nov 19, 2007, at 15:13 , Neil Mitchell wrote:
- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
The problem is that only one person gets to comment on the quality of a library, the author, who is about the least objective person.
The ability to "vote" on packages might be interesting here. If there's 4 HTML libraries and one of them gets lots of votes, it's probably the one to look at first. -- brandon s. allbery [solaris,freebsd,perl,pugs,haskell] allbery@kf8nh.com system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu electrical and computer engineering, carnegie mellon university KF8NH

Brandon S. Allbery KF8NH wrote:
On Nov 19, 2007, at 15:13 , Neil Mitchell wrote:
- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
The problem is that only one person gets to comment on the quality of a library, the author, who is about the least objective person.
The ability to "vote" on packages might be interesting here. If there's 4 HTML libraries and one of them gets lots of votes, it's probably the one to look at first.
It occurred to me that the voting could be implicit. That is, if 10 libraries/programs use library X, then library X gets 10 votes. Kind of like Google PageRank for libraries. Greetings, Mads Lindstrøm
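A rough sketch of that implicit-vote idea in Haskell: one "vote" for a library per package that depends on it. The dependency lists below are made up for illustration; on hackage they would be read from the package index:

    import qualified Data.Map as Map
    import Data.Map (Map)

    -- Each package paired with the packages it depends on (toy data).
    deps :: [(String, [String])]
    deps =
      [ ("xmonad", ["mtl", "X11"])
      , ("pandoc", ["mtl", "HTTP"])
      , ("haxr",   ["HTTP", "network"])
      ]

    -- One vote for a library per package that depends on it.
    votes :: Map String Int
    votes = Map.fromListWith (+) [ (d, 1) | (_, ds) <- deps, d <- ds ]

    -- Map.toList votes == [("HTTP",2),("X11",1),("mtl",2),("network",1)]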

On Nov 19, 2007, at 17:01 , Mads Lindstrøm wrote:
Brandon S. Allbery KF8NH wrote:
The ability to "vote" on packages might be interesting here. If there's 4 HTML libraries and one of them gets lots of votes, it's probably the one to look at first.
It occurred to me that the voting could be implicit. That is, if 10 libraries/programs use library X, then library X gets 10 votes. Kind of like Google PageRank for libraries.
Only up to a point; not all programs written using such libraries are necessarily going to end up on hackage. (Consider the code written by the financials folks that have been mentioned here various times; and I have a couple programs which would be quite useless outside of CMU ECE because they operate on parts of our site-specific infrastructure.) It would certainly be a start, though. -- brandon s. allbery [solaris,freebsd,perl,pugs,haskell] allbery@kf8nh.com system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu electrical and computer engineering, carnegie mellon university KF8NH

"Brandon S. Allbery KF8NH"
Kind of like Google PageRank for libraries.
Yes.
Only up to a point; not all programs written using such libraries are necessarily going to end up on hackage. (Consider the code written by the financials folks that have been mentioned here various times;
I don't see that as a problem -- if you don't contribute, you don't get to vote. -k -- If I haven't seen further, it is by standing in the footprints of giants

On Nov 20, 2007, at 3:25 , Ketil Malde wrote:
"Brandon S. Allbery KF8NH"
writes: Only up to a point; not all programs written using such libraries are necessarily going to end up on hackage. (Consider the code written by the financials folks that have been mentioned here various times;
I don't see that as a problem -- if you don't contribute, you don't get to vote.
Well, except it's not the person who has been disenfranchised who is losing here; it's the folks who might benefit from their experience using it. -- brandon s. allbery [solaris,freebsd,perl,pugs,haskell] allbery@kf8nh.com system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu electrical and computer engineering, carnegie mellon university KF8NH

Hello Brandon, Tuesday, November 20, 2007, 1:15:34 AM, you wrote:
The ability to "vote" on packages might be interesting here. If there's 4 HTML libraries and one of them gets lots of votes, it's probably the one to look at first.
it can be made easy and automatic by just publishing the "number of downloads" on hackage. what do the hackage developers say? -- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

On Tue, 2007-11-20 at 13:45 +0300, Bulat Ziganshin wrote:
Hello Brandon,
Tuesday, November 20, 2007, 1:15:34 AM, you wrote:
The ability to "vote" on packages might be interesting here. If there's 4 HTML libraries and one of them gets lots of votes, it's probably the one to look at first.
it can be made easy and automatic by just publishing "number of downloads" on hackage
what hackage developers will say?
Yes please! Please contribute the feature. Grab the hackage code from: http://darcs.haskell.org/hackage-scripts/ Send patches to the cabal-devel mailing list. Everyone is most welcome to subscribe too. Another thing I'd like to see coming out of this discussion is some feature requests filed in the hackage trac with a summary of some of our conclusions: http://hackage.haskell.org/trac/hackage/ Duncan

Duncan Coutts wrote:
Grab the hackage code from:
http://darcs.haskell.org/hackage-scripts/
Send patches to the cabal-devel mailing list. Everyone is most welcome to subscribe too.
So... the HackageDB HTTP frontend is just a set of CGI scripts written in Haskell? (As far as I can tell, the HTTP server itself is just Apache.) I was worrying it might be PHP or something. If it's just Haskell maybe I'll go take a look at it...

On Nov 20, 2007, at 5:45 , Bulat Ziganshin wrote:
Hello Brandon,
Tuesday, November 20, 2007, 1:15:34 AM, you wrote:
The ability to "vote" on packages might be interesting here. If there's 4 HTML libraries and one of them gets lots of votes, it's probably the one to look at first.
it can be made easy and automatic by just publishing "number of downloads" on hackage
So if I download all 4 HTML libs to try to figure out which one fits best, I mod all four up? Seems wrong to me. -- brandon s. allbery [solaris,freebsd,perl,pugs,haskell] allbery@kf8nh.com system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu electrical and computer engineering, carnegie mellon university KF8NH

Brandon S. Allbery KF8NH wrote:
On Nov 20, 2007, at 5:45 , Bulat Ziganshin wrote:
it can be made easy and automatic by just publishing "number of downloads" on hackage
So if I download all 4 HTML libs to try to figure out which one fits best, I mod all four up? Seems wrong to me.
Also seems to have a tendency to make older libs look "better" than newer ones. (They will have been downloaded more just because they've been there longer.) I think download stats would be nice, but having a way for users to comment (and manually select a rating) would be nice too. Why not do all of them? :-D (Oh yeah - somebody still needs to write the code...)

On Mon, 19 Nov 2007, Mads Lindstrøm wrote:
It occurred to me that the voting could be implicit. That is, if 10 libraries/programs use library X, then library X gets 10 votes. Kind of like Google PageRank for libraries.
It would be good if users could comment verbally. They could comment like "efficient implementation, but weakly typed interface", "very general but not very well documented". One cannot express everything with a scalar popularity value.

the php documentation has "user contributed notes" where people can leave
snippets of useful code as comments, e.g.
http://www.php.net/manual/en/introduction.php
I think this is a very nice feature.

On Mon, Nov 19, 2007 at 05:27:30PM -0500, Thomas Hartman wrote:
the php documentation has "user contributed notes" where people can leave sniplets of useful code as comments, eg
http://www.php.net/manual/en/introduction.php
I think this is a very nice feature.
yup, for php it gives users a chance to suggest corrections for the numerous errors and omissions in the standard docs...:)

On Nov 19, 2007 11:27 PM, Thomas Hartman
the php documentation has "user contributed notes" where people can leave sniplets of useful code as comments, eg
I think this is a very nice feature.
I would love to have this on haskell, especially because the documentation often lacks examples

| > the php documentation has "user contributed notes" where people can leave | > sniplets of useful code as comments, eg | | > http://www.php.net/manual/en/introduction.php | | > I think this is a very nice feature. | | I would love to have this on haskell, especially because the | documentation often lack example(s) We've discussed this a couple of times at GHC HQ, at least in relation to GHC's user manual and library documentation. It's a *great* idea, because it allows everyone to improve the documentation. But we're just not sure how to do it: * What technology to use? * Matching up the note-adding technology with the existing infrastructure - GHC's user manual starts as XML and is generated into HTML by DocBook - In contrast, the library documentation is generated by Haddock. * Hardest of all: evolution. Both GHC's user manual and library docs change every release. Even material that doesn't change can get moved (e.g. section reorganisation). We don't want to simply discard all user notes! But it's hard to know how to keep them attached; after all they may no longer even be relevant. They almost certainly don't belong in the source-code control system. If someone out there knows solutions to these challenges, and would like to help implement them, we'd love to hear from you. Accurate documentation, with rich cross-links (e.g. to source code), and opportunities for the community to elaborate it, is a real challenge for a language the size of Haskell and its libraries. Simon

Simon Peyton-Jones
* Hardest of all: evolution. Both GHC's user manual and library docs change every release. Even material that doesn't change can get moved (e.g. section reorganisation). We don't want to simply discard all user notes! But it's hard to know how to keep them attached; after all they may no longer even be relevant. They almost certainly don't belong in the source-code control system.
I've had a discussion with AltLinux team members, who are trying to solve a similar problem: they need to mark user-provided content in their wiki as "obsolete" as time passes. The solution we've found so far looks like this:

- The author of a comment may change it at any time.
- Every comment has an 'added/modified for version X.Y' tag, which is updated to the current version when the author changes the comment (probably just saying "still valid for new version" in the interface).

This scheme may be too restrictive and can be relaxed, so that not only the author may change the comments, but other users too. -- JID: dottedmag@jabber.dottedmag.net
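To make that versioning scheme concrete, here is a tiny sketch of what such a tagged note might look like; the types and field names are invented for illustration, not an existing API:

    -- A user note remembers the documentation version it was last
    -- confirmed against; when the docs move on it is shown as possibly
    -- stale instead of being silently dropped.
    data Version = Version Int Int deriving (Eq, Ord, Show)

    data Note = Note
      { noteAuthor   :: String
      , noteText     :: String
      , confirmedFor :: Version  -- bumped when the author says "still valid"
      } deriving (Show)

    -- Is a note possibly out of date relative to the current docs?
    possiblyStale :: Version -> Note -> Bool
    possiblyStale current note = confirmedFor note < current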

Hi,
* Hardest of all: evolution. Both GHC's user manual and library docs change every release. Even material that doesn't change can get moved (e.g. section reorganisation). We don't want to simply discard all user notes! But it's hard to know how to keep them attached; after all they may no longer even be relevant. They almost certainly don't belong in the source-code control system.
Just a random idea: voting on notes (vote for 'inaccurate', 'deprecated' etc.)? Something along the lines of social sites like digg.com or others? Comments which reach some threshold could simply be hidden by default, because they are probably inaccurate or deprecated. And then a clean-up bot comes along once in a while, etc. Of course this would work for online docs only. Thanks, Radek. -- Codeside: http://codeside.org/ Przedszkole Miejskie nr 86 w Lodzi: http://www.pm86.pl/

| > the php documentation has "user contributed notes" | > http://www.php.net/manual/en/introduction.php | > I think this is a very nice feature.
* Hardest of all: evolution. Both GHC's user manual and library docs change every release. Even material that doesn't change can get moved (e.g. section reorganisation). We don't want to simply discard all user notes!
Just for comparison: PostgreSQL's site has a similar feature (http://www.postgresql.org/docs/manuals/). However, they don't appear to migrate user notes between documentation versions e.g. these pages from 8.1 have comments: http://www.postgresql.org/docs/8.1/interactive/rules.html http://www.postgresql.org/docs/8.1/interactive/user-manag.html the equivalent pages from 8.0 have no/different comments: http://www.postgresql.org/docs/8.0/interactive/rules.html http://www.postgresql.org/docs/8.0/interactive/user-manag.html So that could be a starting point, but it does seem a bit unsatisfactory. Alistair

Simon Peyton-Jones wrote:
| > the php documentation has "user contributed notes" where people can leave | > sniplets of useful code as comments, eg | | > http://www.php.net/manual/en/introduction.php | | > I think this is a very nice feature. | | I would love to have this on haskell, especially because the | documentation often lack example(s)
We've discussed this a couple of times at GHC HQ, at least in relation to GHC's user manual and library documentation. It's a *great* idea, because it allows everyone to improve the documentation.
But we're just not sure how to do it:
* What technology to use?
* Matching up the note-adding technology with the existing infrastructure - GHC's user manual starts as XML and is generated into HTML by DocBook - In contrast, the library documentation is generated by Haddock.
* Hardest of all: evolution. Both GHC's user manual and library docs change every release. Even material that doesn't change can get moved (e.g. section reorganisation). We don't want to simply discard all user notes! But it's hard to know how to keep them attached; after all they may no longer even be relevant. They almost certainly don't belong in the source-code control system.
If someone out there knows solutions to these challenges, and would like to help implement them, we'd love to hear from you. Accurate documentation, with rich cross-links (e.g. to source code), and opportunities for the community to elaborate it, is a real challenge for a language the size of Haskell and its libraries.
What technology to use, that's the *key* question. If we forget everything we can currently do with a computer and instead imagine what we could do, the answer would probably be: the documentation / source code can be edited directly while viewing it (i.e. Wiki + WYSIWYG). Moreover, it's possible to attach lots of Post-It® notes to sections / paragraphs / sentences with scribbled comments / questions / remarks about content / administrative tasks. Those notes can be hidden to get a clean view.

A wiki is rather centralized, so a form of decentralization / version control à la darcs is needed, at least for some parts like the source code. Last but not least, there's a tension between quality and "editable by everyone", so some form of access control is mandatory and further means to ensure quality are needed; that's the hard part.

The above ideal is entirely realizable, just not with existing technology like web-browsers / text editors. For instance, it's desirable to be able to edit source / haddock with a text editor like right now. But one would also like to edit it right in the (generalized) web-browser. Ideally, one could just pipe the underlying document through a lens

    data Lens s a = Lens { get :: s -> a, set :: a -> (s -> s) }

    text    :: Lens HaskellDocument ASCII
    browser :: Lens HaskellDocument Html

so that the edits in the view are reflected in the document. (Same for IDEs or GUIs or whatever).

Regards, apfelmus
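One small follow-up to the Lens sketch above (not from the original message): lenses of this shape compose, which is part of why they fit the "edit a view of a view" idea. A sketch, restating the Lens type for self-containment:

    data Lens s a = Lens { get :: s -> a, set :: a -> s -> s }

    -- A lens onto a view of a view: read through both lenses, and write
    -- back through the inner structure.
    compose :: Lens b c -> Lens a b -> Lens a c
    compose outer inner = Lens
      { get = get outer . get inner
      , set = \c a -> set inner (set outer c (get inner a)) a
      }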

Hello apfelmus, Tuesday, November 20, 2007, 1:10:26 PM, you wrote:
What technology to use, that's the *key* question. If we forget everything what we currently can do with a computer and instead imagine what we could do, the answer would probably be:
the system you described can be made possible with online editors such as Zoho or Google Docs -- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

On Tue, Nov 20, 2007 at 08:55:47AM +0000, Simon Peyton-Jones wrote:
But we're just not sure how to do it:
* What technology to use?
* Matching up the note-adding technology with the existing infrastructure - GHC's user manual starts as XML and is generated into HTML by DocBook - In contrast, the library documentation is generated by Haddock.
I would advocate using a comment system similar to the one at http://djangobook.com/. In the user manual, comments might be attached to sections and paragraphs of the document. Haddock already generates HTML anchors for every type, variable and class, so these are good candidates to attach user comments to.
* Hardest of all: evolution. Both GHC's user manual and library docs change every release. Even material that doesn't change can get moved (e.g. section reorganisation). We don't want to simply discard all user notes! But it's hard to know how to keep them attached; after all they may no longer even be relevant. They almost certainly don't belong in the source-code control system.
Comments in both the html user guide and the library docs could easily be cross-referenced to specific parts of the docbook/haskell source. The remaining part (and I admit, labour intensive) would be to take the notes into consideration while updating the documentation for the next release. This doesn't happen too often (once a year? but if I'm wrong please tell me), and I guess the whole point of accepting users' comments is to improve the docs - that is, to let writers address the issues in the next version. Now, examples illustrating use of library functions - that's a different story... Regards, -- Krzysztof Kościuszkiewicz Skype: dr.vee, Gadu: 111851, Jabber: kokr@jabberpl.org "Simplicity is the ultimate sophistication" -- Leonardo da Vinci

On Tue, 2007-11-20 at 12:33 +0000, Krzysztof Kościuszkiewicz wrote:
On Tue, Nov 20, 2007 at 08:55:47AM +0000, Simon Peyton-Jones wrote:
But we're just not sure how to do it:
* What technology to use?
* Matching up the note-adding technology with the existing infrastructure - GHC's user manual starts as XML and is generated into HTML by DocBook - In contrast, the library documentation is generated by Haddock.
I would advocate using a comment system that is similar to the one at http://djangobook.com/. In terms of user manual comments might be attached to sections and paragraphs in the document. Haddock already generates HTML anchors for every type, variable and class, so these are good candidates to attach user comments to.
I'm pretty sure Bryan O'Sullivan has written a Haskell implementation of this for the Real World Haskell book. I guess they will publish the source once the book is out. Maybe the RWH guys can give some feedback on how this works; they seem to have very similar goals. / Thomas

Thomas Schilling
I would advocate using a comment system that is similar to the one at http://djangobook.com/.
I'm pretty sure Brian O'Sullivan has written a Haskell implementation of this for the Real World Haskell book.
While the technology is there (or will be), I worry whether this is the right solution for anything other than soliciting comments on a (fixed, non-editable) text. I can all too easily imagine a situation where any documentation is riddled with a plethora of notes, questions, answers, comments etc, with nobody to clean up the mess every now and then. For user-edited documentation, a wiki seems a much better fit - where each author makes some effort to leave pages as self-contained, consistent documents. -k -- If I haven't seen further, it is by standing in the footprints of giants

On Tue, 2007-11-20 at 16:00 +0100, Ketil Malde wrote:
Thomas Schilling
writes: I would advocate using a comment system that is similar to the one at http://djangobook.com/.
I'm pretty sure Brian O'Sullivan has written a Haskell implementation of this for the Real World Haskell book.
While the technology is there (or will be), I worry if this is the right solution for something else than soliciting comments on a (fixed, non-editable) text.
I can all to easily imagine a situation where any documentation is riddled with a plethora of notes, questions, answers, comments etc, with nobody to clean up the mess every now and then. For user-edited documentation, a wiki seems a much better fit - where each author make some effort to leave pages as self-contained consistent documents.
Hm. The GHC user's guide currently is generated from a DocBook (XML-based) language, but when I extended the Cabal documentation (which also is DocBook) I wasn't very impressed by DocBook. It isn't particularly well-documented and editing raw XML is never fun, even with the right Emacs mode. One could hope that a standard format would come with many tools, but I didn't get the impression that the tools are great, either. They're also not easy to set up. (On Mac OS I had to manually add a symlink to fix a script. Installation only worked with fink, not with Macports. I wouldn't be surprised if it were even harder to set up on Windows. Ubuntu worked fine, though.) Using DocBook, however, has some nice advantages. For example, the possibility to generate documentation in different formats. Something more easily accessible (from the internet) would certainly be much more convenient, though. It would be nice, though, to preserve semantic markup. Aren't there some usable web-based WYSIWYG editors that edit XML rather than HTML? Also, it should be possible to have branches in this wiki-system, so that you can associate it with a particular release, but still update when necessary. (I.e., just linking to a particular version is not sufficient.) Does anyone know of a system that comes close to those requirements? Do we need more features from DocBook for GHC or the libraries, or both?

On 11/20/07 7:35 AM, Thomas Schilling wrote:
On Tue, 2007-11-20 at 16:00 +0100, Ketil Malde wrote:
Thomas Schilling
writes: I can all to easily imagine a situation where any documentation is riddled with a plethora of notes, questions, answers, comments etc, with nobody to clean up the mess every now and then. For user-edited documentation, a wiki seems a much better fit - where each author make some effort to leave pages as self-contained consistent documents.
Hm. The GHC user's guide currently is generated from a DocBook (XML-based) language, but when I extended the Cabal documentation (which also is DocBook) I wasn't very impressed by DocBook. It isn't particularly well-documented
Hi, [Disclosure: I'm a large part of O'Reilly's re-adoption of DocBook internally and a member of the OASIS DocBook SubCommittee for Publishers] I'm particularly surprised by this last sentence on the lack of documentation, as the DocBook standard has a definitive, comprehensive, freely available manual at http://www.docbook.org/tdg/en/html/docbook.html that I've always found very usable. Were there particular things that you missed?
and editing raw XML is never fun, even with the right Emacs mode. One could hope that a standard format would come with many tools, but I didn't get the impression that the tools are great, either.
The state of GUI XML editors has advanced significantly over the last year with the continued work on both XXE (http://www.xmlmind.com/xmleditor/) and oXygen's latest release (http://www.oxygenxml.com/docbook_editor.html), for example. That said, there are not as many tools for editing DocBook XML as HTML, for example.
Using DocBook, however, has some nice advantages. For example, the possibility to generate documentation in different formats. Something more easily accessible (from the internet) would certainly be much more convenient, though. It would be nice, though, to preserve semantic markup. Aren't there some usable web-based WYSIWYG editors that edit XML rather than HTML?
Not that I've found. I'd be delighted to hear about possibilities. Most web-based attempts start with the desire of writing in a plain-text wiki language and getting to XML. These seem to always fail on complex markup (tables, nested lists, code with internal markup).
Do we need more features from DocBook for GHC or the libraries, or both?
I'd be delighted to help anyone interested in extending GHC's docs to allow the sort of flexible commenting system Simon has outlined. Please don't hesitate to contact me directly. Regards, Keith

On Tue, 2007-11-20 at 12:03 -0800, Keith Fahlgren wrote:
On 11/20/07 7:35 AM, Thomas Schilling wrote:
On Tue, 2007-11-20 at 16:00 +0100, Ketil Malde wrote:
Thomas Schilling
writes: I can all to easily imagine a situation where any documentation is riddled with a plethora of notes, questions, answers, comments etc, with nobody to clean up the mess every now and then. For user-edited documentation, a wiki seems a much better fit - where each author make some effort to leave pages as self-contained consistent documents.
Hm. The GHC user's guide currently is generated from a DocBook (XML-based) language, but when I extended the Cabal documentation (which also is DocBook) I wasn't very impressed by DocBook. It isn't particularly well-documented
Hi,
[Disclosure: I'm a large part of O'Reilly's re-adoption of DocBook internally and a member of the OASIS DocBook SubCommittee for Publishers]
I'm particularly surprised by this last sentence on the lack of documentation, as the DocBook standard has a definitive, comprehensive, freely available manual at http://www.docbook.org/tdg/en/html/docbook.html that I've always found very usable. Were there particular things that you missed?
Right. I should have been more specific. I certainly like the idea of DocBook. But in an open source project documentation is written in small parts and by many different people. I personally didn't care to read a whole book just to be able to write a few pages of documentation. Thus I tried to use it as a reference. This worked reasonably well, but could have been a way more comfortable experience. Some quick-access / lookup table would have been nicer. Maybe also a little prettier than gray and standard link blue. (Even the W3C specs look rather nice.) My point is that, for a casual editor, trying to write or edit DocBook documents based on this book is rather tedious. I think my Emacs mode didn't do as nice completion as it should have (based on the DTD and everything.)
and editing raw XML is never fun, even with the right Emacs mode. One could hope that a standard format would come with many tools, but I didn't get the impression that the tools are great, either.
The state of GUI XML editors has advanced significantly over the last year with the continued work on both XXE (http://www.xmlmind.com/xmleditor/) and oXygen's latest release (http://www.oxygenxml.com/docbook_editor.html), for example. That said, there are not as many tools for editing DocBook XML as HTML, for example.
The latter is not available for free (only trial). The former seems to be free for non-commercial use. I haven't tried either (*Java Runtime rant elided*). The real problem remains: Even if you were to install a special program to (reasonably) edit a DocBook file, we still don't have the immediacy of a Wiki.
Using DocBook, however, has some nice advantages. For example, the possibility to generate documentation in different formats. Something more easily accessible (from the internet) would certainly be much more convenient, though. It would be nice, though, to preserve semantic markup. Aren't there some usable web-based WYSIWYG editors that edit XML rather than HTML?
Not that I've found. I'd be delighted to hear about possibilities.
There seem to be some. But I could only find commercial ones.

Hello Thomas, Tuesday, November 20, 2007, 6:35:00 PM, you wrote:
Using DocBook, however, has some nice advantages. For example, the possibility to generate documentation in different formats. Something more easily accessible (from the internet) would certainly be much more convenient, though.
zoho writer: online, not xml editor, but at least able to export into pdf/html/doc/.. -- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

On Wed, 2007-11-21 at 12:19 +0300, Bulat Ziganshin wrote:
Hello Thomas,
Tuesday, November 20, 2007, 6:35:00 PM, you wrote:
Using DocBook, however, has some nice advantages. For example, the possibility to generate documentation in different formats. Something more easily accessible (from the internet) would certainly be much more convenient, though.
zoho writer: online, not xml editor, but at least able to export into pdf/html/doc/..
It's not open source + it doesn't do what we need -> Bang! Also, they host the stuff for you, and only have limited room for free usage. Relying on a company is not the way to go here (think of BitKeeper). TinyMCE [1], however, seems like a good start. It is open source, seems relatively mature, supports plugins, and runs completely in JavaScript, thus should be independent from the server technology. I actually had something in mind like WYMeditor [2], but it seems not very mature yet. [1] .. http://tinymce.moxiecode.com/ [2] .. http://www.wymeditor.org/en/

Hello Thomas, Wednesday, November 21, 2007, 6:30:17 PM, you wrote:
zoho writer: online, not xml editor, but at least able to export into pdf/html/doc/..
It's not open source + it doesn't do what we need -> Bang! \also they host stuff for you, and only have limited room for free usage. Relying on a company is not the way to go here (think of BitKeeper).
this is actually a commercial service which provides some form of try-before-you-buy facility. GHC docs, naturally, are not the thing they will be happy to host :) but at least it's an example - an online wysiwyg editor with html/pdf output which may show our way into the future. as you've written, there are other online editors which are open-source and free, so we would be desirable "clients" for such a service/software. there is also google docs - providing fewer facilities than zoho; i can't compare it to other free variants, nor do i know whether it's okay to use it for free docs -- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

Krzysztof Kościuszkiewicz wrote:
I would advocate using a comment system that is similar to the one at http://djangobook.com/.
That's an appealing idea, but the devil lies in the details. I wrote just such a comment system for draft chapters of our book, and it's seen a lot of use. However, what I do is add ID tags to the DocBook source, and the XSLT processor passes those through to the final HTML. This isn't easily generalised to other tools, as each needs its own approach. An alternative is to embed identifiers in the generated HTML, but this is brittle in its own way. Few people generate HTML by hand, and most tools that do so have a habit of making huge changes to the output structure in response to minor user edits. Stably identifying chunks of text across multiple versions of a document is thus somewhat fiddly.

Neil Mitchell wrote:
- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
The problem is that only one person gets to comment on the quality of a library, the author, who is about the least objective person.
Not necessarily. CPAN has a nice voting system for packages, which is quite widely used. Another useful proxy for quality that CPAN is missing is download statistics. The maintainers handwave about this being due to the wide geographic distribution of mirrors, but I think that any download statistics would be better than none. Clearly, we can do both of these things with Hackage, and I think they'd be very useful (particularly the voting). Another small but useful thing that Hackage is missing is a notion of how fresh a package is. You have to hand-construct a URL to get a directory listing from Apache to find out how old a particular release tarball is.

On Mon, 2007-11-19 at 21:49 -0800, Bryan O'Sullivan wrote:
Neil Mitchell wrote:
- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
The problem is that only one person gets to comment on the quality of a library, the author, who is about the least objective person.
Not necessarily. CPAN has a nice voting system for packages, which is quite widely used.
Another useful proxy for quality that CPAN is missing is download statistics. The maintainers handwave about this being due to the wide geographic distribution of mirrors, but I think that any download statistics would be better than none.
I'd like to see hackage maintain download stats and have cabal-install report build success and failures (with build logs) along with basic config info like versions of deps, platform, compiler etc. This could make a great distributed testing system.
Clearly, we can do both of these things with Hackage, and I think they'd be very useful (particularly the voting). Another small but useful thing that Hackage is missing is a notion of how fresh a package is. You have to hand-construct an URL to get a directory listing from Apache to find out how old a particular release a tarball is.
Yes, I would like to see activity info for each package. What I'd really like to see is each package linking to its darcs repo and generating an activity graph using dons's darcs-graph program. Though I'd also like to annotate the graph with marks for each release. Duncan
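A sketch of the kind of report cabal-install could send back, roughly along the lines Duncan describes a few lines up; the type and field names here are invented for illustration, not an existing cabal-install format:

    -- Invented shape of an anonymous build report.
    data Outcome = BuildOk | BuildFailed | DependencyFailed
      deriving (Show)

    data BuildReport = BuildReport
      { package      :: String        -- e.g. "HDBC-odbc-1.1.4.0"
      , outcome      :: Outcome
      , compiler     :: String        -- e.g. "ghc-6.8.1"
      , platform     :: String        -- e.g. "i386-unknown-mingw32"
      , dependencies :: [String]      -- exact versions chosen for the build
      , logExcerpt   :: Maybe String  -- tail of the build log on failure
      } deriving (Show)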

Some random thoughts triggered by this thread

1. I've been bowled over by the creativity unleashed by having a central site (Hackage), with a consistent installation story (Cabal), where you can upload packages with no central intervention. A single issue of the Haskell Weekly (sic) News with 60 library announcements represents a qualitative shift from the Haskell situation 2 years ago. That is fantastic.

2. We absolutely must not conflate GHC releases with QA-stamped library bundles. The latter would be great, but the two must be separate. (For reasons given by others in this thread.)

3. I think it'd be great if there were bundles of libraries that work together, are available on multiple platforms, and have had some QA testing. (Sounds as if releasing such bundles on a regular basis is the Gnome model.) It's not clear to me that anyone is actually volunteering to lead such a thing though.

4. Meanwhile, we could get a lot more mileage from de-centralised approaches. Ideas I saw in this thread that sound attractive to me are to make Hackage display, for each package:
   - date of last update
   - download statistics
   - some kind of voting scores, so users can vote for good packages (and add text comments, please)
   - auto-build system, so that there's a per-platform indication of whether the package builds; ideally, packages should come with a test suite, which could be run too
   (Is this list complete?)

These things (or some subset) look more feasible to me, because they can each be done with a finite effort, and then computers and library users will do the rest.

Simon

On Wed, 2007-11-21 at 10:59 +0000, Simon Peyton-Jones wrote:
Some random thoughts triggered by this thread
1. I've been bowled over by the creativity unleashed by having a central site (Hackage), with a consistent installation story (Cabal), where you can upload packages with no central intervention. A single issue of the Haskell Weekly (sic) News with 60 library announcements represents a qualitative shift from the Haskell situation 2 years ago. That is fantastic.
Yes, it's been amazingly successful. Partly it's new libs, partly it's things that have been sitting around on various home pages or peoples local disks. Both are great of course.
2. We absolutely must not conflate GHC releases with QA-stamped library bundles. The latter would be great, but the two must be separate. (For reasons given by others in this thread.)
Yes or GHC HQ would go insane.
3. I think it'd be great if there were bundles of libraries that work together, are available on multiple platforms, and have had some QA testing. (Sounds as if releasing such bundles on a regular basis is the Gnome model.) It's not clear to me that anyone is actually volunteering to lead such a thing, though.
At the moment I think it'd be too much effort so we're not likely to get volunteers. However if we work on more hackage infrastructure I think it should be possible to reduce the effort required to the point where it'd be feasible. There is a separate discussion to be had about what kind of QA standards we might want.
4. Meanwhile, we could get a lot more mileage from de-centralised approaches. Ideas I saw in this thread that sound attractive to me are to make Hackage display, for each package:
- date of last update
- download statistics
- some kind of voting scores, so users can vote for good packages (and add text comments, please)
- auto-build system, so that there's a per-platform indication of whether the package builds; ideally, packages should come with a test suite, which could be run too
(Is this list complete?)
Those are the major things I think. We should file hackage feature requests for each of them. I'd also like to add links to darcs repos and possibly to bug trackers. These are simple extra fields in a .cabal file that hackage can create links for on the package page. There is also the stuff about letting hackage document api changes by comparing the apis of releases. This also relates to the package version policy. It may be implemented via haddock.
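For instance, the extra fields might look something like this in a package description; the "x-" field names below are invented for this sketch and are not fields that Hackage currently recognises:

    name:           my-library
    version:        0.1
    synopsis:       An example package description
    homepage:       http://example.org/my-library
    -- hypothetical extra fields that Hackage could turn into links:
    x-darcs-repo:   http://example.org/repos/my-library
    x-bug-tracker:  http://example.org/trac/my-library

Hackage would only need to read the extra fields off the uploaded .cabal file and render them as links on the package page.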
These things (or some subset) look more feasible to me, because they can each be done with a finite effort, and then computers and library users will do the rest.
Yes. I especially like the download stats and stats on development and release activity; that should be easy. For testing I'd like to see cabal-install report build success/failure and have summaries of that information presented on each package's Hackage page. That's a slightly harder project. It should allow us to gather an enormous amount of information, however, so it's probably worth it. Duncan
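To make the build-report idea concrete, the per-build information a client could send back might be no more than the following; all of the names here are hypothetical, since no such reporting exists in cabal-install today:

    -- All names below are made up for illustration; this is not an
    -- existing cabal-install or Hackage interface.
    data Outcome = BuildOk | BuildFailed | DependenciesFailed
      deriving (Show, Read, Eq)

    data BuildReport = BuildReport
      { pkgId    :: String          -- e.g. "HDBC-odbc-1.1.4" (illustrative)
      , outcome  :: Outcome
      , compiler :: String          -- e.g. "ghc-6.8.1"
      , platform :: String          -- e.g. "i386-mingw32"
      , depsUsed :: [String]        -- exact versions the resolver picked
      , logFile  :: Maybe FilePath  -- build log, perhaps only on failure
      }
      deriving (Show, Read)

Hackage would then only need to aggregate such records per package, compiler and platform to produce the summaries described above.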

Duncan Coutts
4. Meanwhile, we could get a lot more mileage from de-centralised approaches. Ideas I saw in this thread that sound attractive to me are to make Hackage display, for each package:
- date of last update
- download statistics
- some kind of voting scores, so users can vote for good packages (and add text comments, please)
- auto-build system, so that there's a per-platform indication of whether the package builds; ideally, packages should come with a test suite, which could be run too
(Is this list complete?)
Those are the major things I think.
No Google page rank-alike? I did a quick popularity count by wget'ting the whole thing, and looking for hrefs under cgi-bin/packages/archive¹. Not exact, as it counts links to the previous version, but a rough approximation. Page rank would be better, as it would ascribe higher importance to a library that is required by a more popular library. Anyway, quick and inaccurate results: 1 cabalrpmdeps 1 compression 1 dfsbuild 1 dfsbuild 1 EdisonAPI 1 exif-1 1 exif-3000 1 haxr-1 1 haxr-th-1 1 haxr-th-3000 1 hmarkup-1 1 hmarkup-3000 1 hscolour 1 hsql-mysql 1 hsql-odbc 1 hsql-postgresql-1 1 MonadRandom 1 packedstring 1 proplang 1 rss-1 1 rss-3000 1 Shellac-readline 1 vty 1 xslt 2 AGI 2 ALUT 2 anydbm 2 AppleScript 2 BerkeleyDB 2 BitSyntax 2 catch 2 chunks-2007 2 ContArrow 2 cpphs 2 csv 2 darcs-graph 2 debian 2 dsp 2 fastcgi-1 2 fastcgi-3000 2 ftphs 2 functorm 2 GLUT 2 GuiTV 2 harchive 2 hburg 2 HGL 2 hjs 2 hS3 2 HsHyperEstraier 2 HsSVN 2 hstats 2 IFS 2 infinity 2 IOSpec 2 libmpd 2 libxml 2 LRU 2 metaplug 2 monad-param 2 network-bytestring 2 NewBinary 2 parsedate 2 parsely 2 pointfree 2 ports 2 PostgreSQL 2 Ranged-sets 2 safecopy 2 selenium 2 Shellac 2 state 2 stream-fusion 2 strict 2 uniplate 2 Win32 2 X11-xft 3 alex 3 base64-string 3 fastcgi-3001 3 fgl 3 happy 3 haskelldb-hdbc 3 haxr-3000 3 HDBC-odbc 3 hsql-sqlite3 3 html 3 MaybeT 3 pqc 3 xhtml-1 4 Crypto-3 4 HDBC-postgresql 4 HPDF-0 4 HSH 4 HTTP 4 hxt 4 OpenAL 4 plugins 4 polyparse 4 readline 4 TypeCompose 5 gd 5 haskelldb-hsql 5 HTTP-Simple 5 iconv 5 pretty 6 Emping 6 encoding 6 GrowlNotify 6 hmp3 6 HsSyck 6 HTTP-3000 6 IndentParser 6 ipprint 6 logict 6 mime-string 6 monadLib 6 process 6 random 6 stringsearch 6 suffixtree 6 torrent-2007 7 cgi 7 cgi-3000 7 DeepArrow 7 HCL 7 stm 7 syb-with-class 7 tar 8 haskell-src 8 HDBC-sqlite3 8 hpodder-0 8 old-locale 8 TV 9 bencode 9 OpenGL 9 utf8-string 10 arrows 10 SDL 11 ConfigFile 11 dlist 11 hsql 12 bktrees 12 bzlib 12 directory 12 FileManip 12 Finance-Quote-Yahoo 12 haskelldb 12 hinstaller-2007 12 hpodder-1 12 hsns 12 hsSqlite3 12 numbers-2007 12 old-time 12 phooey-0 12 phooey-1 13 irc 15 cabal-upload 16 HPDF-1 17 hslogger 18 HaXml 18 HDBC 19 HUnit 20 HsOpenSSL 20 pandoc 20 regex-pcre 20 sessions-2007 20 xmonad 20 YamlReference 21 containers 21 Stream 21 unix-compat 22 template-haskell 23 zlib 24 array 24 X11-extras 25 gd-3000 26 MissingH 28 QuickCheck 28 xhtml-3000 31 time 32 Cabal 35 pcap 36 HTTP-3001 39 regex-posix 40 binary 40 regex-compat 41 X11 42 xmobar 43 bytestring 47 cgi-3001 52 regex-base 56 cabal-rpm 59 filepath 59 unix 101 haskell98 101 parsec 107 network 197 mtl -k -- If I haven't seen further, it is by standing in the footprints of giants
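For anyone who wants to reproduce this, a rough equivalent of the grep-based count can be done in a few lines of Haskell rather than with wget and grep; the file name below is simply whatever the Hackage pages were saved as, and the counting is just as approximate as the numbers above:

    import Data.List (isPrefixOf, sortBy, tails)
    import Data.Ord (comparing)
    import qualified Data.Map as Map

    -- Count links to cgi-bin/packages/archive/<pkg>/... in a saved copy of
    -- the Hackage pages.  As rough as the wget/grep version: it counts every
    -- link, including links to older versions of the same package.
    countLinks :: String -> [(String, Int)]
    countLinks html =
        sortBy (comparing snd) $ Map.toList $ Map.fromListWith (+)
          [ (takeWhile (`notElem` "/\"") (drop (length prefix) t), 1)
          | t <- tails html
          , prefix `isPrefixOf` t
          ]
      where
        prefix = "cgi-bin/packages/archive/"

    main :: IO ()
    main = do
      html <- readFile "hackage.html"   -- hypothetical file saved by wget
      mapM_ (\(pkg, n) -> putStrLn (show n ++ " " ++ pkg)) (countLinks html)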

On Wed, 2007-11-21 at 14:57 +0100, Ketil Malde wrote:
No Google page rank-alike?
I did a quick popularity count by wget'ting the whole thing, and looking for hrefs under cgi-bin/packages/archive¹. Not exact, as it counts links to the previous version, but a rough approximation. Page rank would be better, as it would ascribe higher importance to a library that is required by a more popular library.
Anyway, quick and inaccurate results:
That's quite fascinating. Thanks. You've convinced me we should add something like that :-). Please file a feature request: http://hackage.haskell.org/trac/hackage/ Duncan

Duncan Coutts
I did a quick popularity count by wget'ting the whole thing, and looking for hrefs under cgi-bin/packages/archive¹.
That's quite fascinating. Thanks. You've convinced me we should add something like that :-).
Note that that was only a direct count; I haven't implemented a real "library rank".
Please file a feature request: http://hackage.haskell.org/trac/hackage/
This okay? http://hackage.haskell.org/trac/hackage/ticket/183 -k -- If I haven't seen further, it is by standing in the footprints of giants

On Nov 21, 2007 5:59 AM, Simon Peyton-Jones
2. We absolutely must not conflate GHC releases with QA-stamped library bundles. The latter would be great, but the two must be separate. (For reasons given by others in this thread.)
Someone in a previous thread made an analogy between GHC and the linux
kernel. I imagine that third-party Haskell distributions, consisting
of GHC/Hugs/whatever and some bundled packages, would meet the desire
for a "batteries included" Haskell implementation without tying the
most popular libraries to GHC releases.
--
Dave Menendez

"David Menendez"
Someone in a previous thread made an analogy between GHC and the linux kernel. I imagine that third-party Haskell distributions, consisting of GHC/Hugs/whatever and some bundled packages, would meet the desire for a "batteries included" Haskell implementation without tying the most popular libraries to GHC releases.
Well - the various Linux distributions certainly could do this - providing a virtual "haskell-libs" package that just pulls in a bunch of commonly used packages. It'd be nice, of course, if that package was reasonably consistent across distributions, and if there were a corresponding installer for those other operating systems. -k -- If I haven't seen further, it is by standing in the footprints of giants

Ketil Malde wrote:
"David Menendez"
writes: Someone in a previous thread made an analogy between GHC and the linux kernel. I imagine that third-party Haskell distributions, consisting of GHC/Hugs/whatever and some bundled packages, would meet the desire for a "batteries included" Haskell implementation without tying the most popular libraries to GHC releases.
Well - the various Linux distributions certainly could do this - providing a virtual "haskell-libs" package that just pulls in a bunch of commonly used packages. It'd be nice, of course, if that package was reasonably consistent across distributions, and if there were a corresponding installer for those other operating systems.
Meta-packages on hackage would do the trick, no? Regards, apfelmus

On Wed, Nov 21, 2007 at 10:59:21AM +0000, Simon Peyton-Jones wrote:
(Is this list complete?)
i would like to see some feedback (voting/scoring/message board) system for gauging interest in needed/missing/incomplete functionality. my primary concern from the start of the thread was filling holes in the libraries

Simon Peyton-Jones wrote:
Some random thoughts triggered by this thread
1. I've been bowled over by the creativity unleashed by having a central site (Hackage), with a consistent installation story (Cabal), where you can upload packages with no central intervention. A single issue of the Haskell Weekly (sic) News with 60 library announcements represents a qualitative shift from the Haskell situation 2 years ago. That is fantastic.
I wasn't here 2 years ago. I'll take your word. :-)
2. We absolutely must not conflate GHC releases with QA-stamped library bundles. The latter would be great, but the two must be separate. (For reasons given by others in this thread.)
It's got my vote...
3. I think it'd be great if there were bundles of libraries that work together, are available on multiple platforms, and have had some QA testing. (Sounds as if releasing such bundles on a regular basis is the Gnome model.) It's not clear to me that anyone is actually volunteering to lead such a thing, though.
I would suggest that this depends on just how much work is going to be involved.
4. Meanwhile, we could get a lot more mileage from de-centralised approaches. Ideas I saw in this thread that sound attractive to me are to make Hackage display, for each package:
- date of last update
- download statistics
- some kind of voting scores, so users can vote for good packages (and add text comments, please)
- auto-build system, so that there's a per-platform indication of whether the package builds; ideally, packages should come with a test suite, which could be run too
(Is this list complete?) These things (or some subset) look more feasible to me, because they can each be done with a finite effort, and then computers and library users will do the rest.
I'm going to throw a few more in...
- I see that HackageDB shows me "other versions" of each package (= Good Thing). I don't see a changelog, or any way to easily determine what actually changed between versions. Am I being blind, or is this something we should think about adding?
- Someone else already suggested this, but... should Hackage host a bug tracker for individual packages too? (Would potentially make it easier to figure out where to post bugs in Random Package X.)
- Linking to darcs? Actual darcs hosting? (Maybe make it even more trivial to press a button to say "package what I've got in darcs right now as version X".)
Just some ideas.

andrewcoppin:
Hackage seems like a nice idea in principle. However,
I think in practice too: we had no central lib archive or dependency system, and now, 10 months later, we have 400 libraries and a package installer. Until Hackage, there was a strong pressure not to reuse other people's libraries.
- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
1. Welcome to the internet.
- Almost all packages seem to require a long list of dependencies.
2. Solved with cabal:
    cabal install foo
Resolves package deps. Reusing libraries is a good thing.
- There seems to be an awful lot of packages that do the same thing but with incompatible interfaces (and varying limitations). It seems we're not very coordinated here.
See #1.
- (And, since I'm on Windows, I can't seem to get anything to install with Cabal...)
3. Report a bug. We need more developers on Windows, for Windows support to improve. -- Don

On Mon, 2007-11-19 at 12:17 -0800, Don Stewart wrote:
andrewcoppin:
Hackage seems like a nice idea in principle. However,
I think in practice too: we had no central lib archive or dependency system, now we have 400 libraries, and a package installer, 10 months later. Until Hackage, there was a strong pressure not to reuse other people's libraries.
- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
1. Welcome to the internet.
Sure, but this doesn't mean we couldn't implement some mechanisms to improve the situation. Some things I could think of:
- Have Hackage display information on whether a package builds on a particular platform. Information could be provided by cabal-install (if the user agrees, of course).
- Allow uploaded packages to receive minor patches, i.e., fixing the .cabal file. This will probably be a recurring problem, since packages will be updated and base will be split up further.
- I don't know if a commenting system on hackage would be more useful than on a package's homepage. At least it would be useful to have a package homepage and bug-tracker for each package. Both could simply be a code.google.com site.
Altogether, I'm quite happy with Hackage. There's room for improvement, sure, but I think we're on the right track ...

Don Stewart wrote:
andrewcoppin:
Hackage seems like a nice idea in principle. However,
I think in practice too: we had no central lib archive or dependency system, now we have 400 libraries, and a package installer, 10 months later.
Hackage is that new??
- The packages seem to be of quite variable quality. Some are excellent, some are rather poor (or just not maintained any more).
1. Welcome to the internet.
Well, yeah, I guess. ;-) As others have suggested, maybe a rating system or space for comments or something... [all very easy for the person who doesn't have to implement it.]
- (And, since I'm on Windows, I can't seem to get anything to install with Cabal...)
3. Report a bug. We need more developers on windows, for window to improve.
Where is the correct place for Cabal bugs? (And presumably "it doesn't work" wouldn't be a very helpful bug report. How about something more like my email just now regarding how to get the new Stream Fusion library built on Windows? Is that useful data to have?) Who's actually responsible for Cabal? Is it the GHC guys, or someone else entirely? (I still don't have a really clear idea of what Cabal does. When I give people Haskell code, they just compile it and use it. I'm not sure exactly what functionality Cabal is supposed to add to the equation.)

On Mon, 2007-11-19 at 21:22 +0000, Andrew Coppin wrote:
Where is the correct place for Cabal bugs?
This and other questions are explained at .. *drumroll* .. the Cabal Homepage!! -- http://www.haskell.org/cabal/ :)

Hello Andrew, Monday, November 19, 2007, 10:47:49 PM, you wrote:
- (And, since I'm on Windows, I can't seem to get anything to install with Cabal...)
with ghc 6.4/6.6 and their built-in Cabal version, i've never seen problems. sorry, i can't say anything about 6.8 and the new Cabal -- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

At Mon, 19 Nov 2007 10:25:40 -0800, brad clawsie wrote:
so far the haskell community has taken the cpan route for most practical libs but i wonder if a "batteries included" approach might help get some key libraries to a more complete state. in particular, i would like to see support for basic internet protocols, database connectivity, and potentially xml parser support rolled into the ghc standard libs.
The recent trend has been to roll less into the ghc standard libs. There are a number of reasons for this. People complain it takes too long to build all of GHC, and it makes it harder to port GHC to embedded platforms where many of the libraries are not needed or are hard to support. Libraries tend to only get updated as often as GHC (which is far too infrequently for many libraries). What if, instead of rolling the libraries into GHC, they were rolled into task-specific bundles? For example, a bundle of internet libraries (http, xml, etc), a bundle of unix support libraries, etc. These bundles could be created and managed by third parties. The bundles might just be references to specific versions of packages already in hackage? This would give you the power that comes from branding certain packages as recommended, but with (hopefully) less baggage than putting them in the GHC distribution itself? j.
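One low-tech way to express such a bundle with nothing but today's Cabal would be a meta-package that contains no code of its own and simply depends on pinned versions of its constituents. The package name and the version numbers below are invented for illustration:

    name:          internet-bundle
    version:       2007.11
    synopsis:      Example bundle of internet-related libraries
    cabal-version: >= 1.2
    build-type:    Simple

    library
      -- no modules of its own; installing it just pulls in the bundle
      build-depends: base, HTTP == 3001.0.4, HaXml == 1.13.2, network == 2.1.0.0

Installing the bundle via cabal-install would then drag in exactly those versions, which is roughly the "references to specific versions of packages already in hackage" idea, without any new infrastructure.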

On Mon, 19 Nov 2007, brad clawsie wrote:
i would categorize myself as a purely practical programmer. i enjoy using haskell for various practical tasks and it has served me reliably. one issue i have with the library support for practical problem domains is the half-finished state of many fundamental codebases such as networking and database support.
in the perl world, support for these domains is provided through cpan, and this model is viable due to the (once) massive number of perl coders out there. in the java, c# etc world, a "batteries included" approach implies a narrowing of options, but also an immediate delivery of functionality.
Although I liked the "battery included" approach of GHC so far, I install more and more interesting "additional" libraries, and GHC is shipped with libraries I have never used. A library being shipped with GHC has the aura of being standardized. However also these standard libraries changed, and I find the interface of some of them not satisfying. Since I expect that there will never be a consensus on the definition of "relevance" or even "right interface" I think that the Hackage approach of atomic libraries is the best solution for the future. It allows me to import libraries according to my needs, whereas today the decision is often directed by "what is standard?". With this solution we wouldn't have had the FiniteMap break, we could choose more equally between different data structure collections (say Edison vs. GHC libs), monad libraries, and so on.

whereas today the decision is often directed by "what is standard?". With this solution we wouldn't have had the FiniteMap break, we could choose more equally between different data structure collections (say Edison vs. GHC libs), monad libraries, and so on.
this is a good point..."blessing" one library can have a chilling impact on interesting alternatives. the exception i would make here is when you are coding to a known interface or protocol, in which case matching the spec largely defines the coding exercise.

On Mon, 2007-11-19 at 10:25 -0800, brad clawsie wrote:
i would categorize myself as a purely practical programmer. i enjoy using haskell for various practical tasks and it has served me reliably. one issue i have with the library support for practical problem domains is the half-finished state of many fundamental codebases such as networking and database support.
So far I am pretty happy with the progress we've been making with hackage. It has massively increased the number of packages that are easily available. Most of our problems with it are down to it being successful, so we now need more infrastructure for searching and for helping users gauge stability, whether packages work in various circumstances, etc. I think these mostly have technical solutions.
That said, I think there is a place for a Haskell development platform. This should not be confused with GHC, though GHC obviously takes a central place in our standard tool chain. Managing GHC releases has become increasingly difficult, so we should continue the trend to reduce the size of GHC releases and not try to synchronise them with the release of every other part of our tool chain.
I would like to compare this to the GNOME development platform. It has Gtk+ at its heart, but GNOME releases are not synchronised with Gtk+ releases. The GNOME development platform consists of a collection of standard packages. The collection is released on a time-based schedule, not a feature-based one. It puts a QA stamp on specific versions of its constituent packages that are known to work together. It has a procedure for getting packages included, which includes standards of API design and documentation. There is an infrastructure for maintaining, testing and releasing this platform.
This is a model I think we should consider seriously. Duncan

I would like to compare this to the GNOME development platform. It has Gtk+ at its heart, but GNOME releases are not synchronised with Gtk+ releases. The GNOME development platform consists of a collection of standard packages. The collection is released on a time-based schedule, not a feature-based one. It puts a QA stamp on specific versions of its constituent packages that are known to work together. It has a procedure for getting packages included, which includes standards of API design and documentation. There is an infrastructure for maintaining, testing and releasing this platform.
This is a model I think we should consider seriously.
This sounds like something worth trying to me. I'm trying to think of libraries I would definitely want to see in such a collection. bytestring jumps to mind immediately. A fast and stable HTTP package based on bytestring that supports developing web servers would be very nice too (I'll work more on this when I have time). The idea is basically that you implement a function of type:
  myApp :: Application
where
  type Environ = Map ByteString ByteString
  type Headers = [(ByteString, ByteString)]
  type Application = Environ -> IO (Headers, ByteString)
or something along those lines (i.e. a stream of user data is missing in the above example). The point is that I want to be able to do:
  import Network.WAI.Server (simpleServer)
  main = simpleServer myApp
and have it just work. Sorry for the ramble. :) Cheers, Johan
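As a concrete illustration of the interface sketched above: Network.WAI.Server and simpleServer are hypothetical names from the message, not an existing library, so the sketch below just repeats the same types, defines a toy application, and exercises it directly in main:

    import qualified Data.ByteString.Char8 as B
    import qualified Data.Map as Map
    import Data.Map (Map)

    -- The hypothetical types from the message, repeated so the sketch
    -- is self-contained.
    type Environ     = Map B.ByteString B.ByteString
    type Headers     = [(B.ByteString, B.ByteString)]
    type Application = Environ -> IO (Headers, B.ByteString)

    -- A toy application: echo the request path back to the client.
    myApp :: Application
    myApp env = do
      let path = Map.findWithDefault (B.pack "/") (B.pack "PATH_INFO") env
          body = B.append (B.pack "You asked for ") path
      return ([(B.pack "Content-Type", B.pack "text/plain")], body)

    -- With the hypothetical server module this would just be
    --   main = simpleServer myApp
    -- Since that module does not exist, call the application directly:
    main :: IO ()
    main = do
      (hdrs, body) <- myApp (Map.fromList [(B.pack "PATH_INFO", B.pack "/hello")])
      print hdrs
      B.putStrLn body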

Yes, those are good points. Maybe adding functionality similar to plt's planet http://planet.plt-scheme.org and http://download.plt-scheme.org/doc/371/html/mzscheme/mzscheme-Z-H-5.html#nod... In plt scheme, including a module not present in the local repository but available via planet resolves the module (including its version), downloads it from planet, and uses it appropriately. It makes following various dependencies extremely easy. Updating to a new version is just a matter of updating the appropriate local module definitions. I have no clue how it would be best to implement this for haskell, but it is a very user-friendly, no-hassle way to work, so I reckon it's worth investigating. Cheers, Vlado

On Tue, Nov 20, 2007 at 12:33:21 +0000, Vladimir Zlatanov wrote:
Yes, those are good points. Maybe adding functionality similar to plt's planet http://planet.plt-scheme.org and http://download.plt-scheme.org/doc/371/html/mzscheme/mzscheme-Z-H-5.html#nod...
In plt scheme including a module, not present in the local repository , but included via planet, resolves the module, including version, etc..., downloads it from planet, and uses it appropriately. It makes following various dependencies extremely easy. Updating with a new version is updating the appropriate local module definitions.
I have no clue how it would be best to implement this for haskell, but it is a very user friendly no hassle way to work, so I reckon worth investigating.
Many other programming languages have packaging strategies that sound very similar. Several of them have managed to have a negative impact on platforms that already have good packaging technologies (i.e. almost every platform apart from Windows ;-). I'd hate to see Haskell go in a direction where packaging for e.g. Debian is made more difficult than it is at the moment. See [1] for the Debian Ruby packagers' opinion of RubyGems. IIRC similar concerns have been raised for Python's eggs. /M [1]: http://pkg-ruby-extras.alioth.debian.org/rubygems.html -- Magnus Therning (OpenPGP: 0xAB4DFBA4) magnus@therning.org Jabber: magnus.therning@gmail.com http://therning.org/magnus

On Wednesday 21 November 2007 20:14, Magnus Therning wrote:
Many other programming languages have packaging strategies that sound very similar. Several of them have managed to have a negative impact on platforms that already have good packaging technologies (i.e. almost every platform apart from Windows ;-). I'd hate to see Haskell go in a direction where packaging for e.g. Debian is made more difficult than it is at the moment.
Yes, OCaml has a custom package manager called GODI and debs for Debian, Ubuntu and Mac OS X. I don't think that works too well but trying to tell everyone to use apt doesn't seem to work either (especially at Microsoft). People have also suggested something like CPAN for Perl. Incidentally, I recently read of a harsh test suite someone used from CPAN that deleted his root filesystem when his software failed a test. Perhaps CPAN will facilitate the natural selection of Perl programmers next... :-) -- Dr Jon D Harrop, Flying Frog Consultancy Ltd. http://www.ffconsultancy.com/products/?e

Magnus Therning wrote:
On Tue, Nov 20, 2007 at 12:33:21 +0000, Vladimir Zlatanov wrote:
Yes, those are good points. Maybe adding functionality similar to plt's planet http://planet.plt-scheme.org and http://download.plt-scheme.org/doc/371/html/mzscheme/mzscheme-Z-H-5.html#nod...
In plt scheme including a module, not present in the local repository , but included via planet, resolves the module, including version, etc..., downloads it from planet, and uses it appropriately. It makes following various dependencies extremely easy. Updating with a new version is updating the appropriate local module definitions.
I have no clue how it would be best to implement this for haskell, but it is a very user friendly no hassle way to work, so I reckon worth investigating.
Many other programming languages have packaging strategies that sound very similar. Several of them have managed to have a negative impact on platforms that already have good packaging technologies (i.e. almost every platform apart from Windows ;-). I'd hate to see Haskell go in a direction where packaging for e.g. Debian is made more difficult than it is at the moment.
See [1] for the Debian Ruby packagers' opinion of RubyGems. IIRC similar concerns have been raised for Python's eggs.
/M
[1]: http://pkg-ruby-extras.alioth.debian.org/rubygems.html
Much of that's either outdated or just plain wrong, as I understand it. In the interest of balance, note the following thread on ruby-talk, which devolved fairly rapidly into a bunfight over Debian's policies (and a comparison with Apple's approach to the same problem):
http://www.nabble.com/-ANN--RubyGems-0.9.5-tf4840470.html There are arguments on both sides, but the utility of having RubyGems available far outweighs the minor inconvenience of having to install RubyGems outside apt as far as I'm concerned. -- Alex

On Wed, Nov 21, 2007 at 08:14:09PM +0000, Magnus Therning wrote:
Many other programming languages have packaging strategies that sound very similar. Several of them have managed to have a negative impact on platforms that already have good packaging technologies (i.e. almost every platform apart from Windows ;-). I'd hate to see Haskell go in a direction where packaging for e.g. Debian is made more difficult than it is at the moment.
There's little danger of that, given the involvement of Duncan (a Gentoo developer) and Ian (a Debian developer).

<snip>
Many other programming languages have packaging strategies that sound very similar. Several of them have managed to have a negative impact on platforms that already have good packaging technologies (i.e. almost every platform apart from Windows ;-). I'd hate to see Haskell go in a direction where packaging for e.g. Debian is made more difficult than it is at the moment.
:) That would be very bad indeed. With careful implementation that shouldn't be a problem.
My reference to planet was regarding the ease of use and download of packages not installed in the system. That is integrated into the compiler via scheme macros or their equivalent in "non-scheme" languages - just consider the following:
  (require (planet "eval.ss" ("dherman" "javascript.plt" 5 4)))
If a package javascript version 5 4 by dherman is not present in your system or your user planet cache -> download and compile it, then proceed doing your normal compiler duties.... With distribution-provided packages, they will be present if installed by root. With user-downloaded ones, they go somewhere in home or whatever other sensible place is pointed to in the environment. I simply don't see a conflict. It is the normal unix way. Being lazy is good after all. Having the tools infer all dependencies and provide them to you when you need them is a good thing as well. Just compare it with: sudo apt-get xyz; echo "I don't have root, I'll call dad, 'cause I want to compile"
I'm not sure if it is currently possible to implement that via template haskell. From the snippets I've glimpsed it implements a simple defmacro-like mechanism. But does it allow executing actions at compile time? If yes, it can be done in a library and then tested for usability, packaging and destruction. I do think it is wrong to have it in the prelude, at least for quite a while, but having the option to do it is a plus. I think I went overboard with this. Sorry for the longish post.
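On the Template Haskell question: a splice can run IO at compile time via runIO, so the planet-style behaviour could at least be prototyped as a library. Below is a minimal sketch of just the compile-time-IO part; the cache layout and the requirePackage name are made up for this sketch, and the actual downloading is left as a stub:

    {-# LANGUAGE TemplateHaskell #-}
    module RequirePlanet (requirePackage) where

    import Language.Haskell.TH
    import System.Directory (doesDirectoryExist)

    -- At compile time, check whether a package's sources are already in a
    -- local cache; a real implementation would download and build them
    -- here.  The cache path is an invented convention for this sketch.
    requirePackage :: String -> Q [Dec]
    requirePackage name = do
      cached <- runIO $ doesDirectoryExist ("planet-cache/" ++ name)
      runIO $ putStrLn $ if cached
                           then "using cached " ++ name
                           else "would fetch " ++ name ++ " here"
      return []   -- no declarations are generated in this sketch

    -- Usage in another module:
    --   $(requirePackage "javascript-5.4")

Whether compile-time downloading is a good idea is, as noted above, a separate question from whether it is possible.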

Magnus Therning wrote:
Many other programming languages have packaging strategies that sound very similar. Several of them have managed to have a negative impact on platforms that already have good packaging technologies (i.e. almost every platform apart from Windows ;-). I'd hate to see Haskell go in a direction where packaging for e.g. Debian is made more difficult than it is at the moment.
I think that's a reasonable fear. In black moments, I share this fear with the Cabal Cabal. Fortunately, they assure me that they're well aware of this issue and don't intend it to become a problem. As someone in another reply points out, it's a considerable comfort that many of the CC are also package maintainers for other distributions. They know about this stuff :) Jules

Duncan Coutts wrote:
I would like to compare this to the GNOME development platform. It has Gtk+ at its heart, but GNOME releases are not synchronised with Gtk+ releases. The GNOME development platform consists of a collection of standard packages. The collection is released on a time-based schedule, not a feature-based one. It puts a QA stamp on specific versions of its constituent packages that are known to work together. It has a procedure for getting packages included, which includes standards of API design and documentation. There is an infrastructure for maintaining, testing and releasing this platform.
This is a model I think we should consider seriously.
Mmm, I like it... The only question would be "who gets to spend their time doing the QA job?" Perhaps we could automate it at least to some extent? Certainly this seems like a nice conceptual model to follow. We have the compiler, and we have a set of pluggable libraries. Sounds good to me...

Hello brad, Monday, November 19, 2007, 9:25:40 PM, you wrote:
practical projects. the "batteries included" approach does imply choosing preferred solutions when more than one library is available, this can also be difficult. that said, i think haskell would pick up a lot of new coders if it was obvious that the functionality they were looking for came out of the base libs.
it's interesting that i've proposed a rather similar thing to the Haskell' Standard committee. look at http://www.nabble.com/Standard-libraries-t4810359.html -- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com
participants (33)
- Alex Young
- Andrew Coppin
- apfelmus
- Bayley, Alistair
- Bjorn Bringert
- brad clawsie
- Brandon S. Allbery KF8NH
- Bryan O'Sullivan
- Bulat Ziganshin
- David Menendez
- david48
- Derek Elkins
- Don Stewart
- Duncan Coutts
- Henning Thielemann
- Jeremy Shaw
- Johan Tibell
- Jon Harrop
- Jules Bean
- Justin Bailey
- Keith Fahlgren
- Ketil Malde
- Krzysztof Kościuszkiewicz
- Mads Lindstrøm
- Magnus Therning
- Mikhail Gusarov
- Neil Mitchell
- Radosław Grzanka
- Ross Paterson
- Simon Peyton-Jones
- Thomas Hartman
- Thomas Schilling
- Vladimir Zlatanov