ANNOUNCE: Utrecht Haskell Compiler (UHC) -- first release

Utrecht Haskell Compiler -- first release, version 1.0.0
========================================================

The UHC team is happy to announce the first public release of the Utrecht Haskell Compiler (UHC). UHC supports almost all Haskell98 features plus many experimental extensions. The compiler runs on MacOSX, Windows (cygwin), and various Unix flavors.

Features:
* Multiple backends, including a bytecode interpreter backend and a GRIN based, full program analysing backend, both via C.
* Experimental language extensions, some of which have not been implemented before.
* Implementation via attribute grammars and other high-level tools.
* Ease of experimentation with language variants, thanks to an aspect-oriented internal organisation.

Getting started & Download
--------------------------
UHC is available for download as a source distribution via the UHC home page:

  http://www.cs.uu.nl/wiki/UHC

Here you will also find instructions to get started.

Status of the implementation
----------------------------
Like any university project, UHC is very much a work in progress. We feel that it is mature and stable enough to offer to the public, but much work still needs to be done; hence we welcome contributions by others.

UHC grew out of our Haskell compiler project (called Essential Haskell Compiler, or EHC) over the past 5 years. Internally, UHC is organised as a combination of separate aspects, which makes it very suitable to experiment with; it is relatively easy to build compilers for sublanguages, or to generate related tools such as documentation generators, all from the same code base. Extensions to the language can be described separately, and be switched on or off as the need arises.

Warning
-------
Although we think that the compiler is stable enough to compile substantial Haskell programs, we do not yet recommend it for any serious development work in Haskell. We ourselves use GHC as a development platform! We think, however, that it provides a great platform for experimenting with language implementations, language extensions, etc.

Mailing lists
-------------
For UHC users and developers respectively:

  http://mail.cs.uu.nl/mailman/listinfo/uhc-users
  http://mail.cs.uu.nl/mailman/listinfo/uhc-developers

Bug reporting
-------------
Please report bugs at:

  http://code.google.com/p/uhc/issues/list

The UHC Team

--
Atze Dijkstra, Department of Information and Computing Sciences,
Utrecht University, PO Box 80089, 3508 TB Utrecht, Netherlands.
Tel.: +31-30-2534118/1454 | Fax: +31-30-2513971
WWW: http://www.cs.uu.nl/~atze | Email: atze@cs.uu.nl

After running "./configure" on my Intel Mac (running OS 10.5.6 with GHC 6.10), I try to run "make uhc" and get the following:
$ make uhc
src/ruler2/files.mk:34: build/ruler2/files-ag-d-dep.mk: No such file or directory
src/ruler2/files.mk:35: build/ruler2/files-ag-s-dep.mk: No such file or directory
mkdir -p build/shuffle ; \
  --module=CDoc -dr -Psrc/shuffle/ -o build/shuffle/CDoc.hs src/shuffle/CDoc.ag
/bin/sh: --module=CDoc: command not found
make: Failed to remake makefile `build/ruler2/files-ag-s-dep.mk'.
make: Failed to remake makefile `build/ruler2/files-ag-d-dep.mk'.
make EHC_VARIANT=`echo install/101/bin/ehc | sed -n -e 's+install/\([0-9_]*\)/bin/ehc.*+\1+p'` ehc-variant
src/ruler2/files.mk:34: build/ruler2/files-ag-d-dep.mk: No such file or directory
src/ruler2/files.mk:35: build/ruler2/files-ag-s-dep.mk: No such file or directory
mkdir -p build/shuffle ; \
  --module=CDoc -dr -Psrc/shuffle/ -o build/shuffle/CDoc.hs src/shuffle/CDoc.ag
/bin/sh: --module=CDoc: command not found
make[1]: Failed to remake makefile `build/ruler2/files-ag-s-dep.mk'.
make[1]: Failed to remake makefile `build/ruler2/files-ag-d-dep.mk'.
make EHC_VARIANT_RULER_SEL="((101=HS)).(expr.base patexpr.base tyexpr.base decl.base).(e.int e.char e.var e.con e.str p.str)" \
  ehc-variant-dflt
src/ruler2/files.mk:34: build/ruler2/files-ag-d-dep.mk: No such file or directory
src/ruler2/files.mk:35: build/ruler2/files-ag-s-dep.mk: No such file or directory
mkdir -p build/shuffle ; \
  --module=CDoc -dr -Psrc/shuffle/ -o build/shuffle/CDoc.hs src/shuffle/CDoc.ag
/bin/sh: --module=CDoc: command not found
make[2]: Failed to remake makefile `build/ruler2/files-ag-s-dep.mk'.
make[2]: Failed to remake makefile `build/ruler2/files-ag-d-dep.mk'.
make[1]: *** [ehc-variant] Error 2
make: *** [install/101/bin/ehc] Error 2
This is fairly bewildering. Am I the only one seeing errors like this?

Thanks,
Antoine

Also:

$ make --version
GNU Make 3.81
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.
There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
This program built for powerpc-apple-darwin9.0

$ sh --version
GNU bash, version 3.2.17(1)-release (i386-apple-darwin9.0)
Copyright (C) 2005 Free Software Foundation, Inc.

On 18 Apr 2009, at 22:44, Antoine Latter wrote:
After running "./configure" on my Intel Mac (running OS 10.5.6 with GHC 6.10), I try to run "make uhc" and get the following:
[make output snipped]
This is fairly bewildering. Am I the only one seeing errors like this?
This looks like the same error I got – see bug report 1 in the bug database – the configure script reports that you have uuagc even if you don't – cabal install it, reconfigure, and you should be on your way.

Second thing to watch for – it depends on fgl, but this isn't caught by the configure script.

Thanks
Bob
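For anyone else hitting this, a minimal sketch of the workaround being suggested here (whether a per-user install of these packages is enough is exactly what the next few messages discuss):

$ cabal install uuagc fgl   # the attribute grammar tool and the graph library UHC needs
$ ./configure               # re-run configure so it really finds uuagc this time
$ make uhc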

On Sat, Apr 18, 2009 at 4:38 PM, Thomas Davie wrote:
This looks like the same error I got – see bug report 1 in the bug database – the configure script reports that you have uuagc even if you don't – cabal install it, reconfigure, and you should be on your way.
Second thing to watch for – it depends on fgl, but this isn't caught by the configure script.
Apparently a "user" install of uuagc and fgl isn't good enough. Fun to know. Antoine

Antoine Latter wrote:
Apparently a "user" install of uuagc and fgl isn't good enough. Fun to know.
Do you have your $PATH set to include $HOME/.cabal/bin?

--
(c) this sig last receiving data processing entity. Inspect headers for copyright history. All rights reserved. Copying, hiring, renting, performance and/or quoting of this signature prohibited.
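For what it's worth, a quick sketch of that check and the usual fix; the startup file below is an assumption, so use whichever one your shell actually reads:

$ echo $PATH | tr ':' '\n' | grep -x "$HOME/.cabal/bin" || echo "not on PATH"
$ echo 'export PATH=$HOME/.cabal/bin:$PATH' >> ~/.profile   # then start a new shell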

On 19 Apr 2009, at 00:31, Antoine Latter wrote:
Apparently a "user" install of uuagc and fgl isn't good enough. Fun to know.
I've found user installs don't work at all on OS X, various people in #haskell were rather surprised to discover this, so apparently it's not the default behavior on other platforms.

It really rather makes "cabal install" rather odd – because it doesn't actually install anything you can use without providing extra options!

Bob

I've learnt (the hard way) not to trust user installs at all.

Thomas Davie wrote:
I've found user installs don't work at all on OS X, various people in #haskell were rather surprised to discover this, so apparently it's not the default behavior on other platforms.
It really rather makes "cabal install" rather odd – because it doesn't actually install anything you can use without providing extra options!
I'm on OS X and I've always used 'cabal install xyz' without any extra options, installing packages in my home directory instead of globally. I've never had any trouble with it yet. Martijn.

On Sun, 2009-04-19 at 00:41 +0200, Thomas Davie wrote:
Apparently a "user" install of uuagc and fgl isn't good enough. Fun to know.
I've found user installs don't work at all on OS X, various people in #haskell were rather surprised to discover this, so apparently it's not the default behavior on other platforms.
Currently, user installs are the default on all platforms except Windows.
It really rather makes "cabal install" rather odd – because it doesn't actually install anything you can use without providing extra options!
It should work fine, you'll need to give more details. Duncan

On 19 Apr 2009, at 09:52, Duncan Coutts wrote:
On Sun, 2009-04-19 at 00:41 +0200, Thomas Davie wrote:
Apparently a "user" install of uuagc and fgl isn't good enough. Fun to know.
I've found user installs don't work at all on OS X, various people in #haskell were rather surprised to discover this, so apparently it's not the default behavior on other platforms.
Currently, user installs are the default on all platforms except Windows.
It really rather makes "cabal install" rather odd – because it doesn't actually install anything you can use without providing extra options!
It should work fine, you'll need to give more details.
This has been the result, at least every time I've installed ghc:

$ cabal install xyz
$ runhaskell Setup.hs configure   -- where abc depends on xyz
Configuring abc-0.0...
Setup.lhs: At least the following dependencies are missing:
xyz -any
$ sudo cabal install --global xyz
$ runhaskell Setup.hs configure
Configuring abc-0.0...
$ runhaskell Setup.hs build
...

Bob

On Sun, 2009-04-19 at 10:02 +0200, Thomas Davie wrote:
It really rather makes "cabal install" rather odd – because it doesn't actually install anything you can use without providing extra options!
It should work fine, you'll need to give more details.
This has been the result, at least every time I've installed ghc:
$ cabal install xyz
So this does a per-user install.
$ runhaskell Setup.hs configure -- where abc depends on xyz
This does a global install. Global packages cannot depend on user packages. You have two choices:

$ cabal configure

because the cabal program does --user installs by default, or use

$ runhaskell Setup.hs configure --user

which explicitly does a --user install.

The reason for this confusion is because the original runghc Setup interface started with global installs and we can't easily change that default. On the other hand, per-user installs are much more convenient so that's the sensible default for the 'cabal' command line program.

Personally I just always use the 'cabal' program and never use the runghc Setup interface. There's almost never any need to use the runghc Setup interface.

Duncan
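Spelled out against Bob's session above, the two consistent workflows look roughly like this (xyz and abc are just the placeholder names from that session):

# per-user throughout, no sudo needed:
$ cabal install xyz
$ runhaskell Setup.hs configure --user   # or simply: cabal configure
$ runhaskell Setup.hs build

# or global throughout:
$ sudo cabal install --global xyz
$ runhaskell Setup.hs configure
$ runhaskell Setup.hs build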

On 19 Apr 2009, at 11:10, Duncan Coutts wrote:
The reason for this confusion is because the original runghc Setup interface started with global installs and we can't easily change that default. On the other hand, per-user installs are much more convenient so that's the sensible default for the 'cabal' command line program.
I don't understand what makes user installs more convenient. Certainly, my preference would be for global all the time – I expect something that says it's going to "install" something to install it onto my computer, like any other installation program does. What is it that makes user installs more convenient in this situation? Bob

tom.davie:
I don't understand what makes user installs more convenient. Certainly, my preference would be for global all the time – I expect something that says it's going to "install" something to install it onto my computer, like any other installation program does. What is it that makes user installs more convenient in this situation?
You don't need 'sudo' access for user installs. This means that 'cabal install' works out of the box on every system, without needing admin/root privs (esp. important for students). -- Don

I don't understand what makes user installs more convenient. Certainly, my preference would be for global all the time – I expect something that says it's going to "install" something to install it onto my computer, like any other installation program does. What is it that makes user installs more convenient in this situation?
You don't need 'sudo' access for user installs. This means that 'cabal install' works out of the box on every system, without needing admin/root privs (esp. important for students).
But students will be used to needing to configure this – in every other installation system out there they need to tell it to install in their user directory. Personally, I find it more convenient to have the "install" program do what it says it does! Install it! This would save confusion about old tools that do things globally, and it wouldn't confuse students, because they're all already used to giving "make install" extra flags so that it doesn't install things system-wide.

Bob

Don Stewart
This means that 'cabal install' works out of the box on every system, without needing admin/root privs (esp. important for students).
...and people who were bitten by sanity and thus never, ever touch /usr manually, only through their distribution's package manager.

Then there are those who work in an environment with network-mounted home directories, and even if the admins judged you responsible and skillful enough to award you local root rights, you still want your working environment to be available on any machine you log on to.

In short: if you don't need the package available for _every_ user, or your distribution comes with a pre-packaged version, don't even think of installing it globally: it's an abomination to UNIX. Deal with it; OSX users aren't running a graphic shell on steroids any more, and, yes, even Windoze users stopped running on top of that hacked-up program loader named DOS.

Achim Schneider wrote:
Don Stewart wrote:
This means that 'cabal install' works out of the box on every system, without needing admin/root privs (esp. important for students).
...and people who were bitten by sanity and thus never, ever touch /usr manually, only through their distribution's package manager.
This is good advice (/usr/local is fine though). However, the point here is surely that the de-facto default for all other downloaded programs - standard makefile setups, automake, autoconf, perl packages, python packages, graphic installers like firefox - is to do what cabal calls a 'global' install by default. This makes cabal's inversion of default a violation of least surprise, however easy it may be to justify that user installs are better.

On 20 Apr 2009, at 10:27 pm, Jules Bean wrote:
This is good advice (/usr/local is fine though).
Actually, no, it isn't. To start with, these days it's chock full of stuff which is hardly less critical for system operation than anything you'll find in /bin. More importantly, it's not a place that many people *can* install stuff in. This University, for example, has a strict policy that NO-ONE (not even heads of department) other than designated system administrators may have 'administrator' access, which is required to install stuff in /usr/local. Either stuff can be installed easily in my own directories (which is why I have ~/local on all my machines), or I have to ask the system administrators to install it when they can get around to it (I've been waiting a couple of months already for one thing), or it doesn't get installed. As long as there are Windows machines in this organisation, that policy is _not_ going to change, not even for Linux or MacOS or Solaris.
However, the point here is surely that the de-facto default for all other downloaded programs - standard makefile setups, automake, autoconf, perl packages, python packages, graphic installers like firefox - is to do what cabal calls a 'global' install by default.
The assumption here seems to be that everyone owns their own machine or has a system administrator with large amounts of free time on their hands. Just because a lot of other people are doing something crazy doesn't mean we have to copy them.
This makes cabal's inversion of default a violation of least surprise, however easy it may be to justify that user installs are better.
Sometimes it is _nice_ to be surprised. If someone out of the blue gave you a free car, would you turn it down because it violated the principle of least surprise? "Least surprise" is not as important as "do the right thing". On Macs and Windows boxes, I expect installers to _ask_ me where I want to put things.

In any case, there are two easy ways to address the issue.
1. Have the installation tools *ask* the user whether they want the software installed in their own files or in /usr/local.
2. If you want to follow the blundering herd as much as you can, have the installation process start by trying to "install" an empty file in /usr/local, and if it can't do that, do the right thing instead.

Richard O'Keefe wrote:
On 20 Apr 2009, at 10:27 pm, Jules Bean wrote:
However, the point here is surely that the de-facto default for all other downloaded programs - standard makefile setups, automake, autoconf, perl packages, python packages, graphic installers like firefox - is to do what cabal calls a 'global' install by default.
The assumption here seems to be that everyone owns their own machine or has a system administrator with large amounts of free time on their hands. Just because a lot of other people are doing something crazy doesn't mean we have to copy them.
The assumption is that you are running a multipurpose computer, i.e. your Haskell compiler isn't the only application you care about and you don't want it or other applications to interact badly together. The best way to do this is to have a package maintained by the distribution, and not have users or even sysadmins directly install it from source. Using non-standard installation methods makes it harder for package maintainers to package the application and suggests you haven't taken any care / don't care about making global installation safe. Edward

Edward Middleton
The assumption is that you are running a multipurpose computer, i.e. your Haskell compiler isn't the only application you care about and you don't want it or other applications to interact badly together. The best way to do this is to have a package maintained by the distribution, and not have users or even sysadmins directly install it from source. Using non-standard installation methods makes it harder for package maintainers to package the application and suggests you haven't taken any care / don't care about making global installation safe.
I second that.

On 21 Apr 2009, at 2:10 pm, Edward Middleton wrote:
Richard O'Keefe wrote:
The assumption here seems to be that everyone owns their own machine or has a system administrator with large amounts of free time on their hands. Just because a lot of other people are doing something crazy doesn't mean we have to copy them.
The assumption is that you are running a multipurpose computer, i.e. your Haskell compiler isn't the only application you care about and you don't want it or other applications to interact badly together.
That cannot be right. By that definition, I *do* run a multipurpose computer, in fact several of them. If I _weren't_ running multipurpose computers, I wouldn't be wanting GHC on them. On none of them is it possible for me to install in /usr/local. On no computer that is ever attached to the University's network is it possible for me or any other staff member (except the designated sysadmins) to do so, let alone students.
The best way to do this is to have a package maintained by the distribution, and not have users or even sysadmins directly install it from source.
Whether or not one installs from source and whether one installs in a system location or a user location are INDEPENDENT questions. Quintus, for example, provided binary releases that customers could install wherever they wanted. We were doing that 25 years ago. Installing-wherever-the-user-wants WITHOUT source has been doable for a long time, and is what I _expect_ in a Windows or Mac program.

There are dishonourable exceptions. Recently I bought a French drill program for my elder daughter. At home I use a dual-boot (MacOS/Windows Vista) laptop provided by the department. On the Windows side, it asked me where I wanted it to go, and went there, no worries. On the MacOS side,
- I was logged in as XXX
- in order to install, I had to enter my "junior sysadmin" (adminXXX) credentials -- I don't think I was supposed to be given this much power, but I'd given the sysadmins such a long shopping list of applications I wanted on this "multipurpose computer" that they preferred me to do most of it
- the programs were actually _installed_ as owned by 'admin' with the result that I cannot run them, or change their permissions so that I can run them, or even remove them.

You can imagine how happy that kind of nonsense makes me. If they could get it right on Windows, how come they got it so egregiously wrong on MacOS? By assuming that everyone can become the ultimate superuser at will.
Using non-standard installation methods makes it harder for package maintainers to package the application and suggests you haven't taken any care / don't care about making global installation safe.
I'm sorry, there is no such animal as "safe" global installation, in the sense of "download this one package and do what it says." ghc 6.8.3 is /usr/bin/ghc on my office Mac, but nothing in the world prevents there being some other program called ghc that would also like to be there. Only by painstaking verification of a whole bunch of applications together can one be confident of "safety".

My $PATH has to be maintained with some delicacy, as more than 1200 command names appear in more than two directories on it. There seem, for example, to be 4 programs called 'python'; two of them are links to 2.4, one of them is 2.3, and I can't tell what the other is. And don't go blaming me for this state of affairs, because they are _all_ system directories. Oh, and none of the directories holding python is any part of /usr/local.

Since the version of gcc in /usr/local is 2.95.2, if I want a less antique version of gcc, /usr/local/bin isn't much use in my $PATH. Put ghc there, and what makes you think I will be able to run it? (The version of Amaya there is also hopelessly out of date, but there is nothing I can do about it.)

Package maintainers _should_ find it _easier_ to provide reliably installable packages if they _don't_ take it for granted that they can put things in questionably "standard" places.

Richard O'Keefe wrote:
On 21 Apr 2009, at 2:10 pm, Edward Middleton wrote:
Using non-standard installation methods makes it harder for package maintainers to package the application and suggests you haven't taken any care / don't care about making global installation safe.
I'm sorry, there is no such animal as "safe" global installation, in the sense of "download this one package and do what it says."
Well, I have been doing that for more than 10 years; it is one of the functions any decent package management system is designed to do.
ghc 6.8.3 is /usr/bin/ghc on my office Mac, but nothing in the world prevents there being some other program called ghc that would also like to be there. Only by painstaking verification of a whole bunch of applications together can one be confident of "safety".
Well then I guess we agree, so the question becomes who should do the painstaking verification. I think distribution maintainers should do this, you think end users who can't compile source packages should do this. Edward

I'm sorry, there is no such animal as "safe" global installation, in the sense of "download this one package and do what it says."
Well, I have been doing that for more than 10 years; it is one of the functions any decent package management system is designed to do.
Time to adopt another goodie from Utrecht? http://nixos.org/ Cheers, Stefan

Edward Middleton
ghc 6.8.3 is /usr/bin/ghc on my office Mac, but nothing in the world prevents there being some other program called ghc that would also like to be there. Only by painstaking verification of a whole bunch of applications together can one be confident of "safety".
Well then I guess we agree, so the question becomes who should do the painstaking verification. I think distribution maintainers should do this, you think end users who can't compile source packages should do this.
Not the maintainers, but the tool. Portage doesn't install stuff if it would overwrite other things, records changes to files in e.g. /etc to be merged later (interactively, with diffs), and records every file it ever installed by having the package install itself in /var/portage/<package>/<version>. You are _completely_busted_ if your install script doesn't support that: The script runs sandboxed.

Portage even registers every installed package into an empty ghc package database, and merges them later. It knows what it does.

I can switch between different versions of packages, or different implementations of the same functionality (say, java-sun vs. java-blackdown) with eselect.

In short: Don't write your own install scripts, you're bound to get it wrong, and/or be vastly inferior, compared to portage.

Achim Schneider wrote:
In short: Don't write your own install scripts, you're bound to get it wrong, and/or be vastly inferior, compared to portage.
But who writes the ebuild[1]? That said, on the various systems I run I have over 100 custom ebuilds that I maintain. I can do this because most applications have standard sane build systems that install things in the regular places.

Edward

1. http://en.wikipedia.org/wiki/Ebuild

Edward Middleton
But who writes the ebuild[1] ? That said, on the various system I run I have over 100 custom ebuilds that I maintain. I can do this because most applications have standard sane build systems that install things in the regular places.
Yes, usually it's just a matter of writing "inherit distutils" and figuring out (still missing) dependencies. While I utterly loathe autoconf, it has its merit... being widely used.
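For comparison, a hypothetical minimal ebuild for a Haskell library; the package name and metadata are made up, and the haskell-cabal eclass and CABAL_FEATURES values are assumptions about the gentoo-haskell overlay, so treat this as a sketch rather than a working recipe:

# dev-haskell/example-0.1.ebuild -- hypothetical sketch, not a real package
CABAL_FEATURES="lib profile haddock"   # assumed eclass knob
inherit haskell-cabal

DESCRIPTION="Example Haskell library built from its .cabal description"
HOMEPAGE="http://hackage.haskell.org/package/example"
SRC_URI="http://hackage.haskell.org/packages/archive/${PN}/${PV}/${P}.tar.gz"

LICENSE="BSD"
SLOT="0"
KEYWORDS="~amd64 ~x86"

DEPEND=">=dev-lang/ghc-6.8"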

Maybe it has gone unnoticed, but the main reason we made the compiler available was to make it possible for others to experiment with its type extensions and its GRIN based back-end, and to show the advantages (and disadvantages?) of generating a large part of the compiler from attribute grammar based descriptions. If we had been interested in raising fierce discussions about n+k patterns or how and where cabal installs things, we could have easily achieved the same effect with much less effort.

Doaitse Swierstra

Hello S., Tuesday, April 21, 2009, 5:42:15 PM, you wrote:
If we had been interested in raising fierce discussions about n+k patterns or how and where cabal installs things, we could have easily achieved the same effect with much less effort.
you mean that we should shoot up? :)

--
Best regards,
Bulat                          mailto:Bulat.Ziganshin@gmail.com

If we had been interested in raising fierce discussions about n+k patterns or how and where cabal installs things, we could have easily achieved the same effect with much less effort.
you mean that we should shoot up? :)
If the release of UHC contributes to whatever discussion regarding Haskell, that's of course, in its own right, a Good Thing---as long as the discussion turns out to be a fruitful one and doesn't end up in a religious war. (I'm by no means claiming that the current ones on (n + k)-patterns and cabal are!)

However, that's of course not our main motivation for releasing UHC. We ourselves find the EHC infrastructure very useful for experimentation with type systems and back ends. Owing a great deal to the community, we hope that by releasing the infrastructure in the form of a not-so-mature but at least maturing Haskell compiler we can give something back to the community and have it profit from the contained technology. Therefore, it would be a shame if UHC was only to be associated with debating language features and build systems; I'm confident, however, that it won't.

Cheers,
Stefan

I think the only way your release is going to get significant feedback
is when it's ready to compile substantial existing Haskell programs
unaltered.
I might try UHC on some toy example for a few minutes, but if it falls
over when I give it code that I've already written I'll soon give up
using it.
I don't know what state UHC is in right now, because it doesn't
install with cabal and I'm too lazy to install by hand.
I think the bar for entry into the Haskell compiler market is pretty
high these days. Not only do you need to support (98% of) Haskell 98,
but you also need to support the most commonly used extensions.
Still, I think it's great to see another Haskell compiler!
-- Lennart

On 21 Apr 2009, at 8:20 pm, Edward Middleton wrote:
ghc 6.8.3 is /usr/bin/ghc on my office Mac, but nothing in the world prevents there being some other program called ghc that would also like to be there. Only by painstaking verification of a whole bunch of applications together can one be confident of "safety".
Well then I guess we agree, so the question becomes who should do the painstaking verification. I think distribution maintainers should do this, you think end users who can't compile source packages should do this.
You cannot be serious. Come ON people, let's have some honest argument here.

1. There are people who CAN'T install stuff in system areas. There seems to be no reason why they should not use Haskell.
2. The claimed advantage of putting things in "a standard place" is unreal. Or rather, it's real to the extent that you can assume that all the world is Linux.
3. There is no guaranteed safe *SYSTEM* place to put things.
4. A user can, however, readily verify that THEY don't have or don't use another program by the same name, regardless of what some other user on the same machine might have or use.
5. If something installs in user space, the user can get rid of it, or if not, at least only that user is harmed.

My departmental laptop has just had to be wiped and reinstalled because a commercial program that installed things in a "standard" place stuffed up.

It is absurd to allege that I think end users who cannot compile source packages should verify a collection of oodles of packages. I don't, and I never wrote anything that should lead anyone to suppose so. I'm not sure _anyone_ can.

Installing stuff in system areas, whether it is "standard" or "default" to do so or not, is DANGEROUS. Installing stuff in user areas is LESS dangerous. All installs are dangerous. I don't say user installs are safe. Just that they are LESS dangerous. In particular, a user install will not stuff things up for all the users on a machine.

There are four people in my immediate family. We all have accounts on the machines I own. Even though I own those machines and DO have superuser access, I DON'T want to risk stuffing up the entire system. Installing stuff under my own account gives the others some protection.

Richard O'Keefe wrote:
However, the point here is surely that the de-facto default for all other downloaded programs - standard makefile setups, automake, autoconf, perl packages, python packages, graphic installers like firefox - is to do what cabal calls a 'global' install by default. This makes cabal's inversion of default a violation of least surprise, however easy it may be to justify that user installs are better.
The assumption here seems to be that everyone owns their own machine or has a system administrator with large amounts of free time on their hands. Just because a lot of other people are doing something crazy doesn't mean we have to copy them.
No. It's not an assumption, it's a default. No one is assuming anything. Both options are available. The point I was making, which is scarcely important enough to bother explaining again, is that having the same *default* as other software is a virtue. In point of fact, I'm sure that a larger proportion of haskell users have their own machine than don't. Jules

On 21 Apr 2009, at 4:52 pm, Jules Bean wrote:
The point I was making, which is scarcely important enough to bother explaining again, is that having the same *default* as other software is a virtue.
That point is mistaken. I have no idea how many people are unable to use that default, but there are lots of people at this University in my situation. A default which means we can't install stuff is a default that I cannot regard with happiness, and which I cannot comprehend anyone contemplating with complacency. One of the suggestions I have made is that an installer could/should investigate whether it *can* install in the "standard" place (since ghc on my Mac is in /bin, not /usr/local/bin, and since I certainly didn't put it there, I wonder just how standard a standard place is), and if it *can't*, instead of failing miserably, it should *out of the box* *without recompiling from sources* let the user put it wherever it needs to go. This is compatible with the "default" on all systems where the default would actually _work_, while being _useful_ on systems where it wouldn't. It shouldn't be necessary to point out that the people least able to install in /usr/local are by and large going to be the people least able to build from sources, so saying "build from sources if you can't install in a standard place" would not be user-friendly.
In point of fact, I'm sure that a larger proportion of haskell users have their own machine than don't.
That's the wrong question. In addition to the several machines in my office, and the departmental laptop, I _do_ have several machines of my own. But the mere fact of possessing my own machines does NOT mean that I am able to install stuff on the machines I spend most of my time using. Some of the right questions are - how many potential <whatever> users would need to have <whatever> installed on _some_ machine they do NOT have administrator access to? - if people find Mac and Windows installers that show you where something is going to be put and offer you the chance to change it acceptable, why exactly would that be unacceptable under Linux or Solaris? - since we know install-anywhere binary releases are possible, and since we know that an installer _can_ probe to see whether installation in /usr/local (or any other "standard" place) is possible, why not do it?

"Richard O'Keefe"
Some of the right questions are - how many potential <whatever> users would need to have <whatever> installed on _some_ machine they do NOT have administrator access to?
Irrelevant.
- if people find Mac and Windows installers that show you where something is going to be put and offer you the chance to change it acceptable, why exactly would that be unacceptable under Linux or Solaris?
It's perfectly acceptable, even required, but, for the love of UNIX, take that path as a parameter, don't do a GUI. If you want a GUI, write it in terms of that script.
- since we know install-anywhere binary releases are possible, and since we know that an installer _can_ probe to see whether installation in /usr/local (or any other "standard" place) is possible, why not do it?
I really, really don't like the idea of a program behaving differently based on the permissions it has, short of failing to do what I told it to do. OTOH, quickly checking whether the user has write permissions to / and failing with "you need root right to do that, did you mean to call this script with --user?" instead of failing with access denied errors is a Good Thing.[1]

Echoing "binaries were installed in $HOME/.cabal/bin", and checking the user's $PATH and displaying a warning if that directory isn't in it is a Good Thing, too. I guess it's also the main problem those not literate in UNIX have with cabal.

[1] Does install --user check whether configure was called with --user, too? I hope so...
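A rough sketch of that kind of pre-flight check as a plain shell script; the prefix, wording and file locations are illustrative and this is not a description of cabal's actual behaviour:

#!/bin/sh
# Sketch 1: fail early with a helpful message instead of an "access denied" error.
PREFIX=/usr/local
if [ ! -w "$PREFIX" ]; then
    echo "you need root rights to write to $PREFIX;" >&2
    echo "did you mean to call this script with --user?" >&2
    exit 1
fi

# Sketch 2: after a per-user install, say where things went and check $PATH.
BINDIR=$HOME/.cabal/bin
echo "binaries were installed in $BINDIR"
case ":$PATH:" in
    *":$BINDIR:"*) ;;                                   # already on PATH, fine
    *) echo "warning: $BINDIR is not in your PATH" >&2 ;;
esac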

On 21 Apr 2009, at 11:36 pm, Achim Schneider wrote:
"Richard O'Keefe"
wrote: Some of the right questions are - how many potential <whatever> users would need to have <whatever> installed on _some_ machine they do NOT have administrator access to?
Irrelevant.
How can the question that is the very heart of this thread be "irrelevant"? This is precisely the situation I'm in, and it's precisely the class of users I'm arguing for. I'm encouraged by the constructive suggestions of package tools (nix, portage) that are said to address some of these issues. Except of course that I have to install them first...
OTOH, quickly checking whether the user has write permissions to / and failing with "you need root right to do that, did you mean to call this script with --user?" instead of failing with access denied errors is a Good Thing.[1]
I think we can agree on this.
Echoing "binaries were installed in $HOME/.cabal/bin", and checking the user's $PATH and displaying a warning if that directory isn't in it is a Good Thing, too. I guess it's also the main problem those not literate in UNIX have with cabal.
I think we can agree on this too.

"Richard O'Keefe"
On 21 Apr 2009, at 11:36 pm, Achim Schneider wrote:
"Richard O'Keefe"
wrote: Some of the right questions are - how many potential <whatever> users would need to have <whatever> installed on _some_ machine they do NOT have administrator access to?
Irrelevant.
How can the question that is the very heart of this thread be "irrelevant"?
This is precisely the situation I'm in, and it's precisely the class of users I'm arguing for.
I'm encouraged by the constructive suggestions of package tools (nix, portage) that are said to address some of these issues. Except of course that I have to install them first...
It's irrelevant, because I _do_ have root access to my machine, but don't want to get forced into using it by a question that implies that if you have access, you're going to use it. I didn't mean to nitpick, though; I thought you were arguing for the other side...

I think the right question is "how many people prefer user installs over system installs, wrt. their hackage packages?". I estimate that, concerning developers, who are used to installing still-buggy, self-written libraries, as well as installing things while working, the percentage is very, very high: at least I don't want my workflow to be broken to deal with the formal requirements of a global install while developing, and I guess many others feel the same way.[1]

Endusers, of course, might have other preferences, but cabal doesn't (IMHO) cater to them, directly: It caters to distribution packages (or windows installers, or whatever), so cabal's default behaviour is quite irrelevant for those cases.

[1] Thinking of it... is there a way to tell cabal to pretend a package is installed by giving the path to its source directory? Just like include directories, but with packages.

It's irrelevant, because I _do_ have root access to my machine,
How nice to be you. Since the argument is entirely about people who _don't_, your point is? It is clear that the only sensible default is no default. Someone else has said it recently and said it much better.
I think the right question is "how many people prefer user installs over system installs, wrt. their hackage packages?".
No, because the costs are asymmetric.
Endusers, of course, might have other preferences, but cabal doesn't (IMHO) cater to them, directly: It caters to distribution packages (or windows installers, or whatever), so cabal's default behaviour is quite irrelevant for those cases.
The clear impression I've received on this mailing list is that cabal is _also_ for people who are using Haskell and find that there's a new package reported in HWN that they'd like to have. If you are now telling me that I should ignore it because I'm not making distribution packages or Windows installers, fine, just tell me what to use instead.

Richard O'Keefe wrote:
Endusers, of course, might have other preferences, but cabal doesn't (IMHO) cater to them, directly: It caters to distribution packages (or windows installers, or whatever), so cabal's default behaviour is quite irrelevant for those cases.
The clear impression I've received on this mailing list is that cabal is _also_ for people who are using Haskell and find that there's a new package reported in HWN that they'd like to have. If you are now telling me that I should ignore it because I'm not making distribution packages or Windows installers, fine, just tell me what to use instead.
distribution packages. Edward

On 23 Apr 2009, at 5:17 pm, Edward Middleton wrote:
Richard O'Keefe wrote:
The clear impression I've received on this mailing list is that cabal is _also_ for people who are using Haskell and find that there's a new package reported in HWN that they'd like to have. If you are now telling me that I should ignore it because I'm not making distribution packages or Windows installers, fine, just tell me what to use instead.
distribution packages.
OK, please tell me where I can get a distribution package with GHC 6.10 and a *full* set of what's in Hackage for
- Intel Mac OS X 10.5.5
- SPARC (Ultra II) Solaris 2.10
that I can use without superuser access.

Oh look, http://haskell.org/ghc/distribution_packages.html
Mac OS X is there, but not for 10.5. It requires MacPorts, which I've had no luck with. Solaris isn't there at all.

At http://haskell.org/ghc/download_ghc_6_10_2.html we find that the .tar.bz2 binary for Mac OS X 10.5 "needs libedit.2.dylib, libncurses.5.dylib and libgmp.3.dylib under /opt/local/lib/." Sadly, my /opt/local/lib is empty, and there is nothing I can do about it. (libncurses.5.dylib and libedit.2.dylib are in /usr/lib, which is apparently the place that Apple put them. Why expect to find them in /opt/local/lib? Unpack, try it, nope, it only complains about libgmp.) I'm glad it lets me install ghc wherever I want, if only it let me install the prerequisites the same way.

I have a colleague who argues against free software every chance he gets. The high cost of installing and maintaining it is one of his main arguments.
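As an aside, on MacOS you can at least see up front which dynamic libraries a binary expects before running it; a small sketch, where the path to the unpacked GHC binary is purely illustrative:

$ otool -L ./ghc-6.10.2/ghc   # lists the dylibs (and the paths) the binary was linked against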

[Moved from the UHC thread – let's stop treading on those guys' toes, they did something very very shiny] On 23 Apr 2009, at 07:02, Richard O'Keefe wrote:
It's irrelevant, because I _do_ have root access to my machine,
How nice to be you. Since the argument is entirely about people who _don't_, what is your point?
His point is that that kind of person is not the only kind of person, so to base an argument on what they want is as weak as basing an argument on what he wants.
It is clear that the only sensible default is no default.
That sounds pretty sensible to me too – much like darcs asks what your email address is the first time you work on a repository, cabal should probably ask the first time you run it "do you prefer global or user installs?"
I think the right question is "how many people prefer user installs over system installs, wrt. their hackage packages?".
No, because the costs are asymmetric.
I think this is a case of not seeing the costs to the other users because you're firmly entrenched in your camp. I would have said originally that the costs are asymmetric too – but that it's a much greater cost for the people who expect all installers to do global installs. So I think that the question asked there is a very valid one. However, I do like the solution of not giving any default. Bob

On 23 Apr 2009, at 7:39 pm, Thomas Davie wrote:
His point is that that kind of person is not the only kind of person, so to base an argument on what they want is as weak as basing an argument on what he wants.
But that is PRECISELY what I am arguing. I'm arguing that people-who-have-superuser-access are not the only kind of person, so that basing decisions on what is convenient for them ONLY is wrong. I have never thought, stated, or implied, that only people without superuser access count! It's just that I, and for that matter the sysadmin I talk to most, are heartily fed up with the assumption that everyone is a sysadmin. I note that someone mentioned the Nix package manager in this thread. Reading the comments on that article opened my eyes: /usr/local is far less of a "standard" installation directory than I had ever supposed. So it is clear that people who DO have superuser access and DO want to install stuff in "system" areas will quite often need to put it elsewhere than /usr/local. (Case in point: the GHC 6.10.2 binary release for MacOS X expecting to find stuff installed in /opt/local/lib.)
No, because the costs are asymmetric.
I think this is a case of not seeing the costs to the other users because you're firmly entrenched in your camp.
You are mistaken. The cost to users who expect global installs is *tiny*: run cabal without ever saying where you want things to go, and you get a *one time* prompt telling you that you have to set that up. The cool thing about this is that users who DO want global installs but want them in /opt/local or /opt/GHC instead of /usr/local pay the same one-time price, and users who want an installation in directories they control pay the same one-time setup cost. That's the cost I would impose. It's tiny. The present system imposes very high costs on people who cannot do 'global' installs at all or who want /opt/something instead of /usr/local. By the way, the term 'global' is unfortunate. It is possible for a site to set up a "Haskell administrator" who has full access to a publicly readable set of directories but who neither has nor needs any kind of superuser access. That's pretty much the setup I have on my SPARC: there's a /users/local directory that I have complete control over, but it's visible to anyone who wants it. By me, that's as "global" as anyone needs, but it's NOT in system space.
I would have said originally that the costs are asymmetric too – but that it's a much greater cost for the people who expect all installers to do global installs.
Can you please explain this a little more? Why is being prompted for a location "a much greater cost" than (for example) not being able to install at all, or having to rebuild from sources? While we're at it, I finally tracked down why ghc 6.6 didn't work on my SPARC. At least I think I did. There is a difference between "there is a version of gcc that I can see and run and compile simple test files with" and "there is a version of gcc that understands all the version X.Y command line arguments".

"Richard O'Keefe"
I have never thought, stated, or implied, that only people without superuser access count! It's just that I, and for that matter the sysadmin I talk to most, are heartily fed up with the assumption that everyone is a sysadmin.
Yes, but I thought you implied exactly that, because I thought you were asking "Are the ten people not having root access even worth the bother?". I didn't _mean_ to imply it, either, I _assumed_ you did, because, frankly, I would have written the same question in a way that wouldn't have made me make that assumption. Not looking at your name before replying did the rest. Truth is a three-edged sword, and misunderstandings are great fun. -- (c) this sig last receiving data processing entity. Inspect headers for copyright history. All rights reserved. Copying, hiring, renting, performance and/or quoting of this signature prohibited.

For what it's worth, it's bothered me often enough that cabal doesn't install globally by default that I had to reinstall ghc in order to solve package issues. So I'd prefer the default to be global. But I don't care that much, and I don't think arguing that point is leading anywhere.

On Tue, 2009-04-21 at 12:31 +0200, david48 wrote:
For what it's worth, it's bothered me often enough that cabal doesn't install globally by default that I had to reinstall ghc in order to solve package issues.
Do you know what the problem was exactly? It's possible to get problems with overlap between the user and global package dbs, but the exact same problems can also happen just within the global package db.
So I'd prefer the default to be global.
You can make that the default in the config file. Duncan
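For reference, a minimal sketch of the two ways to express that preference. The per-command flag is cabal install --global; the config-file field shown below is the cabal-install spelling of the same default, so treat the exact field name as an assumption if your version differs:
-- in ~/.cabal/config: make global installs the default
user-install: False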

On Wed, Apr 22, 2009 at 12:32 AM, Duncan Coutts wrote:
On Tue, 2009-04-21 at 12:31 +0200, david48 wrote:
For what it's worth, it's bothered me often enough that cabal doesn't install globally by default that I had to reinstall ghc in order to solve package issues.
Do you know what the problem was exactly? It's possible to get problems with overlap between the user and global package dbs, but the exact same problems can also happen just within the global package db.
One problem I had was while installing Leksah. (It was you who pointed me to the solution, thanks.) The last problem was installing wxhaskell from source. The first part compiled just fine, but the second wouldn't, with a package problem. I didn't want to bother searching for what the problem was, so I thought it was faster to reinstall ghc and then compile wxhaskell. (It worked.) Also, I think it's been a while since I managed to do a cabal upgrade which didn't stop on a dependency issue.
Since I didn't write down the exact problems I had, I'm attempting a fresh install, and I'll write down what happens as I go.
1) Installing GHC 6.10.2 from the tarball; I decided to give ./configure --prefix="/home/david/local" a try.
2) Adding /home/david/local to my PATH.
3) I find a binary for cabal-install 0.6.0.
4) cabal update
5) cabal install cabal-install
It proceeds to download and compile HTTP-4000.0.6, then zlib-0.5.0.0, which fails because I don't have zlib.h on this new system.
david@pcdavid2:~$ sudo apt-get install
... well, there is no zlib-dev or libzlib-dev available on Jaunty. There is a zlib1-dev, which fails to install, and a zlib1g-dev, which works.
david@pcdavid2:~$ cabal-0.6.0 install cabal-install again.
This time zlib-0.5.0.0 compiles, but then:
/usr/bin/ld: cannot find -lgmp
david@pcdavid2:~$ sudo apt-get install libgmp3-dev
david@pcdavid2:~$ download/cabal-0.6.0 install cabal-install againagain.
This time all goes well except that:
Installing executable(s) in /home/david/.cabal/bin
Why the hell would cabal install binaries in a subdirectory of a hidden directory? Why not /home/david/bin or /home/david/local/bin?
OK, so I find out the setting to change in .cabal/config, but there are already two packages installed and downloaded there, and I don't know how to change them to the correct location.
So, deleting .cabal and local, and reinstalling ghc. I kept the de-tarred directory around, so it's really quick.
cabal update again; make sure the config has the right path.
Oops. Cabal thinks zlib is still around; I thought I had deleted that. Looks like reinstalling ghc didn't rewrite my package list. Removing .ghc and trying again.
Now something else.
david@pcdavid2:~$ ghc-pkg check
There are problems in package rts-1.0:
include-dirs: PAPI_INCLUDE_DIR doesn't exist or isn't a directory
The following packages are broken, either because they have a problem listed above, or because they depend on a broken package.
rts-1.0
haddock-2.4.2
ghc-prim-0.1.0.0
integer-0.1.0.1
base-4.1.0.0
...
Solution:
david@pcdavid2:~$ ghc-pkg describe rts | sed 's/PAPI_INCLUDE_DIR//' | ghc-pkg update -
david@pcdavid2:~$ download/cabal-0.6.0 install cabal-install
Linking dist/build/cabal/cabal ...
Installing executable(s) in /home/david/.cabal/bin
WTF?
david@pcdavid2:~$ vi .cabal/config
install-dirs user
-- prefix: /home/david/local
-- bindir: $prefix/bin
-- libdir: $prefix/lib
I give up for now.

On Wed, 2009-04-22 at 12:21 +0200, david48 wrote:
Do you know what the problem was exactly? It's possible to get problems with overlap between the user and global package dbs, but the exact same problems can also happen just within the global package db.
One problem I had was while installing Leksah. (It was you who pointed me to the solution, thanks.) The last problem was installing wxhaskell from source. The first part compiled just fine, but the second wouldn't, with a package problem. I didn't want to bother searching for what the problem was, so I thought it was faster to reinstall ghc and then compile wxhaskell. (It worked.) Also, I think it's been a while since I managed to do a cabal upgrade which didn't stop on a dependency issue.
Since I didn't write down the exact problems I had, I'm attempting a fresh install, and I'll write down what happens as I go.
1) Installing GHC 6.10.2 from the tarball; I decided to give ./configure --prefix="/home/david/local" a try.
2) Adding /home/david/local to my PATH.
3) I find a binary for cabal-install 0.6.0.
4) cabal update
5) cabal install cabal-install
Proceeds to download and compile HTTP-4000.0.6, then zlib-0.5.0.0 which fails because I don't have zlib.h on this new system.
david@pcdavid2:~$ sudo apt-get install ... well, there is no zlib-dev or libzlib-dev available on Jaunty. There is a zlib1-dev, which fails to install, and a zlib1g-dev, which works.
david@pcdavid2:~$ cabal-0.6.0 install cabal-install again. This time zlib-0.5.0.0 compiles, but then : /usr/bin/ld: cannot find -lgmp
The ghc installer should really check for this at install time rather than waiting for it to fail the first time you compile something.
david@pcdavid2:~$ sudo apt-get install libgmp3-dev
david@pcdavid2:~$ download/cabal-0.6.0 install cabal-install againagain.
This time all goes well except that: Installing executable(s) in /home/david/.cabal/bin Why the hell would cabal install binaries in a subdirectory of a hidden directory? Why not /home/david/bin or /home/david/local/bin?
Yes, this is clearly suboptimal but getting agreement on where to put it has not proved easy. There are users that will scream and shout if we install to $HOME/bin by default. Please add your thoughts on the best default behaviour to this ticket: http://hackage.haskell.org/trac/hackage/ticket/289
In my opinion, what we should do is something like:
* By default use symlink-bindir: ~/bin if that directory is on the path (creating it if necessary -- but only if the dir was already on the $PATH). In this case we would inform users that's what we've done and about the location of the config file if they want to change it.
* If the ~/bin directory is not on the $PATH then we should give a warning that binaries will be installed in ~/.cabal/bin and that the user should either put that on the $PATH or should change the config file to specify a symlink-bindir directory that is on the $PATH.
The symlink-bindir feature is "safe" in the sense that we never overwrite files that are not already symlinks to the location where the actual binaries are installed (eg usually ~/.cabal/bin). So in particular we never overwrite any actual binaries you installed there yourself.
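For concreteness, a minimal sketch of what that proposal amounts to in ~/.cabal/config (the path is illustrative; symlink-bindir is the existing cabal-install option being described):
-- executables still install under ~/.cabal/bin, but get symlinked into a directory on $PATH
symlink-bindir: /home/david/bin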
david@pcdavid2:~$ ghc-pkg check There are problems in package rts-1.0: include-dirs: PAPI_INCLUDE_DIR doesn't exist or isn't a directory
That's a known bug in ghc-6.10.2 sadly. It means for the 6.10.2 release that ghc-pkg check is not helpful (unless you fix it the way you did).
david@pcdavid2:~$ download/cabal-0.6.0 install cabal-install Linking dist/build/cabal/cabal ... Installing executable(s) in /home/david/.cabal/bin
WTF?
david@pcdavid2:~$ vi .cabal/config
install-dirs user
-- prefix: /home/david/local
-- bindir: $prefix/bin
-- libdir: $prefix/lib
I give up for now.
Lines starting with -- are comments. You need to uncomment the prefix line for it to have an effect. The latest version of cabal-install makes a config file with these instructions at the top:
-- This is the configuration file for the 'cabal' command line tool.
-- The available configuration options are listed below.
-- Some of them have default values listed.
-- Lines (like this one) beginning with '--' are comments.
-- Be careful with spaces and indentation because they are
-- used to indicate layout for nested sections.
Unfortunately I think you mentioned that you grabbed a binary of an older version so you missed out on the improved instructions. Duncan
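Putting that together with the transcript above, the fix is to uncomment the prefix line; a sketch of the resulting section (indentation marks the nested install-dirs section, per the layout note quoted above):
install-dirs user
  prefix: /home/david/local
  bindir: $prefix/bin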

On Wed, Apr 22, 2009 at 1:01 PM, Duncan Coutts
On Wed, 2009-04-22 at 12:21 +0200, david48 wrote:
Lines starting with -- are comments. You need to uncomment the prefix line for it to have an effect.
Man do I feel dumb now :) David.

On Wed, 2009-04-22 at 13:20 +0200, david48 wrote:
On Wed, Apr 22, 2009 at 1:01 PM, Duncan Coutts wrote: On Wed, 2009-04-22 at 12:21 +0200, david48 wrote:
Lines starting with -- are comments. You need to uncomment the prefix line for it to have an effect.
Man do I feel dumb now :)
Don't :-) you're not the only one who got tripped up by this. It looked a lot like -- as in --command-line-flags=. The new text in 0.6.2 makes it clearer. Duncan

Installing executable(s) in /home/david/.cabal/bin Why the hell would cabal install binaries in a subdirectory of a hidden directory? Why not /home/david/bin or /home/david/local/bin?
Yes, this is clearly suboptimal but getting agreement on where to put it has not proved easy. There are users that will scream and shout if we install to $HOME/bin by default.
Having learned from experience that user preferences differ wildly, even on similar platforms, not to mention a variety of platforms or, even worse, intentionally different forks of the same platform, and that trying to guess what defaults might be sensible, let alone acceptable, can be a losing game, I'd like to offer an alternative view:
if there is no universally acceptable default, do not use a default
Next to not being bothered with configurations they agree with, users like to be in control, or at least be informed about what is going on, and precisely how to change it, *before* anything happens that they do not like. cabal install could, on its first invocation, point to its configuration file, explaining precisely the minimum number of changes required to get it working (with potential defaults being present in the config file, commented out and explained, the config file could be a config mini-tutorial). This would depend on a few things to be acceptable:
- configuration should be straightforward, with explanations of possible consequences being clear and close at hand; if there is no config file, the tool should be able to generate a partial one (with disputed choices commented out) for further editing
- configuration should be persistent (never overwrite an old config file without user permission; propagate old config to new tool versions)
That way, nothing would happen until users are satisfied that things will happen exactly as they like it and, once that is done, they won't have to think about this again (until cabal changes substantially and needs to ask for further user advice, which seems better than silently changing behaviour). If you want cabal to be installable in settings where no user is available, you could either generate a full config file before install, or add an --i-really-don't-care-about-config-settings option. This road isn't perfect, but it can be less unacceptable than any arbitrary set of default choices. Claus

"Claus Reinke"
[...]
+1. That, and better error messages: A Verbose-Consequences flag in the config (on by default), resulting in strings like "Binaries have been installed to $HOME/.cabal/bin and _not_ symlinked. $HOME/.cabal/bin is not in your $PATH: You will not be able to call them directly from the command line, and programs depending on them might be unable to locate them, resulting in failure or limited functionality." Hell, I never thought I'd ever advertise the usage of disclaimers... Gentoo does this, too, btw: "Remember, you have to be in the group "games" to play games", "You chose to build firefox with official branding, distributing the binary (even in your local network) might result in legal problems with the Mozilla Foundation." -- (c) this sig last receiving data processing entity. Inspect headers for copyright history. All rights reserved. Copying, hiring, renting, performance and/or quoting of this signature prohibited.

Claus Reinke wrote:
Installing executable(s) in /home/david/.cabal/bin Why the hell would cabal install binaries in a subdirectory of a hidden directory? Why not /home/david/bin or /home/david/local/bin?
Yes, this is clearly suboptimal but getting agreement on where to put it has not proved easy. There are users that will scream and shout if we install to $HOME/bin by default.
Having learned from experience that user preferences differ wildly, even on similar platforms, not to mention a variety of platforms or, even worse, intentionally different forks of the same platform, and that trying to guess what defaults might be sensible, let alone acceptable, can be a losing game, I'd like to offer an alternative view:
if there is no universally acceptable default, do not use a default
+1. Given that cabal has a config file already and so users don't have to enter everything on the commandline, this seems like a remarkably straightforward solution. The only downside seems like a minor startup cost (in exchange for the major restart cost from defaults having been bad). -- Live well, ~wren

"Richard O'Keefe"
This is good advice (/usr/local is fine though).
Actually, no, it isn't. To start with, these days it's chock full of stuff which is hardly less critical for system operation than anything you'll find in /bin.
More importantly, /usr/local is a bugger to manage by hand; even if the sources came with uninstall capabilities, chances are they're not around anymore, or are messed up. I prefer to have stuff in /opt/<package name>, with links to the binaries in /opt/bin. Manual user installs go into ~/opt, one-file stuff (mostly self-written) into ~/bin. There are also some scripts in ~/dos to start some games with dosbox, but those don't count. -- (c) this sig last receiving data processing entity. Inspect headers for copyright history. All rights reserved. Copying, hiring, renting, performance and/or quoting of this signature prohibited.

Achim Schneider
"Richard O'Keefe"
wrote: This is good advice (/usr/local is fine though).
Actually, no, it isn't. To start with, these days it's chock full of stuff which is hardly less critical for system operation than anything you'll find in /bin.
More importantly, /usr/local is a bugger to manage by hand, even if the sources came with uninstall capabilities, chances are they're not around, anymore, or messed up.
/usr/local is the default prefix in the FreeBSD ports system. -- c/* __o/* <\ * (__ */\ <

Thomas Davie wrote:
On 19 Apr 2009, at 00:31, Antoine Latter wrote:
... Apparently a "user" install of uuagc and fgl isn't good enough. Fun to know.
I've found user installs don't work at all on OS X, various people in #haskell were rather surprised to discover this, so apparently it's not the default behavior on other platforms.
Why do user installs 'not work at all' on OS X? I'm on OS X 10.5, I always use cabal install (user), and never encountered any problems. The only caveat Duncan mentioned is to pass --user to 'runhaskell Setup.hs' (which I never use, because there's cabal). For me, following 'cabal install fgl uulib uuagc', uhc compiled without any problems. There is a small issue (5) with 'make install' though: the 'bin' directory is not created if necessary. benedikt
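For reference, the user-level sequence being described is roughly the following (a sketch; the --user flag only matters when driving Setup.hs by hand instead of going through cabal-install, as the caveat above says):
$ cabal install fgl uulib uuagc
$ runhaskell Setup.hs configure --user
$ runhaskell Setup.hs build
$ runhaskell Setup.hs install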

atze@cs.uu.nl writes:
Utrecht Haskell Compiler -- first release, version 1.0.0 ========================================================
The UHC team is happy to announce the first public release of the Utrecht Haskell Compiler (UHC). UHC supports almost all Haskell98 features
Why? Is there something about Haskell 98 that's hard to implement?
plus many experimental extensions.
Fair enough, but it always puzzles me why implementations of compilers almost never start with a complete standard language. -- Jón Fairbairn Jon.Fairbairn@cl.cam.ac.uk http://www.chaos.org.uk/~jf/Stuff-I-dont-want.html (updated 2009-01-31)

Jon Fairbairn
atze@cs.uu.nl writes:
Utrecht Haskell Compiler -- first release, version 1.0.0 ========================================================
The UHC team is happy to announce the first public release of the Utrecht Haskell Compiler (UHC). UHC supports almost all Haskell98 features
Why? Is there something about Haskell 98 that's hard to implement?
Insanity. I doubt anyone is going to miss n+k patterns: http://www.cs.uu.nl/wiki/bin/view/Ehc/LanguageFeatures -- (c) this sig last receiving data processing entity. Inspect headers for copyright history. All rights reserved. Copying, hiring, renting, performance and/or quoting of this signature prohibited.

On 20 Apr 2009, at 12:52 am, Achim Schneider wrote:
Why? Is there something about Haskell 98 that's hard to implement?
Insanity. I doubt anyone is going to miss n+k patterns:
They are one of my favourite features. They express briefly and neatly what would otherwise take several separate steps to express.
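As a concrete, if artificial, illustration of that claim, here is the same function written with an n+k pattern and then without one; the second version needs an explicit guard and subtraction in place of the pattern (a minimal sketch, not code from the thread):
factorial :: Integer -> Integer
factorial 0 = 1
factorial (n + 1) = (n + 1) * factorial n
factorial' :: Integer -> Integer
factorial' 0 = 1
factorial' m | m >= 1 = m * factorial' (m - 1)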

Achim Schneider
Jon Fairbairn wrote: atze@cs.uu.nl writes:
Utrecht Haskell Compiler -- first release, version 1.0.0 ========================================================
The UHC team is happy to announce the first public release of the Utrecht Haskell Compiler (UHC). UHC supports almost all Haskell98 features
Why? Is there something about Haskell 98 that's hard to implement?
Insanity. I doubt anyone is going to miss n+k patterns:
That (taken with the followup from Richard O'Keefe saying he does use them) underlines my point, really. What follows is specific to Haskell, but the general point applies to most languages I've encountered. I have no love for n+k patterns, but they are part of Haskell98 -- and were the subject of protracted arguments for and against them before the Report was finished (I was against them, if I remember correctly). Any implementation claiming to be of Haskell98 should have them, whether or not the implementor likes them, because otherwise someone will come along with a valid Haskell98 programme and it won't compile, so they'll have to hack it around. This sort of thing (and resulting #ifdef all over the place) wastes far more programmer time in the end (assuming the compiler becomes popular) than it would take to implement the feature. It's not an implementor's place to make such decisions -- they can legitimately say "this feature sucks" and tell the next Haskell committee so. If they care enough about it, they can lobby or get on that next committee, but the arguments for n+k patterns /in Haskell98/ were done long ago. -- Jón Fairbairn Jon.Fairbairn@cl.cam.ac.uk http://www.chaos.org.uk/~jf/Stuff-I-dont-want.html (updated 2009-01-31)

I disagree. First of all, UHC states explicitly that some features are not supported (and probably never would be). Secondly, it seems like almost nobody uses (n+k)-patterns, and when they are used, they make the code less readable; so it's good NOT to support them, in order to make programmers avoid them as much as possible. I don't think #ifdef's would be really "all over the place"; it's more likely that a minor refactoring would take place so that (n+k)-patterns would disappear. Jon Fairbairn wrote on 20.04.2009 13:59:
Achim Schneider writes:
Jon Fairbairn wrote:
atze@cs.uu.nl writes:
Utrecht Haskell Compiler -- first release, version 1.0.0 ========================================================
The UHC team is happy to announce the first public release of the Utrecht Haskell Compiler (UHC). UHC supports almost all Haskell98 features
Why? Is there something about Haskell 98 that's hard to implement?
Insanity. I doubt anyone is going to miss n+k patterns:
That (taken with the followup from Richard O'Keefe saying he does use them) underlines my point, really. What follows is specific to Haskell, but the general point applies to most languages I've encountered.
I have no love for n+k patterns, but they are part of Haskell98 -- and were the subject of protracted arguments for and against them before the Report was finished (I was against them, if I remember correctly). Any implementation claiming to be of Haskell98 should have them, whether or not the implementor likes them, because otherwise someone will come along with a valid Haskell98 programme and it won't compile, so they'll have to hack it around. This sort of thing (and resulting #ifdef all over the place) wastes far more programmer time in the end (assuming the compiler becomes popular) than it would take to implement the feature.
It's not an implementor's place to make such decisions -- they can legitimately say "this feature sucks" and tell the next Haskell committee so. If they care enough about it, they can lobby or get on that next committee, but the arguments for n+k patterns /in Haskell98/ were done long ago.

Miguel Mitrofanov wrote:
Jon Fairbairn wrote on 20.04.2009 13:59:
Achim Schneider writes:
Jon Fairbairn wrote:
atze@cs.uu.nl writes:
Utrecht Haskell Compiler -- first release, version 1.0.0 ========================================================
The UHC team is happy to announce the first public release of the Utrecht Haskell Compiler (UHC). UHC supports almost all Haskell98 features
Why? Is there something about Haskell 98 that's hard to implement?
Insanity. I doubt anyone is going to miss n+k patterns:
That (taken with the followup from Richard O'Keefe saying he does use them) underlines my point, really. What follows is specific to Haskell, but the general point applies to most languages I've encountered.
I have no love for n+k patterns, but they are part of Haskell98 -- and were the subject of protracted arguments for and against them before the Report was finished (I was against them, if I remember correctly). Any implementation claiming to be of Haskell98 should have them, whether or not the implementor likes them, because otherwise someone will come along with a valid Haskell98 programme and it won't compile, so they'll have to hack it around. This sort of thing (and resulting #ifdef all over the place) wastes far more programmer time in the end (assuming the compiler becomes popular) than it would take to implement the feature.
It's not an implementor's place to make such decisions -- they can legitimately say "this feature sucks" and tell the next Haskell committee so. If they care enough about it, they can lobby or get on that next committee, but the arguments for n+k patterns /in Haskell98/ were done long ago.
I disagree. First of all, UHC states explicitly that some features are not supported (and probably never would be). Secondly, it seems like almost nobody uses (n+k)-patterns, and when they are used, they make the code less readable; so it's good NOT to support them, in order to make programmers avoid them as much as possible. I don't think #ifdef's would be really "all over the place"; it's more likely that a minor refactoring would take place so that (n+k)-patterns would disappear.
In addition, (n+k) patterns will be removed from the standard as soon as the Haskell prime process produces a new one, so people who want to make their code support that new standard should be removing them right now. Ganesh

On 20 Apr 2009, at 10:12 pm, Miguel Mitrofanov wrote:
I disagree. First of all, UHC states explicitly that some features are not supported (and probably never would be). Secondly, it seems like almost nobody uses (n+k)-patterns,
How can you possibly know that?
and when they are used, they make the code less readable;
I use them because they make the code *MORE* readable. Some versions of the 'view' idea can be used to fake n+k patterns, so I imagine that this argument will be used to ensure that UHC never ever supports anything resembling views. If I want something that's almost a Haskell compiler but not quite, with some interesting extensions, I've known where to find Clean for a long time.

On 21 Apr 2009, at 04:59, Richard O'Keefe wrote:
On 20 Apr 2009, at 10:12 pm, Miguel Mitrofanov wrote:
I disagree. First of all, UHC states explicitly that some features are not supported (and probably never would be). Secondly, it seems like almost nobody uses (n+k)-patterns,
How can you possibly know that?
I can't; that's why I've said "seems like".
If I want something that's almost a Haskell compiler but not quite, with some interesting extensions, I've known where to find Clean for a long time.
That's a strange desire. Personally, I want a compiler for a nice, simple, and useful language. I don't want a Haskell compiler; it just so happens that GHC is the best approximation.

On Mon, Apr 20, 2009 at 9:45 PM, Miguel Mitrofanov
On 21 Apr 2009, at 04:59, Richard O'Keefe wrote:
On 20 Apr 2009, at 10:12 pm, Miguel Mitrofanov wrote:
I disagree. First of all, UHC states explicitly that some features are not supported (and probably never would be). Secondly, it seems like almost nobody uses (n+k)-patterns,
How can you possibly know that?
I can't; that's why I've said "seems like".
Plus, there was a movement to ban them: http://www.mail-archive.com/haskell@haskell.org/msg01261.html All jesting aside, after I read that article I was even less eager to use them, and I never thought to use them in a real program anyway. BUT, here is the real point of my reply: To end this debate as to whether people really use them. We have this huge collection of source code called Hackage. I bet that if someone with haskell-src-exts experience sat down they could go through all of the packages in an automated way and count the number of uses of n+k patterns in source code that appears in the wild. Jason

On 21 Apr 2009, at 5:10 pm, Jason Dagit wrote:
Plus, there was a movement to ban them:
And somehow this means people don't?
BUT, here is the real point of my reply:
To end this debate as to whether people really use them. We have this huge collection of source code called Hackage. I bet that if someone with haskell-src-exts experience sat down they could go through all of the packages in an automated way and count the number of uses of n+k patterns in source code that appears in the wild.
I'm sorry, that wouldn't even come *close* to answering the question. It's a good way to demonstrate that people *are* using some feature (like hierarchical package names), but an incredibly bad way to show that they aren't. None of the Haskell code I've ever written, for example, will appear. Because none of that code was intended for general use. If every Haskell user contributed to Hackage, and if every contributor to Hackage contributed all the code they wrote, then it would make sense. In the Erlang mailing list, I frequently use the technique of trawling through publically available Erlang sources to demonstrate that features people claim are rare are not. But I'd never be silly enough to claim on the basis of such a scan that some feature _wasn't_ being used extensively in other sources. If the Haskell Great Powers decide to remove n+k patterns, so be it. I can live with that. It's not my language after all. I'm profoundly grateful to the people who designed and implemented it, and who keep stretching my mind with new levels of reuse and composition. I won't _like_ the loss of a contributor to readability, but I can live with it, just as I lived with having to use fmap instead of map.

On Mon, Apr 20, 2009 at 10:21 PM, Richard O'Keefe
On 21 Apr 2009, at 5:10 pm, Jason Dagit wrote:
Plus, there was a movement to ban them:
And somehow this means people don't?
...see the humor.
BUT, here is the real point of my reply:
To end this debate as to whether people really use them. We have this huge collection of source code called Hackage. I bet that if someone with haskell-src-exts experience sat down they could go through all of the packages in an automated way and count the number of uses of n+k patterns in source code that appears in the wild.
I'm sorry, that wouldn't even come *close* to answering the question. It's a good way to demonstrate that people *are* using some feature (like hierarchical package names), but an incredibly bad way to show that they aren't.
Not really. Obviously some programs use the feature, but let us restrict to interesting programs that have been shared with the world and have some potential to receive maintenance. From these programs we can do a sampling. While I'm not a statistics expert, my understanding is the main problem with using hackage packages is a bit of selection bias. I bet the selection bias isn't even that bad for this statistical test due to the nature of programming style diversity. Maybe someone with a stronger stats background could comment.
If every Haskell user contributed to Hackage, and if every contributor to Hackage contributed all the code they wrote, then it would make sense.
I think that would give us an exhaustive collection of haskell code, but I assert we don't need that. Biologists don't need a DNA sample from every organism to draw conclusions about the genetics of a species. Scientists work with incomplete data and draw sound conclusions in spite of that. The tools they use to do so are known as statistics.
In the Erlang mailing list, I frequently use the technique of trawling through publically available Erlang sources to demonstrate that features people claim are rare are not. But I'd never be silly enough to claim on the basis of such a scan that some feature _wasn't_ being used extensively in other sources.
Okay, then prove n+k patterns are not rare in the publicly available sources. That's the challenge I was trying to make in my first email. My apology for not being more direct in the asking. Jason

On 21 Apr 2009, at 7:39 pm, Jason Dagit wrote:
Not really. Obviously some programs use the feature, but let us restrict to interesting programs that have been shared with the world and have some potential to receive maintenance.
Why? You are, in effect, saying that my code has no value at all. You are saying that code written by students has no value at all. Why do you think that only code that is "shared with the world" has "some potential to receive maintenance"? By the way, not all publicly available code is in Hackage. The hbc release that's on my SPARC -- and thankful I've been for it, the grief GHC has given me there -- has at least one use of an n+k pattern that I know of, and more in a specification comment.
From these programs we can do a sampling. While I'm not a statistics expert, my understanding is the main problem with using hackage packages is a bit of selection bias.
I can see no reason to assume that it's only "a bit". Maybe it's a bit. Maybe it's a very great deal. It would be interesting to investigate this, but the only way you can investigate it is to examine a lot of code that _isn't_ there.
I bet the selection bias isn't even that bad for this statistical test due to the nature of programming style diversity. Maybe someone with a stronger stats background could comment.
I have a statistics degree. I don't know if that's strong enough for you. It's strong enough that I assume selection bias until I see evidence otherwise.
I think that would give us an exhaustive collection of haskell code, but I assert we don't need that. Biologists don't need a DNA sample from every organism to draw conclusions about the genetics of a species.
It depends on what _kind_ of conclusion they want to draw. If they want to show that some feature _is_ present, a sample will do. If they want to show that it's absent or rare, then they need a much bigger sample.
Scientists work with incomplete data and draw sound conclusions in spite of that. The tools they use to do so are known as statistics.
Yes, I know. That's why I get cross when people suggest silly things like trawling through Hackage to demonstrate that nobody is using n+k patterns. Where's the statistics in that? Where are the estimates of natural variation? Note: I do not assert that the use of n+k patterns is rare. Here's _all_ that I assert about n+k patterns: (1) they are part of Haskell 98 (2) I believe they make programs more readable (3) I use them (4) they are no worse than certain features proposed for addition to Haskell'.
Okay, then prove n+k patterns are not rare in the publicly available sources.
Why the X should I? I do not claim that they are common IN THE PUBLICLY AVAILABLE SOURCES, I have NEVER claimed that, and I don't even CARE whether they are rare in the publicly available sources or not. Because they make programs more readable, n+k patterns probably *should* be moderately common in the publicly available sources, but I have no idea whether they are or not. It *is* true that things that *are* used in the commonly available sources should continue to be supported in order to preserve the value of those commonly available sources. It is *not* true that things that are *not* used in the commonly available sources are therefore of no value and safely to be discarded.
That's the challenge I was trying to make in my first email.
It's a challenge to the irrelevant. Let's consider three versions of naive Fibonacci.
fibA :: (Integral a, Integral b) => a -> b
fibA 0 = 1
fibA 1 = 1
fibA (n+2) = fibA (n+1) + fibA n
Simple, readable, Haskell 98.
pred2 :: (Integral a) => a -> Maybe a
pred2 n = if n >= 2 then Just (n-2) else Nothing
fibB :: (Integral a, Integral b) => a -> b
fibB 0 = 1
fibB 1 = 1
fibB x | Just n <- pred2 x = fibB (n+1) + fibB n
Uses a pattern guard, implemented in GHC. Pattern guards are neat. I like them a lot. They make sense. But it's impossible for me to see fibB as more readable than fibA. While pattern guards come _close_ to offering a replacement for n+k patterns, they don't quite. If I had
f x (n+1) | p x = ...
          | q x = ...
I would have to write the pattern guard twice as
f x n' | Just n <- pred1 n', p x = ...
       | Just n <- pred1 n', q x = ...
That doesn't seem like an advantage, somehow. Now for the third alternative. The view proposal in the Haskell' wiki and the views implemented in GHC are different. In fact the view proposal document goes to some trouble to show how views can replace n+k patterns, so I suppose I don't need to review that. Here's what it looks like using GHC syntax. (I can't make ghc 6.8.3 accept this; ghc --supported-languages does not list ViewPatterns. So this is UNTESTED CODE!)
data Integral a => Nat2 a = Succ2 a | One2 | Zero2
nat2 :: Integral a => a -> Nat2 a
nat2 n | n >= 2 = Succ2 (n-2)
nat2 1 = One2
nat2 0 = Zero2
nat2 _ = error "nat2: not a natural number"
fibC (nat2 -> Zero2) = 1
fibC (nat2 -> One2) = 1
fibC (nat2 -> Succ2 n) = fibC (n+1) + fibC n
I like views a lot. The GHC version of views seems particularly tidy. But again, does anyone really think this makes the code *more* readable? I suppose I should include the 4th version:
fibD :: (Integral a, Integral b) => a -> b
fibD 0 = 1
fibD 1 = 1
fibD n = if n >= 2 then fibD (n-1) + fibD (n-2) else error "fibD: not a natural number"
That doesn't look like an improvement in readability or maintainability or any other illity to me.

On Tue, Apr 21, 2009 at 8:34 PM, Richard O'Keefe
On 21 Apr 2009, at 7:39 pm, Jason Dagit wrote:
Not really. Obviously some programs use the feature, but let us restrict to interesting programs that have been shared with the world and have some potential to receive maintenance.
Why?
You are, in effect, saying that my code has no value at all. You are saying that code written by students has no value at all. Why do you think that only code that is "shared with the world" has "some potential to receive maintenance"?
By the way, not all publicly available code is in Hackage. The hbc release that's on my SPARC -- and thankful I've been for it, the grief GHC has given me there -- has at least one use of an n+k pattern that I know of, and more in a specification comment.
From these programs we can do a sampling. While I'm not a statistics expert, my understanding is the main problem with using hackage packages is a bit of selection bias.
I can see no reason to assume that it's only "a bit". Maybe it's a bit. Maybe it's a very great deal. It would be interesting to investigate this, but the only way you can investigate it is to examine a lot of code that _isn't_ there.
I bet the selection bias isn't even that bad for this statistical test due to the nature of programming style diversity. Maybe someone with a stronger stats background could comment.
I have a statistics degree. I don't know if that's strong enough for you. It's strong enough that I assume selection bias until I see evidence otherwise.
I think that would give us an exhaustive collection of haskell code, but I assert we don't need that. Biologists don't need a DNA sample from every organism to draw conclusions about the genetics of a species.
It depends on what _kind_ of conclusion they want to draw. If they want to show that some feature _is_ present, a sample will do. If they want to show that it's absent or rare, then they need a much bigger sample.
Scientists work with incomplete data and draw sound conclusions in spite of that. The tools they use to do so are known as statistics.
Yes, I know. That's why I get cross when people suggest silly things like trawling through Hackage to demonstrate that nobody is using n+k patterns. Where's the statistics in that? Where are the estimates of natural variation?
Note: I do not assert that the use of n+k patterns is rare. Here's _all_ that I assert about n+k patterns: (1) they are part of Haskell 98 (2) I believe they make programs more readable (3) I use them (4) they are no worse than certain features proposed for addition to Haskell'.
Okay, then prove n+k patterns are not rare in the publicly available sources.
Why the X should I? I do not claim that they are common IN THE PUBLICLY AVAILABLE SOURCES, I have NEVER claimed that, and I don't even CARE whether they are rare in the publicly available sources or not.
Because they make programs more readable, n+k patterns probably *should* be moderately common in the publicly available sources, but I have no idea whether they are or not.
It *is* true that things that *are* used in the commonly available sources should continue to be supported in order to preserve the value of those commonly available sources. It is *not* true that things that are *not* used in the commonly available sources are therefore of no value and safely to be discarded.
That's the challenge I was trying to make in my first email.
It's a challenge to the irrelevant.
Let's consider three versions of naive Fibonacci.
fibA :: (Integral a, Integral b) => a -> b
fibA 0 = 1
fibA 1 = 1
fibA (n+2) = fibA (n+1) + fibA n
Simple, readable, Haskell 98.
pred2 :: (Integral a) => a -> Maybe a
pred2 n = if n >= 2 then Just (n-2) else Nothing
fibB :: (Integral a, Integral b) => a -> b
fibB 0 = 1
fibB 1 = 1
fibB x | Just n <- pred2 x = fibB (n+1) + fibB n
Uses a pattern guard, implemented in GHC.
Pattern guards are neat. I like them a lot. They make sense. But it's impossible for me to see fibB as more readable than fibA.
While pattern guards come _close_ to offering a replacement for n+k patterns, they don't quite. If I had
f x (n+1) | p x = ...
          | q x = ...
I would have to write the pattern guard twice as
f x n' | Just n <- pred1 n', p x = ...
       | Just n <- pred1 n', q x = ...
That doesn't seem like an advantage, somehow.
Now for the third alternative. The view proposal in the Haskell' wiki and the views implemented in GHC are different. In fact the view proposal document goes to some trouble to show how views can replace n+k patterns, so I suppose I don't need to review that. Here's what it looks like using GHC syntax. (I can't make ghc 6.8.3 accept this; ghc --supported-languages does not list ViewPatterns. So this is UNTESTED CODE!)
data Integral a => Nat2 a = Succ2 a | One2 | Zero2
nat2 :: Integral a => a -> Nat2 a
nat2 n | n >= 2 = Succ2 (n-2)
nat2 1 = One2
nat2 0 = Zero2
nat2 _ = error "nat2: not a natural number"
fibC (nat2 -> Zero2) = 1
fibC (nat2 -> One2) = 1
fibC (nat2 -> Succ2 n) = fibC (n+1) + fibC n
I like views a lot. The GHC version of views seems particularly tidy. But again, does anyone really think this makes the code *more* readable?
I suppose I should include the 4th version:
fibD :: (Integral a, Integral b) => a -> b
fibD 0 = 1
fibD 1 = 1
fibD n = if n >= 2 then fibD (n-1) + fibD (n-2) else error "fibD: not a natural number"
That doesn't look like an improvement in readability or maintainability or any other illity to me.
fibE :: (Integral a, Integral b) => a -> b
fibE n | n < 0 = error "fibE: not a natural number"
fibE 0 = 1
fibE 1 = 1
fibE n = fibE (n-1) + fibE (n-2)
I personally find this a bit easier to read than the n+k one because to think about this, I can just read the formula for the nth fibonacci number. To read the n+k one, I have to look at the pattern, figure out what n is (because it's not the argument to the function), and then look at how the fib function is called. Notice that the n that is passed to the next iterations of fibA does *not* have the same meaning as the n within fibA. Of course, readability is a bit subjective, but that's my point of view. My main point here was to show that you don't need view patterns or pattern guards to implement fib without n+k patterns. Alex

On Tue, Apr 21, 2009 at 8:34 PM, Richard O'Keefe
On 21 Apr 2009, at 7:39 pm, Jason Dagit wrote:
Not really. Obviously some programs use the feature, but let us restrict to interesting programs that have been shared with the world and have some potential to receive maintenance.
Why?
You are, in effect, saying that my code has no value at all. You are saying that code written by students has no value at all. Why do you think that only code that is "shared with the world" has "some potential to receive maintenance"?
Code which is not used has what value exactly? Other than some learning value for the author, what is the purpose? You also misinterpreted my grammar. That "and" is a logical one. For example, I intended to exclude programs which have been shared but have no maintenance.
By the way, not all publicly available code is in Hackage. The hbc release that's on my SPARC -- and thankful I've been for it, the grief GHC has given me there -- has at least one use of an n+k pattern that I know of, and more in a specification comment.
I know not all publicly available code is there. I said as much in my previous mail. If a code base the size of hbc has one use of n+k patterns, I bet you know enough that you could rewrite it to not need it with very little effort, FWIW.
I have a statistics degree. I don't know if that's strong enough for you. It's strong enough that I assume selection bias until I see evidence otherwise.
Then please, apply what you know and let's take the rumor, conjecture, and speculation out of this discussion.
It depends on what _kind_ of conclusion they want to draw. If they want to show that some feature _is_ present, a sample will do. If they want to show that it's absent or rare, then they need a much bigger sample.
And Hackage is the largest sample I can think of. Feel free to provide a larger sample if you have one.
Scientists work with incomplete data and draw sound conclusions in spite of that. The tools they use to do so are known as statistics.
Yes, I know. That's why I get cross when people suggest silly things like trawling through Hackage to demonstrate that nobody is using n+k patterns. Where's the statistics in that? Where are the estimates of natural variation?
Note: I do not assert that the use of n_k patterns is rare. Here's _all_ that I assert about n+k patterns: (1) they are part of Haskell 98 (2) I believe they make programs more readable (3) I use them (4) they are no worse than certain features proposed for addition to Haskell'.
Okay, then prove n+k patterns are not rare in the publicly available sources.
Why the X should I? I do not claim that they are common IN THE PUBLICLY AVAILABLE SOURCES, I have NEVER claimed that, and I don't even CARE whether they are rare in the publicly available sources or not.
Please. Let's remain calm. This is just a thread on a geeky mailing list for Haskell enthusiasts. I'm not trying to raise your taxes, take away any rights, or insult / offend you. Why? Because you're a stats expert so you have the necessary background to do it right. Also, because, hey, it would be interesting to see the numbers. Plus, it seemed relevant to me.
Because they make programs more readable, n+k patterns probably *should* be moderately common in the publicly available sources, but I have no idea whether they are or not.
Personally, when a public debate stops being about provable things I get bored. Unless the next reply to this involves more facts and less speculation I have no plans to reply. Not to be rude, but the thread would be a waste of my time at that point.
It *is* true that things that *are* used in the commonly available sources should continue to be supported in order to preserve the value of those commonly available sources. It is *not* true that things that are *not* used in the commonly available sources are therefore of no value and safely to be discarded.
Ah, I think I see where we disagree. Things which are: 1) rarely used; 2) add complexity to implementations; 3) add complexity for novices; and 4) are easily replaced; should be removed from the specification so that it is simpler. I have a belief that n+k patterns fit the above 4 items and that taken together they are sufficient, although perhaps not necessary or complete in general, as a reason to move towards removing them. I thought it would be interesting to have someone make a measure of #1. I think #3 applies here because it has to be explained. As a programmer, I didn't immediately grok it even though mathematically it makes sense. I doubt any of the reasons taken individually would be enough unless it is an extreme case. I'll also mention that I don't think Fortran is a great language just because it takes on the burden of backwards compatibility. I think there should be a balance.
Let's consider three versions of naive Fibonacci. [snip] That doesn't look like an improvement in readability or maintainability or any other illity to me.
I think these examples show that fibs is easy to implement with or without n+k patterns. And we haven't even gotten to the trivially memoized fibs yet. I'm really fond of this one for example:
fibs = 1 : 1 : zipWith (+) fibs (tail fibs)
where you get the Nth fib (starting from 0) with fibs !! N. Jason
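As a quick check of that definition, a ghci session along these lines gives the expected prefix (a sketch; the exact prompt will vary):
Prelude> let fibs = 1 : 1 : zipWith (+) fibs (tail fibs)
Prelude> take 10 fibs
[1,1,2,3,5,8,13,21,34,55]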

That's absurd. You have no way to access private source code, so any decision on what features to exclude from future versions of Haskell must necessarily look at publicly accessible source code. The only alternative is to continuously add, and never remove, features from Haskell, even if no one (that we know of) uses them. Moreover, the claim that everyone who is using n+k patterns is doing so only in private is an untestable hypothesis (i.e. unscientific) and extremely unlikely to be true. Regards, John A. De Goes N-BRAIN, Inc. The Evolution of Collaboration http://www.n-brain.net | 877-376-2724 x 101 On Apr 21, 2009, at 9:34 PM, Richard O'Keefe wrote:
It *is* true that things that *are* used in the commonly available sources should continue to be supported in order to preserve the value of those commonly available sources. It is *not* true that things that are *not* used in the commonly available sources are therefore of no value and safely to be discarded.

On 23 Apr 2009, at 2:09 am, John A. De Goes wrote:
That's absurd. You have no way to access private source code,
Right.
so any decision on what features to exclude from future versions of Haskell must necessarily look at publicly accessible source code.
Wrong. There is no "necessarily" about it. People made decisions about what to deprecate in the Fortran and COBOL standards without looking at publicly accessible source code. The changes made in producing ECMA Eiffel were _definitely_ done without looking at publicly accessible source code. And the decision to remove n+k patterns from Haskell', wrong though I think it was, was NOT made on a nose-counting basis. Or if it was, there is not the slightest evidence of it in http://hackage.haskell.org/trac/haskell-prime/wiki/RemoveNPlusK The reasons for the removal are, in order - no other data type has it (and how many other data types allow [a,b,c] as well as a:b:c:[]?) - a somewhat bogus claim about how much of the library you need to know how to use it (of COURSE you need to know about integers in order to use an integer operation, what's so bad about that?) - the claim that + doesn't mean + (this is really an argument about the scope of + and could have been dealt with by ruling that n+k is only available when the version of + in scope is the one from the Prelude) - an assertion of personal taste (the side condition that the thing is FOR is called 'ugly') - a pious hope that something might replace them - another assertion of personal taste. That's *it*. There is nothing about the operation being *rare* in any source code, publicly available or otherwise. There is no suggestion that commonly used features would not be proposed for removal. For example, "." as the composition operator, and "~" lazy patterns, have both been proposed for removal.
The only alternative is to continuously add, and never remove, features from Haskell, even if no one (that we know) uses them.
Look again at the Haskell' status page, http://hackage.haskell.org/trac/haskell-prime/wiki/Status Of all the proposed removals, n+k is the ONLY one to have been accepted. It may well be true that someone counted how many publicly accessible modules use those features, but I cannot find any evidence in the Haskell' wiki that this is so, or for that matter that people would have cared very much. (Meyer wasn't bothered by the major incompatibilities between ETL3 Eiffel and ECMA Eiffel.) With this one unique solitary exception, so far Haskell' *is* just adding features to Haskell. So if trawling through publicly available sources is somehow supposed to stop that happening, I'm sorry, but it hasn't done so yet.
Moreover, the odds that everyone who is using n + k patterns are doing so only in private is an untestable hypothesis (i.e. unscientific) and extremely unlikely to be true.
It's also a straw man argument. Nobody says that. I certainly don't. What I DO say is that - MY code contains n+k - MY code is not publicly available (there are many much brighter people than me working on Haskell and whenever I have a cool idea it has so far always been done, and in any case, much of the Haskell code I've written would be of no interest to anyone else) - so the nose counting process would certainly miss ME - there doesn't seem to be any reason to consider me unique We simply DON'T KNOW how many people are using n+k patterns or anything else that people have proposed for removal from Haskell'. One thing we DO know is that existing Haskell textbooks teach the use of n+k patterns. Is there a simple way to download everything from Hackage? As I've said before, trawling through stuff like that _can_ show that something is used, but if it doesn't show that, we do NOT learn that it isn't, but only that we DON'T KNOW. As someone who is only a Haskell user, I don't really have any *rights* in the matter. Using Haskell at all is a privilege, and if the Haskell committee decide, and Haskell implementors agree, to remove a feature that I like, then I just have to live with it. But let's not pretend that this was ever about how many people used the feature.

On Thu, Apr 23, 2009 at 6:30 AM, Richard O'Keefe
- a somewhat bogus claim about how much of the library you need to know how to use it (of COURSE you need to know about integers in order to use an integer operation, what's so bad about that?)
- the claim that + doesn't mean + (this is really an argument about the scope of + and could have been dealt with by ruling that n+k is only available when the version of + in scope is the one from the Prelude)
What's bogus about that claim? The n+k patterns have type (Integral a) => a, so you need to know about type classes and Integral. Even if it's listed as a reason, you can rest assured that the Haskell' committee did consider how widespread the use of n+k was before removing it. Of course, this can only be an educated guess. Of course, n+k will be missed by Haskell obfuscators. I mean, what will we do without (+) + 1 + 1 = (+) ? -- Lennart

"Lennart" == Lennart Augustsson
writes:
Lennart> Of course, n+k will be missed by Haskell obfuscators. I
Lennart> mean, what will we do without (+) + 1 + 1 = (+) ?
I think what would be missed would be you having the opportunity to explain to me what it means. But as we still have them, go right ahead (please). -- Colin Adams Preston Lancashire

Let me parenthesise and rename
(n + 1) +++ 1 = n
This defines a function +++, whose first argument is an (n+1) pattern and whose
second argument is 1.
In the same way,
(+) + 1 + 1 = (+)
defines a function +, whose first argument is an n+1 pattern (but using (+) as n)
and whose second argument is 1.
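For anyone who wants to try it, here's a minimal, self-contained sketch of the renamed version (module and function names are made up; Haskell 98 compilers accept n+k patterns by default, GHC 6.12 and later want the pragma):

  {-# LANGUAGE NPlusKPatterns #-}
  module NPlusKDemo where

  -- An ordinary n+k pattern: matches any argument >= 1 and binds n
  -- to (argument - 1).
  pred' :: Integer -> Integer
  pred' (n + 1) = n
  pred' _       = 0

  -- The renamed form of the obfuscated line: the first argument is an
  -- (n+1) pattern, the second must be the literal 1.
  (+++) :: Integer -> Integer -> Integer
  (n + 1) +++ 1 = n

  -- e.g. 5 +++ 1 == 4; any other second argument is a pattern-match
  -- failure.

The original line additionally reuses the name (+) both as the pattern variable and as the operator being defined, which is what makes it such good obfuscation.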
On Thu, Apr 23, 2009 at 10:27 AM, Colin Paul Adams
"Lennart" == Lennart Augustsson
writes: Lennart> Of course, n+k will be missed by Haskell obfuscators. I Lennart> mean, what will we do without (+) + 1 + 1 = (+) ?
I think what would be missed would be you having the opportunity to explain to me what it means.
But as we still have them, go right ahead (please). -- Colin Adams Preston Lancashire

On 23 Apr 2009, at 9:45 pm, Lennart Augustsson wrote:
(+) + 1 + 1 = (+)
The thing that is bad about this is that it binds the same identifier "+" twice in the head, to two different things. It's not really any different from (+) + 1 = (+) which doesn't have anything to do with n+k patterns, but is obnoxious in the same way. If Haskell' removes the ability to bind the same name two different ways in the same left-hand-side, I for one won't weep, _however_ many people currently do it.

Colin Paul Adams
"Lennart" == Lennart Augustsson
writes: Lennart> Of course, n+k will be missed by Haskell obfuscators. I Lennart> mean, what will we do without (+) + 1 + 1 = (+) ?
I think what would be missed would be you having the opportunity to explain to me what it means.
It means the same as
  (+) ((+) + 1) 1 = (+)
HTH &c. -- Jón Fairbairn Jon.Fairbairn@cl.cam.ac.uk http://www.chaos.org.uk/~jf/Stuff-I-dont-want.html (updated 2009-01-31)

On 23 Apr 2009, at 9:02 pm, Lennart Augustsson wrote:
On Thu, Apr 23, 2009 at 6:30 AM, Richard O'Keefe
wrote:
- a somewhat bogus claim about how much of the library you need to know how to use it (of COURSE you need to know about integers in order to use an integer operation, what's so bad about that?)
- the claim that + doesn't mean + (this is really an argument about the scope of + and could have been dealt with by ruling that n+k is only available when the version of + in scope is the one from the Prelude)
What's bogus about that claim? The n+k patterns have type (Integral a) => a, so you need to know about type classes and Integral.
One thing that is "bogus" is the apparent suggestion that the amount you have to know about the library is EXCESSIVE. The other thing is that in fact it's not true. Students can perfectly well learn how to use n+k patterns at a time when the only numeric type they know about is Int. When they learn that there are other numeric types, they have to learn about type classes if they want to understand what they are doing when they use ">=" or "-". So it simply isn't true that understanding n+k patterns requires any *MORE* understanding of type classes than understanding the contorted code required to replace them.
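To make the comparison concrete, here is a minimal sketch (hypothetical names, everything at type Int) of the n+k version of a definition and its hand-written replacement:

  -- With an n+k pattern: the second clause matches any argument >= 1,
  -- binding n to (argument - 1).
  fact :: Int -> Int
  fact 0       = 1
  fact (n + 1) = (n + 1) * fact n

  -- Without it, the programmer writes the guard and the subtraction
  -- themselves.
  fact' :: Int -> Int
  fact' 0 = 1
  fact' n | n > 0 = n * fact' (n - 1)

Neither version needs any more of the class system than the other when both are used at Int.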
Even if it's listed as a reason, you can rest assured that the Haskell' committee did consider how widespread the use of n+k was before removing it. Of course, this can only be an educated guess.
I did try finding something about the reasons for the choices made, but in this case was unable to find anything beyond the originally listed pros and cons. Given Haskell's "avoid success" ethos, I'm actually of the opinion that the Haskell' committee SHOULDN'T consider how widely spread the use of some feature is or isn't. It's good if they don't go the CORBA route and standardise features that have never been implemented, let alone used. I can imagine very widely used features being RIGHTLY removed, as Classic C function headers were removed from C99.
Of course, n+k will be missed by Haskell obfuscators. I mean, what will we do without (+) + 1 + 1 = (+) ?
Haskell obfuscators have such a vast playground that the loss of n+k will go completely unnoticed by them.

On Apr 22, 2009, at 11:30 PM, Richard O'Keefe wrote:
so any decision on what features to exclude from future versions of Haskell must necessarily look at publicly accessible source code.
Wrong. There is no "necessarily" about it. People made decisions about what to deprecate in the Fortran and COBOL standards without looking at publicly accessible source code.
I'm talking about the only rational way to do it. It's irrational for language maintainers to remove features that are widely used in publicly available source code, as such changes are detrimental to the success of the language. Thus, rational language maintainers will decide what features to exclude from future versions of Haskell by looking at publicly accessible source code.
And the decision to remove n+k patterns from Haskell', wrong though I think it was, was NOT made on a nose-counting basis. Or if it was, there is not the slightest evidence of it in http://hackage.haskell.org/trac/haskell-prime/wiki/RemoveNPlusK
Numerous people have contributed to that discussion, and the consensus is that too few people are using them to justify the feature. Indeed, in this very discussion, you are the only one advocating them. What does that tell you?
It's also a straw man argument. Nobody says that. I certainly don't. What I DO say is that - MY code contains n+k
Why should this matter? The contents of your code are not relevant to the future of Haskell. A sample of n = 1 is meaningless.
Is there a simple way to download everything from Hackage?
One would need to write a script to do this. Regards, John A. De Goes N-BRAIN, Inc. The Evolution of Collaboration http://www.n-brain.net | 877-376-2724 x 101

On 24 Apr 2009, at 3:23 am, Ross Paterson wrote:
On Thu, Apr 23, 2009 at 05:30:52PM +1200, Richard O'Keefe wrote:
Is there a simple way to download everything from Hackage?
There's a link on the HackageDB introduction page.
Got it. Thanks!

"John A. De Goes"
That's absurd. You have no way to access private source code, so any decision on what features to exclude from future versions of Haskell must necessarily look at publicly accessible source code.
This is all entirely beside the point. The question is not whether n+k patterns should be in the language, it's whether an implementation of Haskell 98 should include them.
The only alternative is to continuously add, and never remove, features from Haskell, even if no one (that we know) uses them.
But we can remove them in future language versions. The point I was trying to make at the beginning of this subthread was that implementations should follow the definition, because having a core language (Haskell 98) that can be relied on is simpler and wastes less time than the alternative. -- Jón Fairbairn Jon.Fairbairn@cl.cam.ac.uk http://www.chaos.org.uk/~jf/Stuff-I-dont-want.html (updated 2009-01-31)

Jon Fairbairn wrote:
"John A. De Goes"
writes: That's absurd. You have no way to access private source code, so any decision on what features to exclude from future versions of Haskell must necessarily look at publicly accessible source code.
This is all entirely beside the point. The question is not whether n+k patterns should be in the language, it's whether an implementation of Haskell 98 should include them.
The only alternative is to continuously add, and never remove, features from Haskell, even if no one (that we know) uses them.
But we can remove them in future language versions. The point I was trying to make at the beginning of this subthread was that implementations should follow the definition, because having a core language (Haskell 98) that can be relied on is simpler and wastes less time than the alternative.
There has to be a bit of give and take here between standards and implementations. The Haskell 98 standard is now very old and becoming increasingly less relevant, hence the Haskell' effort. (n+k) patterns were always controversial and the decision to include them has indeed been reversed by the Haskell' committee. So I would say that {Haskell 98 - (n+k)} is itself a worthwhile standard to implement. UHC is clear that this is what it has implemented, so it's not as if they are misrepresenting themselves. Ganesh

"Sittampalam, Ganesh"
Jon Fairbairn wrote:
But we can remove them in future language versions. The point I was trying to make at the beginning of this subthread was that implementations should follow the definition, because having a core language (Haskell 98) that can be relied on is simpler and wastes less time than the alternative.
There has to be a bit of give and take here between standards and implementations.
There is no such compulsion. There's an excellent case for information from implementors and programmers to feed experience into future standards, but that's not a reason for implementing something part way.
The Haskell 98 standard is now very old and becoming increasingly less relevant, hence the Haskell' effort. (n+k) patterns were always controversial and the decision to include them has indeed been reversed by the Haskell' committee.
But there is no retroactive removal from Haskell 98.
So I would say that {Haskell 98 - (n+k)} is itself a worthwhile standard to implement.
It's not a standard. You have to document the difference (waste of time), programmers have to notice the difference (waste of time), books that describe H 98 no longer apply (waste of effort). You can argue that the wastes here are individually small, but you have to multiply them by the number of times they happen (and again, I'm taking n+k as an example of a general problematic attitude that's been with us since FORTRAN I*, rather than really arguing about n+k specifically). [*] The FORTRAN IV standard contains some really quite entertaining examples of what happens when you try to standardise the intersection of divergent implementations of a programming language. -- Jón Fairbairn Jon.Fairbairn@cl.cam.ac.uk http://www.chaos.org.uk/~jf/Stuff-I-dont-want.html (updated 2009-01-31)

Jon Fairbairn wrote:
"Sittampalam, Ganesh"
writes:
So I would say that {Haskell 98 - (n+k)} is itself a worthwhile standard to implement.
It's not a standard. You have to document the difference (waste of time), programmers have to notice the difference (waste of time), books that describe H 98 no longer apply (waste of effort).
Interestingly, the removal discussion from Haskell' (http://hackage.haskell.org/trac/haskell-prime/wiki/RemoveNPlusK) explicitly mentions "some Haskell books use it (this was the main reason it was kept in Haskell 98)" and also points out that the report explicitly warned that they might be removed in future. Presumably those were books about Haskell 1.4 or before. If n+k was only kept to keep those books still valid, then they certainly shouldn't survive any longer; any H98 books that used them deserve their fate, IMO.
You can argue that the wastes here are individually small, but you have to multiply them by the number of times they happen (and again, I'm taking n+k as an example of a general problematic attitude that's been with us since FORTRAN I*, rather than really arguing about n+k specifically).
[*] The FORTRAN IV standard contains some really quite entertaining examples of what happens when you try to standardise the intersection of divergent implementations of a programming language.
I'd be much more inclined to agree with you if the example in question was not n+k. Also, divergence by omission of features is much easier to recover from than mutually incompatible implementation of the same feature. Ganesh

Let's turn this around. You invest 4 months of your life coming out with your own experimental Haskell compiler designed to easily test new language features. Then a bunch of ungrateful wretches on Haskell Cafe demand that you stop distributing your compiler until you have full support for Haskell 98. :-) Do you think that's fair? Regards, John A. De Goes N-BRAIN, Inc. The Evolution of Collaboration http://www.n-brain.net | 877-376-2724 x 101 On Apr 23, 2009, at 3:18 AM, Jon Fairbairn wrote:
"John A. De Goes"
writes: That's absurd. You have no way to access private source code, so any decision on what features to exclude from future versions of Haskell must necessarily look at publicly accessible source code.
This is all entirely beside the point. The question is not whether n+k patterns should be in the language, it's whether an implementation of Haskell 98 should include them.
The only alternative is to continuously add, and never remove, features from Haskell, even if no one (that we know) uses them.
But we can remove them in future language versions. The point I was trying to make at the beginning of this subthread was that implementations should follow the definition, because having a core language (Haskell 98) that can be relied on is simpler and wastes less time than the alternative.
-- Jón Fairbairn Jon.Fairbairn@cl.cam.ac.uk http://www.chaos.org.uk/~jf/Stuff-I-dont-want.html (updated 2009-01-31)

Am Donnerstag 23 April 2009 16:13:36 schrieb John A. De Goes:
Let's turn this around. You invest 4 months of your life coming out with your own experimental Haskell compiler designed to easily test new language features. Then a bunch of ungrateful wretches on Haskell Cafe demand that you stop distributing your compiler until you have full support for Haskell 98. :-)
Do you think that's fair?
Regards,
John A. De Goes
Well, if it doesn't implement the full standard, perhaps it should rather be called UVNABNQHC (Utrecht very nearly, almost but not quite Haskell compiler)?

Let's turn this around. You invest 4 months of your life coming out with your own experimental Haskell compiler designed to easily test new language features. Then a bunch of ungrateful wretches on Haskell Cafe demand that you stop distributing your compiler until you have full support for Haskell 98. :-)
Do you think that's fair?
|Well, if it doesn't implement the full standard, perhaps it should rather be called
|UVNABNQHC (Utrecht very nearly, almost but not quite Haskell compiler)?

uHC: unsafeCompileHaskell ?-)

joking and bikeshedding aside:

- Haskell'98 is a fixed standard. Haskell'98 (revised) is a revised version of the same standard. The discussion on what is in either is over. Unless someone wants to start and edit a new revision of Haskell'98. Or someone wants to write about experience with and criticism of the existing standards. None of which seems to relate to this thread's subject, though either would fit into other threads on this mailing list.

- the UHC announcement states (emphasis added): "UHC supports _almost all_ Haskell98 features plus many experimental extensions". Once they start claiming to have a full Haskell'98 implementation, everybody can start filing bug reports. Actually, you can start doing that now, as they explicitly relate UHC to Haskell'98, not Haskell, not Haskell'. But once you've filed a bug report about a deviation from the version of the standard being referred to, it is up to them.

- there are one or two more interesting things to discuss about UHC. That would require some actual work to find out more about it.

- implementing a Haskell compiler requires a lot of work. So does detailing language extensions, to say nothing of providing supporting evidence for suggested language extensions by actually implementing them side-by-side with Haskell's other features.

- anyone who gets through the work of implementing something, let alone a Haskell compiler, to the stage of releasing/announcing it, is likely looking forward to getting feedback on their work. In reality, the only feedback most projects get is from bug reports (and not always those), web access logs, and rumours on blogs or irc. One really, really, does not need one's project name to be used for other unrelated entertainment as well.

May I respectfully suggest that further postings under _this_ subject give something back to the UHC implementers, in the form of investing some actual work and time to find out about the fruits of their work?

Claus

PS. Sorry for going meta about this. Just one reader's and fellow programmer's over-sensitive opinion. Feel free to colour bikesheds or have interesting discussions on non-UHC-specific Haskell standard issues, in non-UHC-specific threads. Or to ignore this suggestion altogether.

PPS. If you want to see future announcements of real software appear on the haskell@ list only, excluding the haskell-bikeshed@ list. That is assuming that people will still be motivated to implement such software, if vapourware could trigger the same responses.

On Thu, Apr 23, 2009 at 12:39 PM, Claus Reinke
joking and bikeshedding aside:
- Haskell'98 is a fixed standard. Haskell'98 (revised) is a revised version of the same standard. The discussion on what is in either is over. Unless someone wants to start and edit a new revision of Haskell'98. Or someone wants to write about experience with and criticism of the existing standards. None of which seems to relate to this thread's subject, though either would fit into other threads on this mailing list.
- the UHC announcement states (emphasis added): "UHC supports _almost all_ Haskell98 features plus many experimental extensions". Once they start claiming to have a full Haskell'98 implementation, everybody can start filing bug reports. Actually, you can start doing that now as they explicitly relate UHC to Haskell'98, not Haskell, not Haskell'. But once you've filed a bug report about a deviation from the version of the standard being referred to, it is up to them.
- there are one or two more interesting things to discuss about UHC. That would require some actual work to find out more about it.
- implementing a Haskell compiler requires a lot of work. So does detailing language extensions, to say nothing about providing supporting evidence for suggested language extensions by actually implementing them side-by-side with Haskell's other features. - anyone who gets through the work of implementing something, let alone a Haskell compiler, to the stage of releasing/announcing it, is likely looking forward to getting feedback on their work.
In reality, the only feedback most projects get is from bug reports (and not always those), web access logs, and rumours on blogs or irc. One really, really, does not need one's project name to be used for other unrelated entertainment as well.
May I respectfully suggest that further postings under _this_ subject give something back to the UHC implementers, in the form of investing some actual work and time to find out about the fruits of their work?
Claus
I'd like to second this email. I found the ehc/uhc project very interesting when I was looking at it a year or two ago, and I'm a little distressed that this thread has been so unproductive and basically hostile. I was hoping that comments would be more substantive, rather than carping about what a maintainer plans on adding (and thereby triggering an apparent holy war). For example, I expected someone to ask why it was not cabalized since that would help distribution; to which a developer could respond that it could well be, except source files need to be preprocessed with the grammar-conversion tool (UUAGC?) and Cabal doesn't support that like it does alex/happy; to which someone might propose a hack-around using GHC's -F preprocessor option, or maybe someone would go quickly add support to Cabal and we could get started on Cabalizing the various compilers - Er. Not to try to force the discussion in any particular direction or anything... -- gwern

Daniel Fischer
Well, if it doesn't implement the full standard, perhaps it should rather be called UVNABNQHC (Utrecht very nearly, almost but not quite Haskell compiler)?
Ha! Haskell™! I said it first, and rule that... I don't care what you use the name for. Everyone is free to make a fool of themselves if they stick it on their language, be it PHP or H98-n+k. -- (c) this sig last receiving data processing entity. Inspect headers for copyright history. All rights reserved. Copying, hiring, renting, performance and/or quoting of this signature prohibited.

On 24 Apr 2009, at 2:13 am, John A. De Goes wrote:
Let's turn this around. You invest 4 months of your life coming out with your own experimental Haskell compiler designed to easily test new language features. Then a bunch of ungrateful wretches on Haskell Cafe demand that you stop distributing your compiler until you have full support for Haskell 98. :-)
Do you think that's fair?
No. I for one haven't done that. I've thanked them for it; I've downloaded it to learn from; I just can't use it. (It's not just n+k that's not implemented yet.)

Unfortunately I think 4 man-years is definitely below the minimum of the guesses I would get if I asked the people in my group ;-} Doaitse On 23 apr 2009, at 16:13, John A. De Goes wrote:
Let's turn this around. You invest 4 months of your life coming out with your own experimental Haskell compiler designed to easily test new language features. Then a bunch of ungrateful wretches on Haskell Cafe demand that you stop distributing your compiler until you have full support for Haskell 98. :-)
Do you think that's fair?
Regards,
John A. De Goes N-BRAIN, Inc. The Evolution of Collaboration
http://www.n-brain.net | 877-376-2724 x 101
On Apr 23, 2009, at 3:18 AM, Jon Fairbairn wrote:
"John A. De Goes"
writes: That's absurd. You have no way to access private source code, so any decision on what features to exclude from future versions of Haskell must necessarily look at publicly accessible source code.
This is all entirely beside the point. The question is not whether n+k patterns should be in the language, it's whether an implementation of Haskell 98 should include them.
The only alternative is to continuously add, and never remove, features from Haskell, even if no one (that we know) uses them.
But we can remove them in future language versions. The point I was trying to make at the beginning of this subthread was that implementations should follow the definition, because having a core language (Haskell 98) that can be relied on is simpler and wastes less time than the alternative.
-- Jón Fairbairn Jon.Fairbairn@cl.cam.ac.uk http://www.chaos.org.uk/~jf/Stuff-I-dont-want.html (updated 2009-01-31)

"Richard O'Keefe" et all wrote:
[n+k patterns]
I'd like to add my two cents: Assuming that UHC's roadmap strives to be H'-compliant in the future, and n+k patterns aren't going to be in H', why bother implementing them? Also, assuming that current H98 code will be ported to H', shouldn't n+k patterns be removed from existing code, anyway? -- (c) this sig last receiving data processing entity. Inspect headers for copyright history. All rights reserved. Copying, hiring, renting, performance and/or quoting of this signature prohibited.

On 23 Apr 2009, at 2:24 am, Achim Schneider wrote:
"Richard O'Keefe" et all wrote:
[n+k patterns]
I'd like to add my two cents: Assuming that UHC's roadmap strives to be H'-compilant in the future, and n+k patterns aren't going to be in H', why bother implementing them?
Haskell' is a moving target. There are as yet no programs that are *known* to be Haskell' programs. There are as yet no Haskell' textbooks. (I remember someone once wrote a book about Prolog that presented Prolog according to the then-current draft of the standard. The next year the draft lurched back towards then-current practice, and the book was left describing a language that was never implemented.) If you want to support existing code written by people trained on the existing textbooks, you support as much of the existing language as you can. Otherwise you have to rely on people writing code specifically for your compiler.
Also, assuming that current H98 code will be ported to H', shouldn't n+k patterns be removed from existing code, anyway?
Someone who really believed that would surely be recommending that a compiler accept the feature and generate working code for it but WARN at each occurrence with a warning that can't be switched off. That would be a useful way to help people remove this clarity from their code.

No, the UHC people can do what they please. It's _their_ compiler. It's _great_ that there's another almost-Haskell98 compiler.

It's a little puzzling that section 3 "Language extensions and differences with Haskell98" says nothing whatsoever about n+k patterns. It's only in section 4.1, where we learn also that 'default' isn't there and might never be. We don't yet know whether 'default' will be in Haskell'. We also learn from http://www.cs.uu.nl/wiki/bin/view/Ehc/EhcUserDocumentation that IO is "under construction", Directory, Time, Locale, CPUTime, and Random are "not available". This is clearly work in progress, and we can only be thankful for something that's intended to be read as well as written.

In fact, I've downloaded it, precisely in order to see how much work it will be to add n+k support. The thing is that it really seems bizarre to see this one feature singled out for non-implementation. If I can do the equivalent of n+k patterns by programming in the *type system*, why *not* in a pattern?

Richard O'Keefe wrote:
The thing is that it really seems bizarre to see this one feature singled out for non-implementation.
If I can do the equivalent of n+k patterns by programming in the *type system*, why *not* in a pattern?
Do you mean by something like the following?

  data Z = Z
  data S n = S n

  type Plus2 a = S (S a)

  minus2 :: Plus2 a -> a
  minus2 _ = undefined -- or actually use the values, or whatever

If so, I'd say that n+k patterns go well beyond this kind of "pattern aliases", particularly since they operate on arbitrary Integral values, not just an inductively defined natural number type. Cheers, Ganesh
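PS. For contrast, a minimal sketch (hypothetical function name) of the n+k form at an ordinary Integral type, where there is no Z/S encoding of the argument at all:

  -- (n + 1) matches any Integer >= 1 and binds n to (argument - 1).
  countDown :: Integer -> [Integer]
  countDown 0       = [0]
  countDown (n + 1) = (n + 1) : countDown n

  -- countDown 3 == [3,2,1,0]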

I agree in principle; you should really implement the full Haskell98
if you claim to be a Haskell implementation.
In the particular case of n+k I don't care, since I never use them and
they are slated for removal in Haskell'.
-- Lennart
On Mon, Apr 20, 2009 at 11:59 AM, Jon Fairbairn
Achim Schneider
writes: Jon Fairbairn
wrote: atze@cs.uu.nl writes:
Utrecht Haskell Compiler -- first release, version 1.0.0 ========================================================
The UHC team is happy to announce the first public release of the Utrecht Haskell Compiler (UHC). UHC supports almost all Haskell98 features
Why? Is there something about Haskell 98 that's hard to implement?
Insanity. I doubt anyone is going to miss n+k patterns:
That (taken with the followup from Richard O'Keefe saying he does use them) underlines my point, really. What follows is specific to Haskell, but the general point applies to most languages I've encountered.
I have no love for n+k patterns, but they are part of Haskell98 -- and were the subject of protracted arguments for and against them before the Report was finished (I was against them, if I remember correctly). Any implementation claiming to be of Haskell98 should have them, whether or not the implementor likes them, because otherwise someone will come along with a valid Haskell98 programme and it won't compile, so they'll have to hack it around. This sort of thing (and resulting #ifdef all over the place) wastes far more programmer time in the end (assuming the compiler becomes popular) than it would take to implement the feature.
It's not an implementor's place to make such decisions -- they can legitimately say "this feature sucks" and tell the next Haskell committee so. If they care enough about it, they can lobby or get on that next committee, but the arguments for n+k patterns /in Haskell98/ were done long ago.
-- Jón Fairbairn Jon.Fairbairn@cl.cam.ac.uk http://www.chaos.org.uk/~jf/Stuff-I-dont-want.html (updated 2009-01-31)
_______________________________________________ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe

Hello Jon, Monday, April 20, 2009, 1:59:07 PM, you wrote:
It's not an implementor's place to make such decisions -- they can legitimately say "this feature sucks" and tell the next Haskell committee so. If they care enough about it, they can lobby or get on that next committee, but the arguments for n+k patterns /in Haskell98/ were done long ago.
if you really believe in what you said, you can spend your own time adding its support :) i've never seen n+k patterns in real code, so i understand developers who don't want to waste time just to comply with the standard even if their efforts will never really be used -- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

If every implementor got to choose what subset of the standard to
implement, then all code would have to be written in the implemented
intersection. I think that's a terrible idea.
The Haskell98 standard was set so there would be a baseline that
people could rely on.
When I implemented Haskell (both times) there were odds and ends that
I really hated (some of those feelings have changed), but I did it
anyway.
-- Lennart
On Mon, Apr 20, 2009 at 1:02 PM, Bulat Ziganshin
Hello Jon,
Monday, April 20, 2009, 1:59:07 PM, you wrote:
It's not an implementor's place to make such decisions -- they can legitimately say "this feature sucks" and tell the next Haskell committee so. If they care enough about it, they can lobby or get on that next committee, but the arguments for n+k patterns /in Haskell98/ were done long ago.
if you really believe in what you said, you can spend your own time adding its support :) i've never seen n+k patterns in real code, so i understand developers who don't want to waste time just to comply with the standard even if their efforts will never really be used
-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

Well, the problem is that every implementor does choose a subset of the standard to implement. It's much worse in JavaScript - essential features working differently in Internet Explorer, Firefox, Opera, and Safari, and sometimes they even differ between versions; Web programmers still manage. (n+k)-patterns are nothing compared to that. Lennart Augustsson wrote on 20.04.2009 15:17:
If every implementor got to choose what subset of the standard to implement, then all code would have to be written in the implemented intersection. I think that's a terrible idea. The Haskell98 standard was set so there would be a baseline that people could rely on.
When I implemented Haskell (both times) there were odds and ends that I really hated (some of those feelings have changed), but I did it anyway.
-- Lennart
On Mon, Apr 20, 2009 at 1:02 PM, Bulat Ziganshin
wrote: Hello Jon,
Monday, April 20, 2009, 1:59:07 PM, you wrote:
It's not an implementor's place to make such decisions -- they can legitimately say "this feature sucks" and tell the next Haskell committee so. If they care enough about it, they can lobby or get on that next committee, but the arguments for n+k patterns /in Haskell98/ were done long ago. if you really believe in what you said, you can spend your own time adding its support :) i've never seen n+k patterns in real code, so i understand developers who don't want to waste time just to comply with the standard even if their efforts will never really be used
-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

I don't think that other languages failing should be an excuse for
Haskell to be equally bad.
On Mon, Apr 20, 2009 at 1:23 PM, Miguel Mitrofanov
Well, the problem is that every implementor does choose a subset of standart to implement.
It's much worse in JavaScript - essential features working differently in Internet Explorer, Firefox, Opera, and Safari, and sometimes they even differ between versions; Web programmers still manage. (n+k)-patterns are nothing compared to that.
Lennart Augustsson wrote on 20.04.2009 15:17:
If every implementor got to choose what subset of the standard to implement, then all code would have to be written in the implemented intersection. I think that's a terrible idea. The Haskell98 standard was set so there would be a baseline that people could rely on.
When I implemented Haskell (both times) there were odds and ends that I really hated (some of those feelings have changed), but I did it anyway.
-- Lennart
On Mon, Apr 20, 2009 at 1:02 PM, Bulat Ziganshin
wrote: Hello Jon,
Monday, April 20, 2009, 1:59:07 PM, you wrote:
It's not an implementor's place to make such decisions -- they can legitimately say "this feature sucks" and tell the next Haskell committee so. If they care enough about it, they can lobby or get on that next committee, but the arguments for n+k patterns /in Haskell98/ were done long ago.
if you really believe in what you said, you can spend your own time adding its support :) i've never seen n+k patterns in real code, so i understand developers who don't want to waste time just to comply with the standard even if their efforts will never really be used
-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

Me neither; there were actually two points: 1) It's not bad; at least, it's not bad for reasons you provide. 2) It would be here whether we like it or not. Lennart Augustsson wrote on 20.04.2009 15:31:
I don't think that other languages failing should be an excuse for Haskell to be equally bad.
On Mon, Apr 20, 2009 at 1:23 PM, Miguel Mitrofanov
wrote: Well, the problem is that every implementor does choose a subset of the standard to implement.
It's much worse in JavaScript - essential features working differently in Internet Explorer, Firefox, Opera, and Safari, and sometimes they even differ between versions; Web programmers still manage. (n+k)-patterns are nothing compared to that.
Lennart Augustsson wrote on 20.04.2009 15:17:
If every implementor got to choose what subset of the standard to implement, then all code would have to be written in the implemented intersection. I think that's a terrible idea. The Haskell98 standard was set so there would be a baseline that people could rely on.
When I implemented Haskell (both times) there were odds and ends that I really hated (some of those feelings have changed), but I did it anyway.
-- Lennart
On Mon, Apr 20, 2009 at 1:02 PM, Bulat Ziganshin
wrote: Hello Jon,
Monday, April 20, 2009, 1:59:07 PM, you wrote:
It's not an implementor's place to make such decisions -- they can legitimately say "this feature sucks" and tell the next Haskell committee so. If they care enough about it, they can lobby or get on that next committee, but the arguments for n+k patterns /in Haskell98/ were done long ago. if you really believe in what you said, you can spend your own time adding its support :) i've never seen n+k patterns in real code, so i understand developers who don't want to waste time just to comply with the standard even if their efforts will never really be used
-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

Just refuse to use UHC until it conforms. One can refuse to use GHC
libraries that use extensions as well for similar reasons. I always think
twice when I see something that isn't Haskell 98 in my stack.
Anything that doesn't conform completely to Haskell 98 can effectively be
considered not Haskell 98 at all (all or nothing mentality), if you want to
be really strict.
The fact is we have a choice... I won't tell people not to implement things
in a way I don't like, I'll just look at it and decide whether I care to use
it or not.
As a result, UHC is not something I care to use, though I'm sure it's
interesting for those who are using it.
If I cared enough, and I don't, and the UHC sources are licensed in a way
permitting so, I could make a Haskell 98 conforming version of it, and fork
it myself.
Dave
On Mon, Apr 20, 2009 at 4:31 AM, Lennart Augustsson
I don't think that other languages failing should be an excuse for Haskell to be equally bad.
On Mon, Apr 20, 2009 at 1:23 PM, Miguel Mitrofanov
wrote: Well, the problem is that every implementor does choose a subset of the standard to implement.
It's much worse in JavaScript - essential features working differently in Internet Explorer, Firefox, Opera, and Safari, and sometimes they even differ between versions; Web programmers still manage. (n+k)-patterns are nothing compared to that.
Lennart Augustsson wrote on 20.04.2009 15:17:
If every implementor got to choose what subset of the standard to implement, then all code would have to be written in the implemented intersection. I think that's a terrible idea. The Haskell98 standard was set so there would be a baseline that people could rely on.
When I implemented Haskell (both times) there were odds and ends that I really hated (some of those feelings have changed), but I did it anyway.
-- Lennart
On Mon, Apr 20, 2009 at 1:02 PM, Bulat Ziganshin
wrote: Hello Jon,
Monday, April 20, 2009, 1:59:07 PM, you wrote:
It's not an implementor's place to make such decisions -- they can legitimately say "this feature sucks" and tell the next Haskell committee so. If they care enough about it, they can lobby or get on that next committee, but the arguments for n+k patterns /in Haskell98/ were done long ago.
if you really believe in what you said, you can spend your own time adding its support :) i've never seen n+k patterns in real code, so i understand developers who don't want to waste time just to comply with the standard even if their efforts will never really be used
-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

David Leimbach wrote:
Just refuse to use UHC until it conforms. One can refuse to use GHC libraries that use extensions as well for similar reasons. I always think twice when I see something that isn't Haskell 98 in my stack.
Do you not use Hugs for the same reason? http://cvs.haskell.org/Hugs/pages/users_guide/haskell98.html#BUGS-HASKELL98 Martijn.

Just refuse to use UHC until it conforms. Do you not use Hugs for the same reason?
Not to mention that GHC does not comply with the H'98 standard either: http://www.haskell.org/ghc/docs/latest/html/users_guide/bugs-and-infelicitie... Regards, Malcolm

On Mon, Apr 20, 2009 at 8:38 AM, Malcolm Wallace < Malcolm.Wallace@cs.york.ac.uk> wrote:
Just refuse to use UHC until it conforms. Do you not use Hugs for the same reason?
Not to mention that GHC does not comply with the H'98 standard either:
http://www.haskell.org/ghc/docs/latest/html/users_guide/bugs-and-infelicitie...
Regards, Malcolm
It's still a matter of choice. So we're saying there are no implementations of Haskell 98? Sounds like the same problem C++ and C99 have. Dave

On Apr 20, 2009, at 5:44 PM, David Leimbach wrote:
On Mon, Apr 20, 2009 at 8:38 AM, Malcolm Wallace
wrote:
Just refuse to use UHC until it conforms. Do you not use Hugs for the same reason?
Not to mention that GHC does not comply with the H'98 standard either:
http://www.haskell.org/ghc/docs/latest/html/users_guide/bugs-and-infelicitie...
Regards, Malcolm
It's still a matter of choice. So we're saying there are no implementations of Haskell 98? Sounds like the same problem C++ and C99 have.
Dave
Why is this such a problem? You can still write and compile very beautiful programs with both GHC and UHC. And keep in mind that the UHC compiler is probably not designed to replace GHC or to be fully compliant with the Haskell 98 standard. It still has a very useful place in education and is certainly worth looking at. It makes intensive use of the attribute grammar system to perform traversals over the different internally used languages. This is a very different approach from what GHC does, which makes it very interesting from an educational and scientific point of view. I encourage you to take a look inside; it is reasonably easy to grasp what is going on inside UHC. -- Sebastiaan

On Mon, Apr 20, 2009 at 8:19 AM, Martijn van Steenbergen < martijn@van.steenbergen.nl> wrote:
David Leimbach wrote:
Just refuse to use UHC until it conforms. One can refuse to use GHC libraries that use extensions as well for similar reasons. I always think twice when I see something that isn't Haskell 98 in my stack.
Do you not use Hugs for the same reason?
http://cvs.haskell.org/Hugs/pages/users_guide/haskell98.html#BUGS-HASKELL98
Martijn.
I never use Hugs... the only time I've ever run Hugs was because it was available for Plan 9.

On Apr 20, 2009, at 10:46 , David Leimbach wrote:
Just refuse to use UHC until it conforms. One can refuse to use GHC libraries that use extensions as well for similar reasons. I always think twice when I see something that isn't Haskell 98 in my stack.
So you don't use hierarchical libraries? For that matter, *GHC* isn't fully Haskell98: http://www.haskell.org/ghc/docs/latest/html/users_guide/bugs-and-infelicitie... -- brandon s. allbery [solaris,freebsd,perl,pugs,haskell] allbery@kf8nh.com system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu electrical and computer engineering, carnegie mellon university KF8NH

"Brandon S. Allbery KF8NH"
On Apr 20, 2009, at 10:46 , David Leimbach wrote:
Just refuse to use UHC until it conforms. One can refuse to use GHC libraries that use extensions as well for similar reasons. I always think twice when I see something that isn't Haskell 98 in my stack.
So you don't use hierarchical libraries?
For that matter, *GHC* isn't fully Haskell98: http://www.haskell.org/ghc/docs/latest/html/users_guide/bugs-and-infelicities.html#haskell98-divergence
It's about compatibility. An implementation can diverge from the standard, but it should be able to parse standard code correctly. It is totally okay to add something to the standard, but not to remove something from it. By adding something, you wouldn't break standard-compliant code, but removing anything from the standard would prevent some standard code from working correctly. The only thing that would make standard code break in GHC is
let x = 42 in x == 42 == True
But I believe it is a much smaller issue compared to what UHC has. I believe the standard should be maintained as the intersection of all the valid implementations. If any implementation does not implement all of the standard, what should our coders write against? -- c/* __o/* <\ * (__ */\ <

Miguel Mitrofanov
Well, the problem is that every implementor does choose a subset of the standard to implement.
That's what I'm complaining about.
It's much worse in JavaScript - essential features working differently in Internet Explorer, Firefox, Opera, and Safari, and sometimes they even differ between versions; Web programmers still manage.
Strange example to choose. Have you any idea how much time is wasted because of the implementation differences in JavaScript?
(n+k)-patterns are nothing compared to that.
Since there is no need for /any/ differences in the implemented part of H98, we can, if we choose, have /the/ language where all this crap about "I'd better not use this part of the standard because the MuckWorx compiler doesn't implement it" doesn't apply.

This thread is really depressing (this is about the rest of the thread, not your post Miguel). We're rehearsing the arguments about n+k patterns that were gone through by the first Haskell committee and then rehashed by the H98 folks, when that's completely irrelevant -- n+k patterns are in H98, so implementors should implement them. It's arrogant and disrespectful on the part of the implementors to say that they know better than the committee what features should be part of the language. I'm reminded of decades ago when people talked about implementing "extended subsets" of this or that language. Perl is an extended subset of Haskell...

Concerning the suggestion that I should implement them, given that I'm against n+k patterns, I hardly think the effort should fall on me -- I'm not in the business of implementing Haskell at all at the moment. Or maybe I should be more pro-active. Here, using my favoured paradigm of "Advanced Reactive Software Engineering" (a successor to extreme programming and the like) is my Haskell 98 compiler (currently only implements a subset):

  module Main where
  main = error ("You have used an unimplemented feature of Haskell 98.\n\
                \Please submit a test case and patch to correct the deficiency\n")

-- Jón Fairbairn Jon.Fairbairn@cl.cam.ac.uk http://www.chaos.org.uk/~jf/Stuff-I-dont-want.html (updated 2009-01-31)

On 22 Apr 2009, at 13:07, Jon Fairbairn wrote:
Miguel Mitrofanov
writes: Well, the problem is that every implementor does choose a subset of standart to implement.
That's what I'm complaining about.
And that's exactly what you (or anybody else) can't do anything about (thank God for that).
It's much worse in JavaScript - essential features working differently in Internet Explorer, Firefox, Opera, and Safari, and sometimes they even differ between versions; Web programmers still manage.
Strange example to choose. Have you any idea how much time is wasted because of the implementation differences in JavaScript?
Actually, I have a pretty good idea - I used to be a Web programmer. Surprisingly, not much.
It's arrogant and disrespectful on the part of the implementors to say that they know better than the committee what features should be part of the language.
It's arrogant and disrespectful on the part of the committee to say that they know better than the implementors what features should they implement.

2009/04/22 Miguel Mitrofanov
It's arrogant and disrespectful on the part of the implementors to say that they know better than the committee what features should be part of the language.
It's arrogant and disrespectful on the part of the committee to say that they know better than the implementors what features should they implement.
So what is the committee there for? To approve helpful suggestions? -- Jason Dusek

On 22 Apr 2009, at 21:19, Jason Dusek wrote:
2009/04/22 Miguel Mitrofanov
: It's arrogant and disrespectful on the part of the implementors to say that they know better than the committee what features should be part of the language.
It's arrogant and disrespectful on the part of the committee to say that they know better than the implementors what features should they implement.
So what is the committee there for? To approve helpful suggestions?
To give advice(s). Like, for example, W3C.

On Wed, Apr 22, 2009 at 10:19 AM, Jason Dusek
2009/04/22 Miguel Mitrofanov
: It's arrogant and disrespectful on the part of the implementors to say that they know better than the committee what features should be part of the language.
It's arrogant and disrespectful on the part of the committee to say that they know better than the implementors what features should they implement.
So what is the committee there for? To approve helpful suggestions?
The fun of sharing code is that you get to deal with the peanut gallery... Oh right, that's why I don't share.
-- Jason Dusek

atze@cs.uu.nl schrieb:
Utrecht Haskell Compiler -- first release, version 1.0.0 ========================================================
The UHC team is happy to announce the first public release of the Utrecht Haskell Compiler (UHC).
Great to see another haskell compiler !
Features:
* Experimental language extensions, some of which have not been implemented before.
Maybe you could add a section 'Differences to GHC 6.10' to the manual?
From a quick look, partial type signatures and local instances seem to be 'novel features', and there are some differences with respect to type / kind inference.

Section 4.2 (Proposed in haskell prime but not in UHC) and Section 4.3 (Available in UHC but not in Haskell98 or Haskell prime) are a little bit confusing (and imho superfluous):
- Many language extensions have been 'proposed' for haskell prime (http://hackage.haskell.org/trac/haskell-prime/wiki/Status).
- MPTCs are accepted for haskell prime
- 'Existential Quantification' is accepted for haskell prime, but different from 'Existential Types' described in the manual
- Functional Dependencies are not available in UHC ;)

best regards, benedikt
participants (34)
- Achim Schneider
- Alexander Dunlap
- Antoine Latter
- atze@cs.uu.nl
- Benedikt Huber
- Brandon S. Allbery KF8NH
- Bulat Ziganshin
- Claus Reinke
- Colin Paul Adams
- Daniel Fischer
- David Leimbach
- david48
- Don Stewart
- Duncan Coutts
- Edward Middleton
- Gwern Branwen
- Jason Dagit
- Jason Dusek
- John A. De Goes
- Jon Fairbairn
- Jules Bean
- Lennart Augustsson
- Malcolm Wallace
- Martijn van Steenbergen
- Miguel Mitrofanov
- Richard O'Keefe
- Ross Paterson
- S. Doaitse Swierstra
- Sebastiaan Visser
- Sittampalam, Ganesh
- Stefan Holdermans
- Thomas Davie
- wren ng thornton
- Xiao-Yong Jin