

Andrew Coppin
Browsing around Hackage, I notice that a seemingly random subset of packages are available for something called "arch linux". Presumably some sort of automatic conversion system is involved, but does anyone know why only certain packages appear?
I've noticed that both Debian and OpenSUSE have a very tiny selection of binary Haskell packages too. I'm guessing that these packages are also auto-generated, but presumably selected by hand. (I also don't recall seeing them listed on Hackage.) Anybody know about that?
In general, is there an advantage to having native packages for Haskell things? I guess it means you can have binary packages, so you don't need to build from source. And for executables, it means the native package manager can track all the dependencies and install them all for you, potentially without needing a Haskell build environment at all. Is that it, or have I missed something?
Hackage has limited support for distro maintainers to state which packages are available on the distribution. Last I checked, it required distro maintainers to keep a text file somewhere up to date.

Note that not all distributions bother (in particular, none of us involved with packaging Haskell packages for Gentoo can be bothered; we're slowly cutting back to keeping only the packages that will actually be used, rather than all and sundry), and even those that do might just list what's in the official repository (I think Arch does this). Even then, Don Stewart has a policy of packaging all and sundry for Arch (at least in the unofficial repository; this includes packages such as haskell-updater that are written for Gentoo).

As for why using your distro package manager for Haskell packages is preferable: http://ivanmiljenovic.wordpress.com/2010/03/15/repeat-after-me-cabal-is-not-...

-- Ivan Lazar Miljenovic
Ivan.Miljenovic@gmail.com
IvanMiljenovic.wordpress.com

Ivan Lazar Miljenovic wrote:
Hackage has limited support for distro maintainers to state which packages are available on the distribution. Last I checked, it required distro maintainers to keep a text file somewhere up to date.
Note that not all distributions bother.
Yeah, I figured. I don't see any Debian or OpenSUSE anywhere, and I know they do have at least a few pre-built binary packages out there. It looks as if it's automated for Arch, however. Either that or somebody is spending an absurd amount of time keeping it manually up to date.
(in particular none of us involved with packaging Haskell packages for Gentoo can be bothered; we're slowly cutting back into only keeping packages that will actually be used rather than all and sundry)
Well, I guess you either manually select which packages to convert, or you have an automated system convert everything in sight. This whole observation came about because I noticed that some (but not all) of my own packages have ended up on Arch, despite being of almost no use to anybody. I was just curious as to how that happened.
As for why using your distro package manager for Haskell packages is preferable: http://ivanmiljenovic.wordpress.com/2010/03/15/repeat-after-me-cabal-is-not-...
Right. So Cabal isn't a package manager because it only manages Haskell packages? Not sure I agree with that definition. (It also has a laundry list of problems that can and should be fixed, but won't be.)

I actually spent quite a while trying to figure out what the purpose of Cabal *is*. It's not as if it's hard to download a bunch of Haskell source code and utter "ghc --make Foo". So why do we even need Cabal in the first place? The answer, as far as I can tell, is that registering a library manually is so excruciatingly hard that we actually need a tool to automate the process. (Obviously when I first started using Haskell, I was mainly interested in writing runnable programs, not libraries.) Cabal can also run Haddock for you, which is quite hard to do by hand. But it wasn't until cabal-install came along that I even realised that Cabal could track and resolve dependencies. (The fact that it doesn't track installed executables is news to me.)

If nothing else, I think that "what Cabal is" should be documented much more clearly. It took me a hell of a long time to figure this out.

Now, you say it's preferable to use the native package manager where possible. I've got one word for you: Windows. You know, the most popular OS on the market? The one installed on 98% of all computers world-wide? Guess what: no native package manager.

Actually, we have tools that automatically convert Cabal packages to Debian packages or RPMs or whatever. I think there could be some mileage in a tool that builds Windows installers. (The problem, of course, is that you have to be able to *build* the library on Windows first!) You would of course then have all kinds of fun and games with dependency tracking...

Andrew Coppin
Ivan Lazar Miljenovic wrote:
Hackage has limited support for distro maintainers to state which packages are available on the distribution. Last I checked, it required distro maintainers to keep a text file somewhere up to date.
Note that not all distributions bother.
Yeah, I figured. I don't see any Debian or OpenSUSE anywhere, and I know they do have at least a few pre-built binary packages out there.
It looks as if it's automated for Arch, however. Either that or somebody is spending an absurd amount of time keeping it manually up to date.
Don has probably written a script to poll which packages are available and write them to the required file.
(in particular none of us involved with packaging Haskell packages for Gentoo can be bothered; we're slowly cutting back into only keeping packages that will actually be used rather than all and sundry)
Well, I guess you either manually select which packages to convert, or you have an automated system convert everything in sight.
This whole observation came about because I noticed that some (but not all) of my own packages have ended up on Arch, despite being of almost no use to anybody. I was just curious as to how that happened.
The "convert everything in sight" approach.
As for why using your distro package manager for Haskell packages is preferable: http://ivanmiljenovic.wordpress.com/2010/03/15/repeat-after-me-cabal-is-not-...
Right. So Cabal isn't a package manager because it only manages Haskell packages? Not sure I agree with that definition. (It also has a laundry list of problems that can and should be fixed, but won't be.)
Well, Cabal is just a build library; cabal-install automates downloading and building of Haskell packages, but only partially: it can't uninstall, can't upgrade cleanly, etc. Part of this is that it has bugs and can be improved (uninstallation, for instance). However, I don't think a per-language build tool like cabal-install, rubygems, etc. can really be called a "package manager".
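The "build library" point is visible in the conventional Setup.hs that every Simple-build-type package carries; the entire build system boils down to one call into the Cabal library:

```haskell
-- The conventional Setup.hs for a package with build-type "Simple":
-- all the real work lives in the Cabal library's defaultMain.
import Distribution.Simple (defaultMain)

main :: IO ()
main = defaultMain
```

One then drives it with `runhaskell Setup.hs configure`, `runhaskell Setup.hs build`, and `runhaskell Setup.hs install`, or lets cabal-install issue those steps for you.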
I actually spent quite a while trying to figure out what the purpose of Cabal *is*. It's not like it's hard to download a bunch of Haskell source code and utter "ghc --make Foo". So why do we even need Cabal in the first place? The answer, as far as I can tell, is that registering a library manually is so excruciatingly hard that we actually need a tool to automate the process. (Obviously when I first started using Haskell, I was mainly interested in writing runnable programs, not libraries.) Cabal can also run Haddock for you, which is quite hard. But it wasn't until cabal-install came along that I even realised that Cabal could track and resolve dependencies. (The fact that it doesn't track installed executables is news to me.)
Well, Cabal (more specifically the actual .cabal file) tells you a few things:

1) Metadata on the library (homepage, description, version, etc.).
2) A list of dependencies that the package needs; this is especially important when considering something like mtl vs monads-{fd,tf}: which library did you grab Control.Monad.State from?
3) Specification of available library and executables.
4) Available modules for libraries, as well as those not actually visible externally.
5) The ability to create tarballs for distribution (whilst VCS tools like darcs can also do this, you sometimes keep some files in version control that shouldn't be shipped with the release tarball).
6) Flags to change the behaviour and what to build, and the ability to do some auto-detection to be able to have different build options for Windows, etc.

All in all, Cabal serves two different aims:

* It provides metadata on what a Haskell package is.
* It's a very simplistic build system for Haskell packages.
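A minimal, hypothetical .cabal file (package name and modules invented for illustration) shows where each of those six points lives:

```
-- A hypothetical package "foo", illustrating the points above.
-- (1) metadata:
name:          foo
version:       0.1
synopsis:      Example of what a .cabal file records
homepage:      http://example.com/foo
build-type:    Simple
cabal-version: >= 1.6

-- (5) shipped in "cabal sdist" tarballs, but not installed:
extra-source-files: README

-- (6) a flag users can toggle at configure time:
flag gui
  description: Build the GUI frontend
  default:     True

library
  -- (4) Foo is visible to users; Foo.Internal is compiled but hidden:
  exposed-modules: Foo
  other-modules:   Foo.Internal
  -- (2) pin down *which* Control.Monad.State we mean:
  build-depends:   base >= 4 && < 5, mtl >= 1.1

-- (3) an executable alongside the library:
executable foo-cli
  main-is:       Main.hs
  build-depends: base
  if flag(gui) && !os(windows)
    build-depends: gtk
```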
If nothing else, I think that "what Cabal is" should be documented much more clearly. It took me a hell of a long time to figure this out.
Now, you say it's preferable to use the native package manager where possible. I've got one word for you: Windows. You know, the most popular OS on the market? The one installed on 98% of all computers world-wide? Guess what: no native package manager.
Right, and that's when tools like cabal-install can come in handy (but I believe there is a new-ish attempt at writing a package manager for Windows over mingw).
Actually, we have tools that automatically convert Cabal packages to Debian packages or RPMs or whatever. I think there could be some milage in a tool that builds Windows installers. (The problem, of course, is that you have to be able to *build* the library on Windows first!) You would of course then have all kinds of fun and games with dependency tracking...
Right, but again note the distinction: Cabal /= cabal-install; most people call "cabal-install" a package manager. The distinction is analogous to (but not quite the same as) RPM/deb vs yum/apt on Fedora/Debian.

-- Ivan Lazar Miljenovic
Ivan.Miljenovic@gmail.com
IvanMiljenovic.wordpress.com

Now, you say it's preferable to use the native package manager where possible. I've got one word for you: Windows. You know, the most popular OS on the market? The one installed on 98% of all computers world-wide? Guess what: no native package manager.
Isn't Windows Installer (MSI) a package manager?
/J
_______________________________________________ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe

Jonas Almström Duregård
Now, you say it's preferable to use the native package manager where possible. I've got one word for you: Windows. You know, the most popular OS on the market? The one installed on 98% of all computers world-wide? Guess what: no native package manager.
Isn't Windows Installer (MSI) a package manager?
Can you say "please download and install darcs for me, and take care of any dependencies it might need whilst you're at it" with MSI? MSI is more analogous to RPM/deb packages than to yum/apt.

-- Ivan Lazar Miljenovic
Ivan.Miljenovic@gmail.com
IvanMiljenovic.wordpress.com

On 08/22/2010 07:19 AM, Jonas Almström Duregård wrote:
Now, you say it's preferable to use the native package manager where possible. I've got one word for you: Windows. You know, the most popular OS on the market? The one installed on 98% of all computers world-wide? Guess what: no native package manager.
Isn't Windows Installer (MSI) a package manager?
No, the Windows and OSX installers are just that: installers. They provide no facilities for finding packages, identifying the package a given file came from, or dependency tracking. OSX's installer doesn't even have uninstall support; it records the installed files, but provides no mechanism for undoing configuration changes such as removing package-installed kernel modules.

It looks as if it's automated for Arch, however. Either that or somebody is spending an absurd amount of time keeping it manually up to date.
Yeah, it's automated; Don Stewart made a script to do that. http://archhaskell.wordpress.com/

-- Ivan Sichmann Freitas
Engenharia de Computação 2009 UNICAMP
http://identi.ca/ivansichmann
Grupo Pró Software Livre UNICAMP - GPSL

On 08/22/2010 06:41 AM, Andrew Coppin wrote:
Ivan Lazar Miljenovic wrote:
Hackage has limited support for distro maintainers to state which packages are available on the distribution. Last I checked, it required distro maintainers to keep a text file somewhere up to date.
Note that not all distributions bother.
It looks as if it's automated for Arch, however. Either that or somebody is spending an absurd amount of time keeping it manually up to date.
Last I heard, dons had a thing he ran that converted Cabal packages into Arch Linux packages automatically. I wouldn't be surprised if he had something that automated the whole procedure from Hackage download to Arch Linux upload.
As for why using your distro package manager for Haskell packages is
preferable:
http://ivanmiljenovic.wordpress.com/2010/03/15/repeat-after-me-cabal-is-not-...
Right. So Cabal isn't a package manager because it only manages Haskell packages? Not sure I agree with that definition. (It also has a laundry list of problems that can and should be fixed, but won't be.)
So, remember your gripes elsethread about libcurl on Windows? That's what using a real package manager gets you: the non-Haskell dependencies are also handled and (assuming the packager isn't an idiot) Just Work.
I actually spent quite a while trying to figure out what the purpose of Cabal *is*. It's not like it's hard to download a bunch of Haskell source code and utter "ghc --make Foo". So why do we even need Cabal in the first place? The answer, as far as I can tell, is that registering a library manually is so excruciatingly
The answer is that it checks dependencies for you. This is a mixed blessing, however (see "cabal upgrade").
Actually, we have tools that automatically convert Cabal packages to Debian packages or RPMs or whatever. I think there could be some milage in a tool that builds Windows installers. (The problem, of course, is that you have to be able to *build* the library on Windows first!) You would of course then have all kinds of fun and games with dependency tracking...
And the big problem with Windows is an utter lack of consistency in package arrangement. Linux has the FHS; *BSD and the OSX environments (Fink and MacPorts) have mtree specifications; where did the Windows library you just installed decide to stick its files?

On Aug 22, 2010, at 3:41 AM, Andrew Coppin wrote:
It looks as if it's automated for Arch, however. Either that or somebody is spending an absurd amount of time keeping it manually up to date.
It probably is automated. There's a tool out there called "cabal2arch", which turns a cabal file into a PKGBUILD file. They are similar enough that translation can be done automatically, assuming there aren't any special needs. (For example, I doubt the gtk2 haskell package would work with cabal2arch)
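To give a feel for why the translation is mostly mechanical, here is roughly the shape of PKGBUILD that such a tool could emit. All names and fields here are illustrative guesses, not cabal2arch's exact output:

```shell
# Illustrative sketch of an auto-generated Arch PKGBUILD for a
# hypothetical Haskell package "foo" (not cabal2arch's real output).
pkgname=haskell-foo
pkgver=0.1
pkgrel=1
pkgdesc="Arch package generated from foo.cabal"
arch=('i686' 'x86_64')
url="http://hackage.haskell.org/package/foo"
depends=('ghc')
source=("http://hackage.haskell.org/packages/archive/foo/$pkgver/foo-$pkgver.tar.gz")

build() {
  cd "$srcdir/foo-$pkgver"
  # The standard Cabal build sequence, redirected into the package root:
  runhaskell Setup configure --prefix=/usr
  runhaskell Setup build
  runhaskell Setup copy --destdir="$pkgdir"
}
```

Most of these fields map one-to-one onto .cabal metadata (name, version, synopsis, homepage, dependencies), which is what makes the automatic conversion feasible until a package has special needs.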

On Wed, Aug 25, 2010 at 12:07 PM, Alexander Solla wrote:
On Aug 22, 2010, at 3:41 AM, Andrew Coppin wrote:
It looks as if it's automated for Arch, however. Either that or somebody
is spending an absurd amount of time keeping it manually up to date.
It probably is automated. There's a tool out there called "cabal2arch", which turns a cabal file into a PKGBUILD file. They are similar enough that translation can be done automatically, assuming there aren't any special needs. (For example, I doubt the gtk2 haskell package would work with cabal2arch)
It is automated in a manual way =D. By that I mean that there is a script which autobuilds packages with cabal2arch, but that script itself has to be manually run.

In all honesty, I believe the best policy is to have base packages (and other 'required' ones) installed through the package manager of the distro, and libraries installed through cabal. Haskell binaries then use cabal to check dependencies to see if the binary can be built. (That is what I do now, and I have no issues with it.)

On 25 August 2010 13:40, Mathew de Detrich wrote:
It is automated in a manual way =D. By that I mean that there is a script which autobuilds packages with cabal2arch, however that script itself has to be manually run. In all honesty I believe the best policy is to have base packages (and other 'required') be installed through the package manager of the distro and libraries installed through cabal. Haskell binaries then use cabal to check dependencies to see if the binary can be built (that is what I do now and I have no issues with it)
Consider these scenarios:

1) You upgrade package foo; this breaks a large number of other packages. How do you deal with it?
2) You upgrade GHC. You now have to manually re-build all packages that you had built with the previous version of GHC.
3) You want to uninstall some Haskell packages.
4) You built a package with non-standard build options; cabal-install keeps wanting to rebuild it with the defaults.
5) You don't want to wait for a package maintainer to loosen the dependencies of a package you know works with a newer version of a dependency.
6) You want to install package bar; it fails to build due to some missing C library/build tool/etc. You have to dig around and work out which system package contains that C library (or which Haskell package contains that build tool, etc.) and install that first.

Now, some future version of cabal-install may in fact solve the first four problems by automating them and keeping track of installed packages itself rather than relying on ghc-pkg. It will _never_ be able to solve the latter two (OK, it might be that someone adds functionality to add "tweaks" to a package at configure/build/etc. time).

Note also that if there is some trivial failure with a package not building against a newer version of a dependency or GHC (e.g. the last monolithic release of gtk2hs built against GHC 6.12.1 but not 6.12.2), then in many cases it's easier and faster for the person maintaining the distribution's packages to apply a fix than to wait for upstream to release a new version.

If you think the Arch packages for Haskell are so bad/out of date, why not do something about that and help maintain them rather than just whinge?

-- Ivan Lazar Miljenovic
Ivan.Miljenovic@gmail.com
IvanMiljenovic.wordpress.com

On Wed, Aug 25, 2010 at 1:59 PM, Ivan Lazar Miljenovic <ivan.miljenovic@gmail.com> wrote:
On 25 August 2010 13:40, Mathew de Detrich wrote:
It is automated in a manual way =D. By that I mean that there is a script which autobuilds packages with cabal2arch, however that script itself has to be manually run. In all honesty I believe the best policy is to have base packages (and other 'required') be installed through the package manager of the distro and libraries installed through cabal. Haskell binaries then use cabal to check dependencies to see if the binary can be built (that is what I do now and I have no issues with it)
Consider these scenarios:
1) You upgrade package foo; this breaks a large number of other packages. How do you deal with it?
That's what happened when I was using Arch Linux for libraries; nothing has broken with cabal. And if cabal somehow does break when installing libraries, I know where all the libraries are (in one folder), so it's easy to fix (just delete the folder and unregister the packages) instead of having to do it through a package manager.
2) You upgrade GHC. You now have to manually re-build all packages that you had built with the previous version of GHC.
Base libraries that came with GHC I left alone (as I mentioned earlier). Packages such as parallel and array are installed by GHC and left alone.
3) You want to uninstall some Haskell packages.
Blame cabal for not having "cabal uninstall" =D. Also, you can unregister the package through GHC (to check that nothing else depends on it) and delete the library folder. (That's what uninstalling the AUR packages does anyway; it's just that instead of you deleting the library folders, the files get removed through the package manager.)
4) You built a package with non-standard build options; cabal-install keeps wanting to rebuild it with the defaults.
Haven't had this issue yet; can you be more specific? (gtk, for example, installed fine through cabal-install, and I believe that's regarded as a non-standard install.) Remember that there is nothing stopping me from installing Arch Linux AUR packages if for whatever reason cabal-install doesn't work, and that's highly unlikely, because AUR packages are built by using runhaskell configure/build etc. in the first place.
5) You don't want to wait for a package maintainer to loosen the dependencies of a package you know works with a newer version of a dependency.
This is also an issue in AUR, so AUR ain't any better
6) You want to install package bar; it fails to build due to some missing C library/build tool/etc. You have to dig around and work out which system package contains that C library/Haskell package contains that build tool/etc. and install that first.
I installed two packages that depend on C libraries (wx and gtk) without any issues (through cabal-install). cabal-install works with C libraries the same way it does when you do runhaskell configure/build
Now, some future version of cabal-install may in fact solve the first four problems by automating them and keeping track of installed packages itself rather than relying on ghc-pkg. It will _never_ be able to solve the latter two (OK, it might be that someone adds functionality to add "tweaks" to a package at configure/build/etc. time).
I never had these problems; if there is a problem with cabal-install, it will also be a problem with Arch Linux AUR packages, because cabal-install does configure/build in its process anyway. The only exception is binaries, and as stated before, I do install them through Arch Linux AUR.
Note also that if there is some trivial failure with a package not building against a newer version of a dependency or GHC (e.g. the last monolithic release of gtk2hs built against GHC 6.12.1 but not 6.12.2), then in many cases it's easier and faster for the person maintaining the distribution's packages to apply a fix than wait for upstream to release a new version.
If you think the Arch packages for Haskell are so bad/out of date, why not do something about that and help maintain them rather than just whinge?
-- Ivan Lazar Miljenovic
Ivan.Miljenovic@gmail.com
IvanMiljenovic.wordpress.com

On 25 August 2010 14:36, Mathew de Detrich wrote:
On Wed, Aug 25, 2010 at 1:59 PM, Ivan Lazar Miljenovic wrote:
Consider these scenarios:
1) You upgrade package foo; this breaks a large number of other packages. How do you deal with it?
Thats what happened when I was using archlinux for libraries, nothing has broken with cabal. And if cabal somehow does break when installing libraries, I know where all the libraries are (in one folder) so its easy to fix (just delete folder, unregister packages) instead of having to do it through a package manager
Ummm, why should you need to unregister packages, etc.? I'm talking about a situation where bar depends on foo-x.y.*, and you upgrade foo from x.y.1 to x.y.2 (and "ghc-pkg check" will then complain about bar being broken).
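The mechanics of this kind of breakage can be sketched with a toy example (package names and versions invented; this is not GHC's actual bookkeeping): an installed package records the exact versions of its dependencies, so replacing foo-0.1.1 with foo-0.1.2 leaves bar pointing at a package that no longer exists.

```haskell
-- Toy illustration of why "ghc-pkg check" reports bar as broken after
-- upgrading foo: bar recorded exact dependency versions at build time.
import Data.List (intercalate)

type Version = [Int]

showV :: Version -> String
showV = intercalate "." . map show

-- What bar recorded when it was built:
barDeps :: [(String, Version)]
barDeps = [("base", [4,2,0,0]), ("foo", [0,1,1])]

-- What is installed now (foo was upgraded to 0.1.2):
installed :: [(String, Version)]
installed = [("base", [4,2,0,0]), ("foo", [0,1,2])]

-- Dependencies of bar that no longer exist:
broken :: [(String, Version)]
broken = [ d | d <- barDeps, d `notElem` installed ]

main :: IO ()
main = mapM_ report broken
  where report (n, v) = putStrLn (n ++ "-" ++ showV v ++ " missing: bar is broken")
```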
2) You upgrade GHC. You now have to manually re-build all packages that you had built with the previous version of GHC.
Base libraries that came with GHC I left alone (as I mentioned earlier). Packages such as parallel and array are installed by GHC and left alone.
But all the other libraries that you built with cabal-install?
3) You want to uninstall some Haskell packages.
Blame cabal for not having cabal uninstall =D. Also you can unregister the package through GHC (to check if it doesn't depend on anything) and delete the library folder (thats what uninstalling the aur packages does anyways, just instead of deleting the library folders the files get removed through the package manager)
Well, yes, but that's my point: the package manager handles that.
4) You built a package with non-standard build options; cabal-install keeps wanting to rebuild it with the defaults.
Haven't had this issue yet; can you be more specific? (gtk, for example, installed fine through cabal-install, and I believe that's regarded as a non-standard install.) Remember that there is nothing stopping me from installing Arch Linux AUR packages if for whatever reason cabal-install doesn't work, and that's highly unlikely, because AUR packages are built by using runhaskell configure/build etc. in the first place.
e.g. you want to build gtk with --flag=-have-gio ; next time you want to build anything with cabal-install that uses gtk, cabal-install will want to re-build gtk and reset that option to --flag=have-gio (the default).
5) You don't want to wait for a package maintainer to loosen the dependencies of a package you know works with a newer version of a dependency.
This is also an issue in AUR, so AUR ain't any better
Well, if you help out maintaining AUR you can then go and do said loosening yourself.
6) You want to install package bar; it fails to build due to some missing C library/build tool/etc. You have to dig around and work out which system package contains that C library/Haskell package contains that build tool/etc. and install that first.
I installed two packages that depend on C libraries (wx and gtk) without any issues (through cabal-install). cabal-install works with C libraries the same way it does when you do runhaskell configure/build
*sigh* My point is, you have to know to install the C libraries first. cabal-install can bitch that a C library isn't installed, but you then have to work out what the proper package name is and install it. Some more concrete examples:

* At some stage the HEAD version of darcs had a dependency on the "icuuc" library; it took me a while to work out that the corresponding Gentoo package for this library was dev-libs/icu.

* I wrote and maintain SourceGraph; someone who used cabal-install to install it on a (I think) Ubuntu box emailed me to state that it came up with a lot of broken handle error messages rather than generating the report as required (I thought I had caught those exceptions; I'm still trying to work out where to catch them to produce a better message). The reason? He didn't have Graphviz (as in www.graphviz.org) installed, which my graphviz library (which SourceGraph uses) needs to actually render the DotGraph values into images. This isn't caught at build time, as it's a run-time dependency (i.e. it isn't used as a library but as a command-line tool to run). A proper distro package should have that dependency in there (e.g. http://code.haskell.org/gentoo/gentoo-haskell/dev-haskell/graphviz/graphviz-... ).
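A run-time tool dependency like this can at least be detected up front rather than surfacing as an obscure handle error. A minimal sketch (the error wording is mine, not SourceGraph's):

```haskell
-- Sketch: check for a required external program (here Graphviz's "dot")
-- on $PATH before shelling out to it.
import System.Directory (findExecutable)

-- Pure helper: turn a lookup result into an optional error message.
missingToolMsg :: String -> Maybe FilePath -> Maybe String
missingToolMsg tool Nothing  = Just ("required external program '" ++ tool ++ "' not found on $PATH")
missingToolMsg _    (Just _) = Nothing

main :: IO ()
main = do
  mdot <- findExecutable "dot"
  case missingToolMsg "dot" mdot of
    Just err -> putStrLn err   -- a real tool might exit with failure here
    Nothing  -> putStrLn "dot found; safe to render graphs"
```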
Now, some future version of cabal-install may in fact solve the first four problems by automating them and keeping track of installed packages itself rather than relying on ghc-pkg. It will _never_ be able to solve the latter two (OK, it might be that someone adds functionality to add "tweaks" to a package at configure/build/etc. time).
I never had these problems, if there is a problem with cabal-install it will be a problem with Archlinux AUR packages, because cabal-install does configure/build in its process anyways. The only exception is binaries, and as stated before I do install them through Archlinux AUR
Then you mustn't be doing much with Haskell, and despite your previous complaints, you mustn't upgrade many low-level Haskell libraries often. I get broken packages all the time; that's why I gave up and wrote haskell-updater for Gentoo to automate re-building of broken packages (essentially, it takes the output of "ghc-pkg check", matches the listed packages with the corresponding distro packages, and then tells the package manager to rebuild them).

-- Ivan Lazar Miljenovic
Ivan.Miljenovic@gmail.com
IvanMiljenovic.wordpress.com
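The first step of that pipeline, pulling broken package names out of "ghc-pkg check" output, can be sketched in a few lines. The message format assumed here ("There are problems in package foo-1.0:") is an approximation of ghc-pkg's real output; haskell-updater itself is more defensive:

```haskell
-- Rough sketch of haskell-updater's first step: extract the names of
-- broken packages from "ghc-pkg check" output.
import Data.List (stripPrefix)
import Data.Maybe (mapMaybe)

brokenPackages :: String -> [String]
brokenPackages = map (takeWhile (/= ':')) . mapMaybe (stripPrefix prefix) . lines
  where
    -- Assumed header line that ghc-pkg prints for each broken package:
    prefix = "There are problems in package "

main :: IO ()
main = mapM_ putStrLn (brokenPackages sample)
  where
    sample = unlines
      [ "There are problems in package bar-0.1:"
      , "  dependency \"foo-0.1.1\" doesn't exist"
      , "There are problems in package baz-2.0:"
      , "  dependency \"foo-0.1.1\" doesn't exist"
      ]
```

The remaining steps (mapping each name to a distro package and invoking the package manager) are where the distro-specific work lives.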

On 22/08/2010 11:41, Andrew Coppin wrote:
Ivan Lazar Miljenovic wrote:
Hackage has limited support for distro maintainers to state which packages are available on the distribution. Last I checked, it required distro maintainers to keep a text file somewhere up to date.
Note that not all distributions bother.
Yeah, I figured. I don't see any Debian or OpenSUSE anywhere, and I know they do have at least a few pre-built binary packages out there.
It looks as if it's automated for Arch, however. Either that or somebody is spending an absurd amount of time keeping it manually up to date.
(in particular none of us involved with packaging Haskell packages for Gentoo can be bothered; we're slowly cutting back into only keeping packages that will actually be used rather than all and sundry)
Well, I guess you either manually select which packages to convert, or you have an automated system convert everything in sight.
This whole observation came about because I noticed that some (but not all) of my own packages have ended up on Arch, despite being of almost no use to anybody. I was just curious as to how that happened.
As for why using your distro package manager for Haskell packages is preferable: http://ivanmiljenovic.wordpress.com/2010/03/15/repeat-after-me-cabal-is-not-...
Right. So Cabal isn't a package manager because it only manages Haskell packages? Not sure I agree with that definition. (It also has a laundry list of problems that can and should be fixed, but won't be.)
I actually spent quite a while trying to figure out what the purpose of Cabal *is*. It's not like it's hard to download a bunch of Haskell source code and utter "ghc --make Foo". So why do we even need Cabal in the first place? The answer, as far as I can tell, is that registering a library manually is so excruciatingly hard that we actually need a tool to automate the process. (Obviously when I first started using Haskell, I was mainly interested in writing runnable programs, not libraries.) Cabal can also run Haddock for you, which is quite hard. But it wasn't until cabal-install came along that I even realised that Cabal could track and resolve dependencies. (The fact that it doesn't track installed executables is news to me.)
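For readers who have never tried it, "registering a library manually" means doing something like the following by hand (a sketch; the exact flags and InstalledPackageInfo fields vary by GHC version, and the paths here are made up). Cabal automates all of this:

```shell
# Sketch: manually building and registering a Haskell library with ghc-pkg.
# Paths and the package name "foo" are illustrative only.
ghc -c Foo.hs
ar cqs libHSfoo-0.1.a Foo.o
# Write a package description for ghc-pkg by hand...
cat > foo.conf <<EOF
name: foo
version: 0.1
exposed-modules: Foo
import-dirs: /usr/local/lib/foo
library-dirs: /usr/local/lib/foo
hs-libraries: HSfoo-0.1
EOF
# ...and register it with GHC's package database.
ghc-pkg register foo.conf
```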
If nothing else, I think that "what Cabal is" should be documented much more clearly. It took me a hell of a long time to figure this out.
Now, you say it's preferable to use the native package manager where possible. I've got one word for you: Windows. You know, the most popular OS on the market? The one installed on 98% of all computers world-wide? Guess what: no native package manager.
Actually, we have tools that automatically convert Cabal packages to Debian packages or RPMs or whatever. I think there could be some mileage in a tool that builds Windows installers. (The problem, of course, is that you have to be able to *build* the library on Windows first!) You would of course then have all kinds of fun and games with dependency tracking...
If you look at the original Cabal design document [1], you'll see that one of the goals of Cabal was to be the glue that lets you convert an arbitrary Haskell library into a native package for a variety of systems - including MSIs on Windows. Indeed, I must admit when we were designing Cabal I thought that native packages would be the most common way that people would install Cabal packages, specifically because many systems already have a good package manager, and trying to bypass the system package manager would be a fundamental mistake. It turned out that cabal-install would be a lot more useful than I imagined, but the two systems are complementary: native packages are for installing globally, and cabal-install is for installing packages in your home directory.

Even on systems without a package manager (i.e. Windows), it would make more sense when installing a package globally to build an MSI first, so that the system can track the installation and let you uninstall it later. There was a prototype Windows installer builder for Cabal, cabal2wix [2], but I think the project is currently dormant.

[1] http://www.haskell.org/cabal/proposal/
[2] http://www.haskell.org/pipermail/cabal-devel/2007-August/000740.html

Cheers,
Simon

marlowsd:
If you look at the original Cabal design document[1], you'll see that one of the goals of Cabal was to be the glue that lets you convert an arbitrary Haskell library into a native package for a variety of systems - including MSIs on Windows. Indeed, I must admit when we were designing Cabal I thought that native packages would be the most common way that people would install Cabal packages, specifically because many systems already have a good package manager, and trying to bypass the system package manager would be a fundamental mistake. It turned out that cabal-install would be a lot more useful than I imagined, but the two systems are complementary: native packages are for installing globally, and cabal-install is for installing packages in your home directory.
We also didn't know that Hackage would get so big, so quickly. So there are three levels of packages now:

1. absolutely vital: HP (now on every system)
2. native packaging of useful Haskell apps and libs (many on Debian, Arch, Gentoo, few elsewhere)
3. cabal-install: everything else, works everywhere.

And it looks like many distros are leaning towards just providing 1 natively. Those with more automation (Debian, Arch) do 2 as well, though it is less useful than we thought now that cabal-install is relatively stable.

A new trend is tools like 'bauerbill' on Arch, which has a --hackage flag that converts Hackage packages to native packages on the fly. That's like teaching apt to grok Hackage.

It's interesting how it's all sorting out.

-- Don

On Thu, Aug 26, 2010 at 5:02 AM, Don Stewart
marlowsd:
If you look at the original Cabal design document[1], you'll see that one of the goals of Cabal was to be the glue that lets you convert an arbitrary Haskell library into a native package for a variety of systems - including MSIs on Windows. Indeed, I must admit when we were designing Cabal I thought that native packages would be the most common way that people would install Cabal packages, specifically because many systems already have a good package manager, and trying to bypass the system package manager would be a fundamental mistake. It turned out that cabal-install would be a lot more useful than I imagined, but the two systems are complementary: native packages are for installing globally, and cabal-install is for installing packages in your home directory.
We also didn't know that Hackage would get so big, so quickly. So there's three levels of packages now:
1. absolutely vital: HP (now on every system) 2. native packaging of useful Haskell apps and libs (many on Debian, Arch, Gentoo, few elsewhere) 3. cabal-install: everything else, works everywhere.
And it looks like many distros are leaning towards just providing 1 natively. Those with more automation (Debian, Arch) do 2 as well, though it is less useful than we thought now that cabal-install is relatively stable.
A new trend is tools like 'bauerbill' on Arch, which has a --hackage flag that converts Hackage packages to native packages on the fly. That's like teaching apt to grok Hackage.
It's interesting how its all sorting out.
-- Don _______________________________________________ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe
Library packages are only interesting to me when it comes to application deployment. Things are greatly simplified when you DO NOT have shared libraries to contend with, because the only person tracking dependencies is the developer.

Go, for example, has no shared libraries, and the runtime fits in every binary. It does not even depend on libc. Go binaries call the system call interface of the kernel, and the net result is that I get to test my Go code, deploy it, and not worry about the state of deployed Go environments quite so much as I do in the presence of shared libraries.

As such I think cabal-install is excellent in that it installs in the developer's home directory, because that's all I need in other environments as well. It's quite practical. People are obsessed with shared library support but I can not for the life of me figure out why.

Dave

David Leimbach wrote:
It's quite practical. People are obsessed with shared library support but I can not for the life of me figure out why.
Maybe because a simple Hello World program in Haskell becomes about 2MB when compiled? (The equivalent C program ends up being 15KB or something, which is just a tad smaller.)

It does make a difference in certain cases. For a 2MB binary to be trivial, it assumes that (1) you are in a developed country and (2) you are using a landline internet connection and not going through your cell-phone company, although this gap is closing fast.

I feel this whenever I visit India. Most people buy into a data option through their cell phones, and the available bandwidth is about an order of magnitude *below* DSL. So that's one case where 15KB vs. 2MB is a big deal.

-deech
On Thu, Aug 26, 2010 at 10:50 AM, Andrew Coppin wrote:

David Leimbach wrote: It's quite practical. People are obsessed with shared library support but I can not for the life of me figure out why.

Maybe because a simple Hello World program in Haskell becomes about 2MB when compiled? (The equivalent C program ends up being 15KB or something, which is just a tad smaller.)

On 8/26/10 10:23, David Leimbach wrote:
Go, for example, has no shared libraries, and the runtime fits in every binary. It does not even depend on libc. Go binaries call the system call interface of the kernel, and the net result is that I get to test my go code, deploy it, and not worry about the state of deployed go environments quite so much as I do in the presence of shared libraries.
Um. That's a really good way to have all your programs stop working when the Linux kernel interface changes yet again ("ABIs? We don't need no steenking ABIs!" --- see /usr/src/linux/Documentation). Solaris is similar; the only approved interface is via libc, and you must link to it shared if you want your program to work across versions/releases.

(Which is the reason shared library support is important. I personally like my programs to keep working.)

-- brandon s. allbery [linux,solaris,freebsd,perl] allbery@kf8nh.com
system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu
electrical and computer engineering, carnegie mellon university KF8NH

On Thu, Aug 26, 2010 at 11:11 AM, Brandon S Allbery KF8NH < allbery@ece.cmu.edu> wrote:
On 8/26/10 10:23 , David Leimbach wrote:
Go, for example, has no shared libraries, and the runtime fits in every binary. It does not even depend on libc. Go binaries call the system call interface of the kernel, and the net result is that I get to test my go code, deploy it, and not worry about the state of deployed go environments quite so much as I do in the presence of shared libraries.
Um. That's a really good way to have all your programs stop working when the Linux kernel interface changes yet again ("ABIs? We don't need no steenking ABIs!" --- see in /usr/src/linux/Documentation). Solaris is similar; the only approved interface is via libc and you must link to it shared if you want your program to work across versions/releases.
(Which is the reason shared library support is important. I personally like my programs to keep working.)
So you have to keep the runtime as up to date as glibc? Sounds ok :-). Also, I don't know anyone that supports people updating kernels in linux in any sort of commercial setting for the very reason you just gave. Sounds like asking for trouble. In my experience, a kernel upgrade is taken pretty seriously, and not done without very good reason. Look at CentOS, it's on a pretty old kernel most of the time, because people in enterprise situations prefer stability over bleeding edge features. Dave

On Aug 27, 2010, at 6:11 AM, Brandon S Allbery KF8NH wrote:
Um. That's a really good way to have all your programs stop working when the Linux kernel interface changes yet again ("ABIs? We don't need no steenking ABIs!" --- see in /usr/src/linux/Documentation). Solaris is similar; the only approved interface is via libc and you must link to it shared if you want your program to work across versions/releases.
Yeah, right. So here I am running "SunOS 5.10 Generic January 2005", and the version/release has changed incompatibly just *how* often? That's on a SPARC. On the Mac I have OpenSolaris, and guess how often new releases have broken things on that? This little helpfulness from Sun (can't blame Oracle for this one, much as I'd like to) broke profiling and has never ever gained me personally anything. If there _were_ a new release I'd have to rebuild everything anyway; *API* changes have been more common than *ABI* ones.

Maybe Linux is different. One thing is NOT different, and that is that Linux upgrades *DO* reliably break programs that use dynamic linking. Dynamic libraries get
- left out
- changed incompatibly
- moved some place else
- changed compatibly but with the version number altered so the dynamic linker doesn't believe it, or the fools -- er, kind people who built the program wired in a demand for a particular version

Indeed, every Linux upgrade I've had, I've found myself screaming in frustration because programs *weren't* statically linked.
(Which is the reason shared library support is important. I personally like my programs to keep working.)

On Thu, Aug 26, 2010 at 20:51, Richard O'Keefe
Maybe Linux is different. One thing is NOT different, and that is that Linux upgrades *DO* reliably break programs that use dynamic linking. Dynamic libraries get
- left out
- changed incompatibly
- moved some place else
- changed compatibly but with the version number altered so the dynamic linker doesn't believe it, or the fools -- er, kind people who built the program wired in a demand for a particular version

Indeed, every Linux upgrade I've had, I've found myself screaming in frustration because programs *weren't* statically linked.
Upgrading Linux should never, ever cause applications to stop working unless they were designed incorrectly in the first place. Low-level system libraries like glibc are the only code which needs to access Linux directly.

However, most of the problems you mentioned (removed/modified dynamic libraries) are not part of Linux at all. If your distribution has poor quality control, you should consider switching to a better one -- I've heard good news about both Debian and RHEL in this area. Desktop-oriented distributions, such as Ubuntu or Fedora, are not suitable for long-term (> 6 years or so) installations.

Haskell, of course, takes ABI pickiness to an absolute maximum. One of my most wished-for features is a way to provide C-style stable ABIs for Haskell shared libraries, so I could (for example) upgrade a support library and have every installed application pick it up.

On Aug 27, 2010, at 4:52 PM, John Millikin wrote:
On Thu, Aug 26, 2010 at 20:51, I wrote:
Maybe Linux is different. One thing is NOT different, and that is Linux upgrades *DO* reliably break programs that use dynamic linking.
Upgrading Linux should never, ever cause applications to stop working unless they were designed incorrectly in the first place.
"Linux" in this context means the system as perceived by users, a "distribution", not just a kernel.
Desktop-oriented distributions, such as Ubuntu or Fedora, are not suitable for long-term (> 6 years or so) installations.
Fedora is one of the Linux distributions I was talking about.

On 8/26/10 23:51, Richard O'Keefe wrote:
Indeed, every Linux upgrade I've had I've found myself screaming in frustration because programs *weren't* statically linked.
RH/Fedora? We ditched RH completely after I found myself repeatedly regenerating RH packages because of broken updates; and let's not even talk about up*grades*. My life has been much happier since I banished RH from both home and work environments. I had something similar happen with one Ubuntu upgrade, but to Canonical's credit they learn from their mistakes; current RH/Fedora users tell me that updates are still screwed up, especially if you don't immediately upgrade to a new release as soon as it shows up (the security backports are invariably messed up in annoying ways).

As for your Solaris experience, the only programs I've seen break are ones that broke the rules (usually deliberately, because someone "knew better"). We've a fair amount of code not updated since 2.5.1....

Brandon S Allbery KF8NH wrote:
On 8/26/10 10:23 , David Leimbach wrote:
Go, for example, has no shared libraries, and the runtime fits in every binary. It does not even depend on libc. Go binaries call the system call interface of the kernel, and the net result is that I get to test my go code, deploy it, and not worry about the state of deployed go environments quite so much as I do in the presence of shared libraries.
Um. That's a really good way to have all your programs stop working when the Linux kernel interface changes yet again ("ABIs? We don't need no steenking ABIs!" --- see in /usr/src/linux/Documentation). Solaris is similar; the only approved interface is via libc and you must link to it shared if you want your program to work across versions/releases.
If you don't mind, I'd like a proper reference for this; looking at the Linux kernel documentation as you suggest tells me that the kernelspace to userspace ABI is supposed to be 100% stable, such that I can take all the binaries (including shared libraries) from an i386 Linux 2.0 system, and run them in a chroot on my x86-64 Linux 2.6.35 system. It's the in-kernel ABI (for loadable kernel modules and the like) that's not guaranteed to remain stable. -- Simon

On 8/27/10 05:58, Simon Farnsworth wrote:
If you don't mind, I'd like a proper reference for this; looking at the Linux kernel documentation as you suggest tells me that the kernelspace to userspace ABI is supposed to be 100% stable, such that I can take all the binaries (including shared libraries) from an i386 Linux 2.0 system, and run them in a chroot on my x86-64 Linux 2.6.35 system.
Maybe it's "supposed" to be, but even with more recent stuff (like, say, binary GHC releases --- which use glibc shared even if Haskell libs aren't) I quite often see programs fail to run because the kernel changed something and the kernel/userspace interface changed as a result. A written policy is worthless if it isn't followed.

On 28 August 2010 11:09, Brandon S Allbery KF8NH
On 8/27/10 05:58 , Simon Farnsworth wrote:
If you don't mind, I'd like a proper reference for this; looking at the Linux kernel documentation as you suggest tells me that the kernelspace to userspace ABI is supposed to be 100% stable, such that I can take all the binaries (including shared libraries) from an i386 Linux 2.0 system, and run them in a chroot on my x86-64 Linux 2.6.35 system.
Maybe it's "supposed" to be, but even with more recent stuff (like, say, binary GHC releases --- which use glibc shared even if Haskell libs aren't) I quite often see programs fail to run because the kernel changed something and the kernel/userspace interface changed as a result. A written policy is worthless if it isn't followed.
Well, I have no need to recompile glibc and packages that depend upon it every time I update my kernel... So maybe glibc changes, but not the kernel AFAICT. -- Ivan Lazar Miljenovic Ivan.Miljenovic@gmail.com IvanMiljenovic.wordpress.com

On 28/08/10 02:15, Ivan Lazar Miljenovic wrote:
On 28 August 2010 11:09, Brandon S Allbery KF8NH
wrote: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 On 8/27/10 05:58 , Simon Farnsworth wrote:
If you don't mind, I'd like a proper reference for this; looking at the Linux kernel documentation as you suggest tells me that the kernelspace to userspace ABI is supposed to be 100% stable, such that I can take all the binaries (including shared libraries) from an i386 Linux 2.0 system, and run them in a chroot on my x86-64 Linux 2.6.35 system.
Maybe it's "supposed" to be, but even with more recent stuff (like, say, binary GHC releases --- which use glibc shared even if Haskell libs aren't) I quite often see programs fail to run because the kernel changed something and the kernel/userspace interface changed as a result. A written policy is worthless if it isn't followed.
Well, I have no need to recompile glibc and packages that depend upon it every time I update my kernel... So maybe glibc changes, but not the kernel AFAICT.
I've been following this part of the discussion with some interest. Mainly because I've been a Linux user since kernel version 1.2, and I've *never* had any of the problems people mention here. So I'm wondering, what are you doing to your systems? /M -- Magnus Therning (OpenPGP: 0xAB4DFBA4) magnus@therning.org Jabber: magnus@therning.org http://therning.org/magnus identi.ca|twitter: magthe

Simon Marlow wrote:
If you look at the original Cabal design document[1], you'll see that one of the goals of Cabal was to be the glue that lets you convert an arbitrary Haskell library into a native package for a variety of systems - including MSIs on Windows. Indeed, I must admit when we were designing Cabal I thought that native packages would be the most common way that people would install Cabal packages, specifically because many systems already have a good package manager, and trying to bypass the system package manager would be a fundamental mistake. It turned out that cabal-install would be a lot more useful than I imagined, but the two systems are complementary: native packages are for installing globally, and cabal-install is for installing packages in your home directory.
Why would you ever want to install a package per-user? I mean, if you don't have permission to do a global install, then you also don't have permission to install GHC in the first place so...? Indeed, the *only* plausible reason I can think of is if you're trying to build something that has unusual package version constraints, and you want to build it without upsetting the entire system.
Even on systems without a package manager (i.e. Windows), it would make more sense when installing a package globally to build an MSI first, so that the system can track the installation and let you uninstall it later.
I did have a look at building a binary installer using Nullsoft NSIS. Unfortunately, I don't know of any tool in existence that can build MSI files that isn't absurdly expensive. (E.g., InstallShield is ~£4,000, which is extortionate for a program that just copies files around. Even BackupExec isn't *that* expensive, and that's mission-critical!)

Of course, *I* was looking at NSIS specifically for installing Haskell-to-C bindings. These are virtually impossible to build on Windows, and I figured if I could build such a package once, I could then make a binary installer out of it and never again have to build it from source. (Until the next GHC version, anyway.) But I utterly failed to make the building part work, so I never got to the next bit.

If you were to use binary installers for regular Haskell packages, the only real benefit would be that you can now UNinstall them again. It might be worth doing that, and it looks plausible that you could automate it...
Interesting. So Cabal was never intended to handle executables at all. (The entire proposal speaks only about *libraries*.) Suddenly several of Cabal's deficiencies make a lot more sense. It doesn't handle executables properly because it was never designed to. It doesn't uninstall because Cabal packages are supposed to be converted into real packages first, and real package managers provide uninstall capabilities. And so on.

It's slightly disturbing how the proposal mentions "make" every three sentences. You realise that make only exists under Unix, right? There _are_ other operating systems out there...

I also can't for the life of me work out why something *designed for* automatic processing is designed without machine-readable syntax. Even in this early proposal, Cabal is already using that horrid ad hoc undocumented file format that only Cabal itself can actually parse and understand. Why not XML or JSON or *something* with a formal spec and a wide range of available tools? It makes no sense at all.

And in case somebody is sitting there thinking "It IS documented. It's simple, isn't it?", did you know that file paths have to be escaped like Haskell string literals? No, I bet you didn't. Where is this fact documented? It isn't. Why was this decided? I'm guessing it's an implementation accident rather than a deliberate decision. Now if this were XML or JSON, everybody would already *know* the escaping rules. And we'd have tools that know about these rules and can handle processing such files.

People seem to think that Cabal's existing format makes it easier for humans to read and write, but personally I'm always left wondering exactly which constructions are or aren't permitted. Can I put several values on a line here, or do they have to be separate lines? Do all the field values have to be indented by the same amount? How does Cabal figure out which fields are subfields anyway?
In summary, I would be *so* much happier if we had a real file format rather than this ugly home-grown thing. Unfortunately, this would break everything on Hackage, so it will never be fixed.
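For readers who haven't seen one, a minimal example of the file format under discussion looks something like this (the field names are the real ones; the package itself is made up, and a 2010-era cabal-version is assumed). Note the indentation-based nesting under the library stanza, which is the "which fields are subfields" question above:

```cabal
-- Minimal example of the Cabal file format (hypothetical package).
name:          example-package
version:       0.1
synopsis:      An illustrative package description
license:       BSD3
build-type:    Simple
cabal-version: >= 1.2

library
  exposed-modules: Example.Module
  build-depends:   base
```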
[2] http://www.haskell.org/pipermail/cabal-devel/2007-August/000740.html
Also interesting. I've never heard of WIX before...

On 27 August 2010 19:13, Andrew Coppin
Simon Marlow wrote:
If you look at the original Cabal design document[1], you'll see that one of the goals of Cabal was to be the glue that lets you convert an arbitrary Haskell library into a native package for a variety of systems - including MSIs on Windows. Indeed, I must admit when we were designing Cabal I thought that native packages would be the most common way that people would install Cabal packages, specifically because many systems already have a good package manager, and trying to bypass the system package manager would be a fundamental mistake. It turned out that cabal-install would be a lot more useful than I imagined, but the two systems are complementary: native packages are for installing globally, and cabal-install is for installing packages in your home directory.
Why would you ever want to install a package per-user? I mean, if you don't have permission to do a global install, then you also don't have permission to install GHC in the first place so...? Indeed, the *only* plausible reason I can think of is if you're trying to build something that has unusual package version constraints, and you want to build it without upsetting the entire system.
Well, what happens if you have a university account where GHC is installed on the machines (and it's actually recent enough, something my uni doesn't have so I install GHC into my home directory) or other multi-user environments? You may wish to use extra packages than what are available by default.
If you were to use binary installers for regular Haskell packages, the only real benefit would be that you can now UNinstall them again. It might be worth doing that, and it looks plausible that you could automate it...
Whilst this isn't applicable to Windows, in systems with a proper package manager, you get dependencies brought in for you as well.
Interesting. So Cabal was never intended to handle executables at all. (The entire proposal speaks only about *libraries*.) Suddenly several of Cabal's deficiencies make a lot more sense. It doesn't handle executables properly because it was never designed to. It doesn't uninstall because Cabal packages are supposed to be converted into real packages first, and real package managers provide uninstall capabilities. And so on.
What do you think the "Applications" bit in the definition of Cabal is? (Disclaimer: I haven't read the original proposal).
It's slightly disturbing how the proposal meantions "make" every three sentences. You realise that make only exists under Unix, right? There _are_ other operating systems out there...
And GHC was designed with a POSIX-style environment in mind. And realistically, Windows is the only major non-POSIX-like OS nowadays. Furthermore, GHC was aimed primarily at teaching and research, and from my (admittedly limited) experience the IT/CS departments at unis tend to run Unix/Linux.
I also can't for the life of me work out why something *designed for* automatic processing is designed without machine-readable syntax. Even in this early proposal, Cabal is already using that horrid ad hoc undocumented file format that only Cabal itself can actually parse and understand. Why not XML or JSON or *something* with a formal spec and a wide range of available tools? It makes no sense at all.
Because I'd like to read what a package is about from its .cabal file: Cabal isn't just a build system specification, it also provides metadata on the project in question.
And in case somebody is sitting there thinking "It IS documented. It's simple, isn't it?", did you know that file paths have to be escaped like Haskell string literals? No, I bet you didn't. Where is this fact documented? It isn't. Why was this decided? I'm guessing it's an implementation accident rather than a deliberate decision.
Do you mean in the description field, etc.? That's because it uses Haddock for that to put it on Hackage (admittedly yes, that isn't documented and I got caught out by that as well; however that only seems to matter if memory serves when displaying the formatted output on Hackage). Otherwise, if you mean actual file paths when specifying extra files, etc. then that's because it uses Unix-style paths.
Now if this were XML or JSON, everybody would already *know* the escaping rules.
Except for people that have never used XML or JSON... don't we count? (OK, I lie, I have used XML, and I'm trying very hard to forget it.)
And we'd have tools that know about these rules and can handle processing such files. People seem to think that Cabal's existing format makes it easier for humans to read and write, but personally I'm always left wondering exactly which constructions are or aren't permitted. Can I put several values on a line here, or do they have to be separate lines? Do all the field values have to be indented by the same amount? How does Cabal figure out which fields are subfields anyway?
So the problem here is a matter of under-specification (or possibly lack of tool support - including editor modes - for ensuring you're doing it correctly).
In summary, I would be *so* much happier if we had a real file format rather than this ugly home-grown thing. Unfortunately, this would break everything on Hackage, so it will never be fixed.
... except this "home grown thing" _is_ a file format. -- Ivan Lazar Miljenovic Ivan.Miljenovic@gmail.com IvanMiljenovic.wordpress.com

Ivan Lazar Miljenovic wrote:
On 27 August 2010 19:13, Andrew Coppin
wrote: Why would you ever want to install a package per-user? I mean, if you don't have permission to do a global install, then you also don't have permission to install GHC in the first place so...?
Well, what happens if you have a university account where GHC is installed on the machines (and it's actually recent enough, something my uni doesn't have so I install GHC into my home directory) or other multi-user environments? You may wish to use extra packages than what are available by default.
Mmm, I suppose...
If you were to use binary installers for regular Haskell packages, the only real benefit would be that you can now UNinstall them again. It might be worth doing that, and it looks plausible that you could automate it...
Whilst this isn't applicable to Windows, in systems with a proper package manager, you get dependencies brought in for you as well.
Windows has more package management facilities than most people realise. For example, go install Office 2007. In fact, just install Excel 2007, not the whole thing. Windows Installer can automatically figure out that you *do* need to install the Spell Checker (since Excel uses that), but you do *not* need to install the Grammar Checker (since only Word and PowerPoint use that, and you haven't selected to install those). Not only does it decide what to install, but you can query it programmatically to find out what got installed in the end. And where it got installed. And what version was installed. And you can check what version of X is installed and do I need to update it with the new version of X that I'm packaged with? And is component Y installed? And which programs depend on that? Can it be uninstalled if nobody's using it now? And... About the only thing it won't do is automatically grab stuff from a central repository. Because, in the Windows world, most software costs money. But it really *does* do quite a lot more than people realise. (E.g., you can take the installer for Foo-1.0, put the patch file for Foo-1.1 next to it, and install Foo-1.1 all in one go, patching the original installer on-the-fly as you run it. Ditto for site-specific customisations. You can "repair" installed programs, checking whether shortcuts still exist, libraries are registered, library versions are new enough, file checksums match, etc. And so on.)
Interesting. So Cabal was never intended to handle executables at all. (The entire proposal speaks only about *libraries*.) Suddenly several of Cabal's deficiencies make a lot more sense. It doesn't handle executables properly because it was never designed to.
What do you think the "Applications" bit in the definition of Cabal is? (Disclaimer: I haven't read the original proposal).
I'm presuming it was added later, as an afterthought. And that's why there's something there, but it doesn't work fantastically well. (E.g., Cabal doesn't "know" which binaries are installed, even though it installs them. It just wasn't part of the design.)
It's slightly disturbing how the proposal mentions "make" every three sentences. You realise that make only exists under Unix, right? There _are_ other operating systems out there...
And GHC was designed with a POSIX-style environment in mind. And realistically, Windows is the only major non-Posix like OS nowadays.
Furthermore, GHC was aimed primarily at teaching and research, and from my (admittedly limited) experience the IT/CS departments at unis tend to run Unix/Linux.
Depends on what kind of establishment you go to, I guess. My uni was 100% Windows. Only a few servers were Unix (notably the web server). I guess it depends on whether you think your students are going into datacenter support (probably Unix) or desktop support or application development (obviously all desktops are Windows).
Why not XML or JSON or *something* with a formal spec and a wide range of available tools? It makes no sense at all.
Because I'd like to read what a package is about from its .cabal file:
And, to me, having a real, documented file format would make reading it a lot easier. (And *writing* it would become miles easier!) I'm never quite sure where one thing ends and another begins. Explicit delimiters would help here.
And in case somebody is sitting there thinking "It IS documented. It's simple, isn't it?", did you know that file paths have to be escaped like Haskell string literals? No, I bet you didn't. Where is this fact documented? It isn't. Why was this decided? I'm guessing it's an implementation accident rather than a deliberate decision.
Do you mean in the description field, etc.? That's because it uses Haddock for that to put it on Hackage (admittedly yes, that isn't documented and I got caught out by that as well; however that only seems to matter if memory serves when displaying the formatted output on Hackage).
I especially love the way that none of Haddock's formatting commands seem to work in the Cabal description field, even though everybody keeps telling me "it's formatted with Haddock". Most especially, bullet lists will not work, no matter what I do, and it's really, really annoying me... (Haddock is another irritation. Its formatting commands are seemingly random and ad hoc. Put something in quotes, and it happily generates a link to a non-existent module, without even bothering to check whether it exists. Nice...)
Otherwise, if you mean actual file paths when specifying extra files, etc. then that's because it uses Unix-style paths.
Unix-style paths are all very well, but if you need to tell Cabal "hey, the headers are in C:\Program Files\Headers", then you end up needing to type C:\\Program Files\\Headers. Which is unnecessary (there's no *reason* why it should need escaping, it's just that Cabal is designed that way), but I could live with it if it were documented somewhere. (The fact that you need to twiddle with an existing Cabal package description is a whole other kettle of fish, of course...)
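To make the gotcha concrete, here is a hypothetical .cabal fragment (the directory and file names are invented for illustration). As described above, each backslash in the Windows path has to be doubled, exactly as in a Haskell string literal:

```cabal
-- Hypothetical fragment; the paths are made up.
-- Backslashes are escaped Haskell-style:
include-dirs:       "C:\\Program Files\\Headers"
extra-source-files: "C:\\Program Files\\Headers\\foo.h"
```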
Now if this were XML or JSON, everybody would already *know* the escaping rules.
Except for people that have never used XML or JSON... don't we count?
(OK, I lie, I have used XML, and I'm trying very hard to forget it.)
If you're only trying to _read_ the file, both of these are pretty self-explanatory. You only need to worry about the technical details if you try to _write_ them. And there are resources across the face of the Internet explaining in minute detail everything you could possibly want to know. For Cabal's home-brew file format, you've got... the terse notes in the Cabal documentation. And that's it.
So the problem here is a matter of under-specification
Well, the under-specification *is* a problem. I guess for me the main problem though is that it's just _ugly_.
(or possibly lack of tool support - including editor modes - for ensuring you're doing it correctly).
You know, not everybody uses (or wants to use) Emacs. Other editors exist. Besides, I always thought that if you _need_ a special editor to edit something, it's not designed very well.
In summary, I would be *so* much happier if we had a real file format rather than this ugly home-grown thing. Unfortunately, this would break everything on Hackage, so it will never be fixed.
... except this "home grown thing" _is_ a file format.
Yeah. Just not a very nice one. And one that's supported by only one tool in the entire world.

On 27 August 2010 20:13, Andrew Coppin
Ivan Lazar Miljenovic wrote:
On 27 August 2010 19:13, Andrew Coppin
wrote: If you were to use binary installers for regular Haskell packages, the only real benefit would be that you can now UNinstall them again. It might be worth doing that, and it looks plausible that you could automate it...
Whilst this isn't applicable to Windows, in systems with a proper package manager, you get dependencies brought in for you as well.
Windows has more package management facilities than most people realise.
For example, go install Office 2007. In fact, just install Excel 2007, not the whole thing. Windows Installer can automatically figure out that you *do* need to install the Spell Checker (since Excel uses that), but you do *not* need to install the Grammar Checker (since only Word and PowerPoint use that, and you haven't selected to install those). Not only does it decide what to install, but you can query it programmatically to find out what got installed in the end. And where it got installed. And what version was installed. And you can check what version of X is installed and do I need to update it with the new version of X that I'm packaged with? And is component Y installed? And which programs depend on that? Can it be uninstalled if nobody's using it now? And...
But that's one specific installer; not a generic package management system (in terms of the extra sub-dependencies).
About the only thing it won't do is automatically grab stuff from a central repository. Because, in the Windows world, most software costs money. But it really *does* do quite a lot more than people realise. (E.g., you can take the installer for Foo-1.0, put the patch file for Foo-1.1 next to it, and install Foo-1.1 all in one go, patching the original installer on-the-fly as you run it. Ditto for site-specific customisations. You can "repair" installed programs, checking whether shortcuts still exist, libraries are registered, library versions are new enough, file checksums match, etc. And so on.)
I've never seen this Foo-1.1 behaviour, unless it's a specific patch-level installer that uses the same data. Note also that this isn't automatic: you have to explicitly download Foo-1.1 yourself, etc. So, to be more specific, we can state that Windows has a form of package management, without an actual package management _system_ such as typically found in Linux distributions.
Interesting. So Cabal was never intended to handle executables at all. (The entire proposal speaks only about *libraries*.) Suddenly several of Cabal's deficiencies make a lot more sense. It doesn't handle executables properly because it was never designed to.
What do you think the "Applications" bit in the definition of Cabal is? (Disclaimer: I haven't read the original proposal).
I'm presuming it was added later, as an afterthought. And that's why there's something there, but it doesn't work fantastically well. (E.g., Cabal doesn't "know" which binaries are installed, even though it installs them. It just wasn't part of the design.)
Ummm, Cabal is a combination of a build system and metadata specification for packages; it isn't a package management system or even a package manager (hence my previous link to my blog post). As such, it isn't designed to keep track of installed applications, and the only reason it knows which libraries are installed is because ghc-pkg can tell it so it knows which dependencies are already present without needing to install them again.
It's slightly disturbing how the proposal mentions "make" every three sentences. You realise that make only exists under Unix, right? There _are_ other operating systems out there...
And GHC was designed with a POSIX-style environment in mind. And realistically, Windows is the only major non-Posix like OS nowadays.
Furthermore, GHC was aimed primarily at teaching and research, and from my (admittedly limited) experience the IT/CS departments at unis tend to run Unix/Linux.
Depends on what kind of establishment you go to, I guess. My uni was 100% Windows. Only a few servers were Unix (notably the web server). I guess it depends on whether you think your students are going into datacenter support (probably Unix) or desktop support or application development (obviously all desktops are Windows).
Oh really? And yes, the unis I've been at have overall had Windows-based desktops everywhere (with the occasional Mac, especially in biology), but the IT/CS departments had Linux/Unix machines (at UQ, desktops were primarily Windows but with an outdated Solaris server which people remotely connected to; at ANU the CS department has Ubuntu everywhere, even for student machines). This is also true in some non-IT/CS departments (at UQ, the math department provided the option of installing Fedora on academics' machines, but not many took up that offer, preferring to ssh in to the dedicated machines; then again, UQ had just switched to using an Exchange-based system for email, calendaring, etc. ...).
Why not XML or JSON or *something* with a formal spec and a wide range of available tools? It makes no sense at all.
Because I'd like to read what a package is about from its .cabal file:
And, to me, having a real, documented file format would make reading it a lot easier. (And *writing* it would become miles easier!) I'm never quite sure where one thing ends and another begins. Explicit delimiters would help here.
Sure; this is a matter of under-specification of the file format. I would point out what happened with X configuration recently: they went from an ini-style configuration file that was relatively human-readable and editable (especially if you were basing your config off a guide) to an XML-based one when they switched to HAL-based device management (since XML is easier for programs to read and write), and then back to the original format because the XML-based one was a disaster. I personally find a file format such as Cabal's much easier to read and write than one that requires me to put a whole bunch of angled brackets in everywhere...
And in case somebody is sitting there thinking "It IS documented. It's simple, isn't it?", did you know that file paths have to be escaped like Haskell string literals? No, I bet you didn't. Where is this fact documented? It isn't. Why was this decided? I'm guessing it's an implementation accident rather than a deliberate decision.
Do you mean in the description field, etc.? That's because it uses Haddock for that to put it on Hackage (admittedly yes, that isn't documented and I got caught out by that as well; however that only seems to matter if memory serves when displaying the formatted output on Hackage).
I especially love the way that none of Haddock's formatting commands seem to work in the Cabal description field, even though everybody keeps telling me "it's formatted with Haddock". Most especially, bullet lists will not work, no matter what I do, and it's really, really annoying me...
I've done bullet lists; it requires a slight change to the usual: http://hackage.haskell.org/packages/archive/graphviz/2999.10.0.1/graphviz.ca... (note the `.'s in between lines).
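For what it's worth, here is a sketch of the trick being described (the field content is invented for illustration): paragraph breaks inside the description field are written as a lone `.' on an otherwise blank line, and then Haddock-style bullets work:

```cabal
description:
    A one-line summary of the package.
    .
    Things it does:
    .
    * First thing (hypothetical)
    .
    * Second thing (hypothetical)
```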
(Haddock is another irritation. Its formatting commands are seemingly random and ad hoc. Put something in quotes, and it happily generates a link to a non-existent module, without even bothering to check whether it exists. Nice...)
Oh, I definitely agree with you that Haddock's markup leaves a lot to be desired.
Otherwise, if you mean actual file paths when specifying extra files, etc. then that's because it uses Unix-style paths.
Unix-style paths are all very well, but if you need to tell Cabal "hey, the headers are in C:\Program Files\Headers", then you end up needing to type C:\\Program Files\\Headers. Which is unnecessary (there's no *reason* why it should need escaping, it's just that Cabal is designed that way), but I could live with it if it were documented somewhere.
(The fact that you need to twiddle with an existing Cabal package description is a whole other kettle of fish, of course...)
Huh, I was under the impression that you could just use unix-style file paths in a relative fashion with Cabal even on Windows...
Now if this were XML or JSON, everybody would already *know* the escaping rules.
Except for people that have never used XML or JSON... don't we count?
(OK, I lie, I have used XML, and I'm trying very hard to forget it.)
If you're only trying to _read_ the file, both of these are pretty self-explanatory. You only need to worry about the technical details if you try to _write_ them. And there are resources across the face of the Internet explaining in minute detail everything you could possibly want to know. For Cabal's home-brew file format, you've got... the terse notes in the Cabal documentation. And that's it.
And looking up other .cabal files... ;-) Then again, even if we used XML or JSON we'd still have to look up what the right tag-names, etc. are.
So the problem here is a matter of under-specification
Well, the under-specification *is* a problem. I guess for me the main problem though is that it's just _ugly_.
(or possibly lack of tool support - including editor modes - for ensuring you're doing it correctly).
You know, not everybody uses (or wants to use) Emacs. Other editors exist. Besides, I always thought that if you _need_ a special editor to edit something, it's not designed very well.
Did I say Emacs? Don't other editors/IDEs/etc. have the ability to syntax-highlight various files, generate code, etc.? And whatever editor you use to edit Haskell code with, whilst you could write Haskell code in Notepad, etc. isn't it easier using an editor with at least syntax highlighting?
In summary, I would be *so* much happier if we had a real file format rather than this ugly home-grown thing. Unfortunately, this would break everything on Hackage, so it will never be fixed.
... except this "home grown thing" _is_ a file format.
Yeah. Just not a very nice one. And one that's supported by only one tool in the entire world.
Well, which other tools need to support it? And from a brief bit of Googling, Ruby Gems seems to use its own (YAML-based) file format... Now, Duncan et al. are working on Cabal-2; it's quite possible that they're taking complaints like this into account, but I would much prefer to keep something like the current format (but with better specifications) than one using XML or JSON. -- Ivan Lazar Miljenovic Ivan.Miljenovic@gmail.com IvanMiljenovic.wordpress.com

Ivan Lazar Miljenovic wrote:
On 27 August 2010 20:13, Andrew Coppin
wrote: Windows has more package management facilities than most people realise.
But that's one specific installer; not a generic package management system (in terms of the extra sub-dependencies).
Sure. Windows doesn't give you the "apt-get install foo" and it just works thing. As I say, on Windows, most software requires purchase, so that model isn't going to work too well. (Yes, you and I know there's plenty of free software for Windows, but that's not what the designers had in mind when they designed it. Windows predates widespread OSS by a decade or two.)
I've never seen this Foo-1.1 behaviour, unless it's a specific patch-level installer that uses the same data. Note also that this isn't automatic: you have to explicitly download Foo-1.1 yourself, etc.
It's nice in that you can just download a small patch file rather than redownload the whole of Foo-1.1. But I must admit, I've never seen anybody actually offer this in practice. So I guess it's one of those things that Windows supports but nobody really uses much.
So, to be more specific, we can state that Windows has a form of package management, without an actual package management _system_ such as typically found in Linux distributions.
As I say, it doesn't support the "install foo now" thing that Linux usually gives you. But on Windows, "install foo" usually means going and buying it first. Note that what you *can* do, in a corporate environment, is make it so that every time a user from group X logs in, applications A, B and C get automatically installed. Or all the machines in group Y have them installed. Or you can "advertise" applications, which (as I understand it) puts an icon on the desktop, and the first time you click it, it installs the application and then runs it (and subsequent times, obviously, it just runs it). And you can make it so that certain security or compatibility updates get auto-installed, and all kinds of stuff which I haven't seen on Linux. (But then, I don't use Linux in a corporate environment, and I do use Windows in such. It wouldn't surprise me if RedHat or SuSE offer something like this, for example.)
Ummm, Cabal is a combination of a build system and metadata specification for packages; it isn't a package management system or even a package manager (hence my previous link to my blog post).
Yeah. That becomes clearer once you read the original design goals, rather than look at what it eventually morphed into becoming.
Furthermore, GHC was aimed primarily at teaching and research, and from my (admittedly limited) experience the IT/CS departments at unis tend to run Unix/Linux.
Depends on what kind of establishment you go to, I guess.
Oh really?
Yeah, there are places where everything is Windows, and other places where everything is Unix. I've heard rumours that some people even use the Apple Mac. I guess it depends where you go...
I would point out what happened with X configuration recently: they went from an ini-style configuration file that was relatively human readable and editable (especially if you were basing your config off of a guide) to an XML-based one when they switched to HAL-based device management (since XML is easier for programs to read and write) and back to the original format because the XML-based format was a disaster.
Doesn't necessarily prove anything. It's much like saying "I know a guy who switched from a diesel to a petrol car, and it was a disaster, therefore we should ban all petrol cars". Your argument does not follow.
I personally find a file format such as Cabal's much easier to read and write than one that requires me to put a whole bunch of angled brackets in everywhere...
I find (X)HTML just fine to read. MathML, on the other hand, is a disaster. When you use HTML properly, what you get is plain text with the occasional markup interjection. But with MathML, 80% of the text is stuff that doesn't even show up on screen, and it utterly obscures the meaning of what you're trying to write. MathML is basically pre-tokenised data; every single damned token becomes another XML element with a bunch of attributes. It makes it drop-dead easy for machines to work with, and almost impossible for humans to comprehend or edit. For example, in MathML, "2+2" becomes

  <math xmlns="http://www.w3.org/1998/Math/MathML"><mrow><mn>2</mn><mo>+</mo><mn>2</mn></mrow></math>

which is pretty absurd. Somewhere in there are the 3 characters of "information"; the rest is all metadata. What all this proves is that XML can be horrid, or it can be just fine. Personally, I would have no problem with writing

  <Name>foo</Name>
  <Version>1.0</Version>
  <Synopsis>This does stuff.</Synopsis>

Now I don't have to worry about whitespace; XML has rules for all that. And I don't have to worry about escaping or character sets; XML has rules for that too. And if that's too hard to swallow, how about JSON?

  {
    "Name": "foo",
    "Version": 1.0,
    "Synopsis": "This does stuff."
  }

Again, I can now lay this out any way I want, and it's really pretty easy to read. About the only unfortunate feature is that the key names have to be quoted for some reason.
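To illustrate the "everybody would already know the rules" point: the JSON sketch above can be consumed by any stock JSON parser, with all the quoting and escaping questions already settled by the spec. (Python's standard json module is used here purely for illustration; the field names are the hypothetical ones from the sketch, not real Cabal fields.)

```python
import json

# The hypothetical package description sketched above.
text = '{ "Name": "foo", "Version": 1.0, "Synopsis": "This does stuff." }'

# An off-the-shelf JSON parser already knows the whitespace and
# escaping rules; no Cabal-specific parser is required.
meta = json.loads(text)
print(meta["Name"])      # foo
print(meta["Synopsis"])  # This does stuff.
```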
I especially love the way that none of Haddock's formatting commands seem to work in the Cabal description field, even though everybody keeps telling me "it's formatted with Haddock". Most especially, bullet lists will not work, no matter what I do, and it's really, really annoying me...
I've done bullet lists; it requires a slight change to the usual: http://hackage.haskell.org/packages/archive/graphviz/2999.10.0.1/graphviz.ca... (note the `.'s in between lines).
Curly braces and dots? I don't see that documented anywhere. Presumably this is due to the stupid insistence on using whitespace to delimit things. If it were XML or JSON, you wouldn't need such silliness.
(Haddock is another irritation. Its formatting commands are seemingly random and ad hoc. Put something in quotes, and it happily generates a link to a non-existent module, without even bothering to check whether it exists. Nice...)
Oh, I definitely agree with you that Haddock's markup leaves a lot to be desired.
The output too. But hey, I guess when I've written something better myself *then* I get to criticise...
Otherwise, if you mean actual file paths when specifying extra files, etc. then that's because it uses Unix-style paths.
Unix-style paths are all very well, but if you need to tell Cabal "hey, the headers are in C:\Program Files\Headers", then you end up needing to type C:\\Program Files\\Headers. Which is unnecessary (there's no *reason* why it should need escaping, it's just that Cabal is designed that way), but I could live with it if it were documented somewhere.
(The fact that you need to twiddle with an existing Cabal package description is a whole other kettle of fish, of course...)
Huh, I was under the impression that you could just use unix-style file paths in a relative fashion with Cabal even on Windows...
Uhuh, and how do you specify whether the files are under C: or D:?
If you're only trying to _read_ the file, both of these are pretty self-explanatory. You only need to worry about the technical details if you try to _write_ them. And there are resources across the face of the Internet explaining in minute detail everything you could possibly want to know. For Cabal's home-brew file format, you've got... the terse notes in the Cabal documentation. And that's it.
And looking up other .cabal files... ;-)
Oh don't even start about that... That's more or less how I figured out how Haddock's undocumented "module attributes" feature works. By reading the Haddock comments for another package that had them. Unfortunately, I didn't write down what I learned, and now I've forgotten, and decided it's not worth the effort to boot!
Then again, even if we used XML or JSON we'd still have to look up what the write tag-names, etc. are.
Yes, we would. No matter what format you use, you'll have to look up the documentation. But if you're using a standard, well-known format, you *won't* have to spend time worrying about how escaping works or character encodings or line sizes or anything else. (And the Cabal people won't have to spend time designing all these details either.)
Did I say Emacs? Don't other editors/IDEs/etc. have the ability to syntax-highlight various files, generate code, etc.?
And whatever editor you use to edit Haskell code with, whilst you could write Haskell code in Notepad, etc. isn't it easier using an editor with at least syntax highlighting?
Unfortunately, I haven't found anything for Windows yet which has syntax highlighting for Haskell. I use SciTE, which has highlighting for a bazillion languages (including XML and JSON), but not Haskell sadly.
... except this "home grown thing" _is_ a file format.
Yeah. Just not a very nice one. And one that's supported by only one tool in the entire world.
Well, which other tools need to support it?
Well, the original design goal was apparently for Cabal packages to get converted into Debian .deb packages, RPMs, MSIs, and so forth, which implies package conversion tools being able to read it.
And from a brief bit of Googling, Ruby Gems seems to use its own (YAML-based) file format...
YAML is equally horrid. I hate it.
Now, Duncan et al. are working on Cabal-2; it's quite possible that they're taking complaints like this into account, but I would much prefer to keep something like the current format (but with better specifications) than one using XML or JSON.
Now we're just arguing over aesthetics. Besides, we all know the format won't be changed. It would break compatibility. (Man, now I remember why Haskell's slogan is "avoid success at all costs"...)

On 27 August 2010 21:40, Andrew Coppin
Ivan Lazar Miljenovic wrote:
I would point out what happened with X configuration recently: they went from an ini-style configuration file that was relatively human readable and editable (especially if you were basing your config off of a guide) to an XML-based one when they switched to HAL-based device management (since XML is easier for programs to read and write) and back to the original format because the XML-based format was a disaster.
Doesn't necessarily prove anything. It's much like saying "I know a guy who switched from a diesel to a petrol car, and it was a disaster, therefore we should ban all petrol cars". Your argument does not follow.
True; I was just providing one example of something switching to a "better" format (and actually I don't think it was the pain of XML that forced the change back, but the fact that HAL is going to be deprecated).
I personally find a file format such as Cabal's much easier to read and write than one that requires me to put a whole bunch of angled brackets in everywhere...
I find (X)HTML just fine to read. MathML, on the other hand, is a disaster.
However, (X)HTML is designed to convey text, not metadata. An XML/JSON version of Cabal would (probably) have a higher ratio of angled brackets, etc. than a typical HTML document would.
[snip]
What all this proves is that XML can be horrid, or it can be just fine. Personally, I would have no problem with writing
<Name>foo</Name>
<Version>1.0</Version>
<Synopsis>This does stuff.</Synopsis>
Now I don't have to worry about whitespace; XML has rules for all that. And I don't have to worry about escaping or character sets; XML has rules for that too. And if that's too hard to swallow, how about JSON?
{
  "Name": "foo",
  "Version": 1.0,
  "Synopsis": "This does stuff."
}
Well, the JSON is definitely easier to read than the XML, but I would much prefer a really, really high signal-to-noise ratio. If we have to specify some more syntactic rules for .cabal, why not follow Haskell syntax for lists, etc.? Admittedly, Haskell has no multi-line String support which would make defining something like the "Description" field harder...
Again, I can now lay this out any way I want, and it's really pretty easy to read. About the only unfortunate feature is that the key names have to be quoted for some reason.
Isn't that by definition of JSON? :p
I've done bullet lists; it requires a slight change to the usual:
http://hackage.haskell.org/packages/archive/graphviz/2999.10.0.1/graphviz.ca... (note the `.'s in between lines).
Curly braces and dots? I don't see that documented anywhere. Presumably this is due to the stupid insistence on using whitespace to delimit things. If it were XML or JSON, you wouldn't need such silliness.
Well, you would if you wanted that passed to Haddock... (in that you might escape/unescape things too much/not enough). My guess is that the Haddock-isation of the Description field is a more recent hack to get it to work on Hackage and that was the neatest way they could work out how to do it whilst remaining backwards compatible.
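For anyone who hasn't run into it, the convention being described looks roughly like this (field contents invented for illustration): in a .cabal Description field, a line containing only a `.' stands for a blank line, so Haddock sees separate paragraphs:

```
Description:
    This package does stuff.
    .
    This is a second paragraph; the lone dot above is turned into
    a blank line when the description is rendered.
```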
Oh, I definitely agree with you that Haddock's markup leaves a lot to be desired.
The output too. But hey, I guess when I've written something better myself *then* I get to criticise...
Yeah... :s
Well, the original design goal was apparently for Cabal packages to get converted into Debian .deb packages, RPMs, MSIs, and so forth, which implies package conversion tools being able to read it.
Well, yes, but Cabal itself is a library so things like hackport, cabal2arch, etc. use Cabal to parse the .cabal file rather than writing their own parser.
And from a brief bit of Googling, Ruby Gems seems to use its own (YAML-based) file format...
YAML is equally horrid. I hate it.
After eventually managing to track down a Gemfile (one nice thing about Hackage: the .cabal files are linked to directly!), I tend to agree. However, it seems to be more akin to a Makefile than a .cabal file, in that there is no extra package metadata there.
Now, Duncan et al. are working on Cabal-2; it's quite possible that they're taking complaints like this into account, but I would much prefer to keep something like the current format (but with better specifications) than one using XML or JSON.
Now we're just arguing over aesthetics.
Besides, we all know the format won't be changed. It would break compatibility. (Man, now I remember why Haskell's slogan is "avoid success at all costs"...)
I believe that they're not going to care too much about compatibility wrt Cabal 2; porting everything is going to be difficult, however... This applies to your argument as well: we can't now switch to JSON or XML because of compatibility. That said, if you so wished, it might be possible to write an XML/JSON -> .cabal converter so you can write your configuration file in a format you prefer... -- Ivan Lazar Miljenovic Ivan.Miljenovic@gmail.com IvanMiljenovic.wordpress.com

On 8/27/10, Ivan Lazar Miljenovic
Admittedly, Haskell has no multi-line String support which would make defining something like the "Description" field harder...
Quick correction: Haskell *does* have multi-line strings. For example:

"This is a\
\ nice string"

Note, however, that CPP doesn't like them. Cheers! =) -- Felipe.
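Spelled out as a compilable sketch, the example above uses what the Haskell report calls a "string gap": a backslash at the end of one line and another at the start of the next, with everything between them discarded:

```haskell
-- The gap (backslash, whitespace/newline, backslash) is removed entirely,
-- so no newline ends up inside the string.
long :: String
long = "This is a\
       \ nice string"

main :: IO ()
main = putStrLn long  -- prints: This is a nice string
```

The resulting string contains no newline; the gap is purely a line-continuation mechanism.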

Felipe Lessa wrote:
On 8/27/10, Ivan Lazar Miljenovic
wrote: Admittedly, Haskell has no multi-line String support which would make defining something like the "Description" field harder...
Quick correction: Haskell *does* have multi-line strings. For example:
"This is a\
\ nice string"
Note, however, that CPP doesn't like them.
Heh, well I'm not bothered about CPP, but knowing this might be useful someday...

On 28 August 2010 00:02, Felipe Lessa
On 8/27/10, Ivan Lazar Miljenovic
wrote: Admittedly, Haskell has no multi-line String support which would make defining something like the "Description" field harder...
Quick correction: Haskell *does* have multi-line strings. For example:
"This is a\
\ nice string"
I meant in the sense of Python, etc. where you didn't have to insert newline characters, etc. in. -- Ivan Lazar Miljenovic Ivan.Miljenovic@gmail.com IvanMiljenovic.wordpress.com

Hi, Am Samstag, den 28.08.2010, 08:18 +1000 schrieb Ivan Lazar Miljenovic:
On 28 August 2010 00:02, Felipe Lessa
wrote: On 8/27/10, Ivan Lazar Miljenovic
wrote: Admittedly, Haskell has no multi-line String support which would make defining something like the "Description" field harder...
Quick correction: Haskell *does* have multi-line strings. For example:
"This is a\
\ nice string"
I meant in the sense of Python, etc. where you didn't have to insert newline characters, etc. in.
A similar problem is solved by http://github.com/jgm/hsb2hs which might be useful if you need to embed larger pieces of text. Greetings, Joachim -- Joachim "nomeata" Breitner mail: mail@joachim-breitner.de | ICQ# 74513189 | GPG-Key: 4743206C JID: nomeata@joachim-breitner.de | http://www.joachim-breitner.de/ Debian Developer: nomeata@debian.org

On 28 August 2010 20:55, Joachim Breitner
Hi,
Am Samstag, den 28.08.2010, 08:18 +1000 schrieb Ivan Lazar Miljenovic:
On 28 August 2010 00:02, Felipe Lessa
wrote: On 8/27/10, Ivan Lazar Miljenovic
wrote: Admittedly, Haskell has no multi-line String support which would make defining something like the "Description" field harder...
Quick correction: Haskell *does* have multi-line strings. For example:
"This is a\
\ nice string"
I meant in the sense of Python, etc. where you didn't have to insert newline characters, etc. in.
A similar problem is solved by http://github.com/jgm/hsb2hs which might be useful if you need to embed larger pieces of text.
Huh, very nice; thanks for the link. Unfortunately, it doesn't look like John has released a version on Hackage yet :( Then again, the first commit on github was the beginning of this month... -- Ivan Lazar Miljenovic Ivan.Miljenovic@gmail.com IvanMiljenovic.wordpress.com

Hi, Am Samstag, den 28.08.2010, 21:11 +1000 schrieb Ivan Lazar Miljenovic:
A similar problem is solved by http://github.com/jgm/hsb2hs which might be useful if you need to embed larger pieces of text.
Huh, very nice; thanks for the link.
Unfortunately, it doesn't look like John has released a version on Hackage yet :(
Then again, the first commit on github was the beginning of this month...
probably won’t happen, unless someone steps up as a maintainer (you?): http://www.haskell.org/pipermail/haskell-cafe/2010-August/081398.html Hmm, now that I look at the thread, you took part in it (when the discussion turned to TH on different arches) :-) Greetings, Joachim -- Joachim "nomeata" Breitner mail: mail@joachim-breitner.de | ICQ# 74513189 | GPG-Key: 4743206C JID: nomeata@joachim-breitner.de | http://www.joachim-breitner.de/ Debian Developer: nomeata@debian.org

On 28 August 2010 21:33, Joachim Breitner
Hi,
Am Samstag, den 28.08.2010, 21:11 +1000 schrieb Ivan Lazar Miljenovic:
A similar problem is solved by http://github.com/jgm/hsb2hs which might be useful if you need to embed larger pieces of text.
Huh, very nice; thanks for the link.
Unfortunately, it doesn't look like John has released a version on Hackage yet :(
Then again, the first commit on github was the beginning of this month...
probably won’t happen, unless someone steps up as a maintainer (you?): http://www.haskell.org/pipermail/haskell-cafe/2010-August/081398.html
Hmm, now that I look at the thread, you took part in it (when the discussion turned to TH on different arches) :-)
Huh, that goes to show how many mailing list messages there are, that I forgot about that thread ;-) I'd be willing to take over maintainership, except I have about 3 different libraries I'm working on that need a lot of TLC, and I don't really have a use case for hsb2hs (I don't do that much with Strings). -- Ivan Lazar Miljenovic Ivan.Miljenovic@gmail.com IvanMiljenovic.wordpress.com

On Fri, Aug 27, 2010 at 7:40 AM, Andrew Coppin
Unfortunately, I haven't found anything for Windows yet which has syntax highlighting for Haskell.
I use SciTE, which has highlighting for a bazillion languages (including XML and JSON), but not Haskell sadly.
Veering somewhat offtopic, but last time I checked, SciTE does have lexer support for Haskell, it just doesn't actually include (for unknown reasons) a language properties file to go with it. If you give it one, syntax highlighting mostly works. You can write your own if you like--the .properties files have a pretty simple "property.name=value" syntax, which is mildly amusing in the context of this email thread--or borrow someone else's, such as this one: http://www4.in.tum.de/~haftmann/resources/haskell.properties A few tweaks in the global properties are required to get everything working--I don't remember the details, but it didn't take me long to figure it out. Also, on Windows, I'm aware of at least Notepad++ that has some very basic syntax highlighting for Haskell working out of the box. It's based on Scintilla, as well, so should feel comfortable to someone accustomed to SciTE. - C.
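To give a flavour of the format described above, a SciTE .properties fragment might look something like this (property names adapted from SciTE's general conventions rather than copied from the real haskell.properties file, so treat the details as illustrative):

```
# Associate file extensions with a pattern, then point that pattern at
# Scintilla's Haskell lexer and give it some keywords to highlight.
file.patterns.haskell=*.hs;*.lhs
lexer.$(file.patterns.haskell)=haskell
keywords.$(file.patterns.haskell)=case class data do else if import in \
    instance let module newtype of then type where
```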

There is also Leksah and GVim
On Fri, Aug 27, 2010 at 2:14 PM, C. McCann
On Fri, Aug 27, 2010 at 7:40 AM, Andrew Coppin
wrote: Unfortunately, I haven't found anything for Windows yet which has syntax highlighting for Haskell.
I use SciTE, which has highlighting for a bazillion languages (including XML and JSON), but not Haskell sadly.
Veering somewhat offtopic, but last time I checked, SciTE does have lexer support for Haskell, it just doesn't actually include (for unknown reasons) a language properties file to go with it. If you give it one, syntax highlighting mostly works. You can write your own if you like--the .properties files have a pretty simple "property.name=value" syntax, which is mildly amusing in the context of this email thread--or borrow someone else's, such as this one: http://www4.in.tum.de/~haftmann/resources/haskell.properties A few tweaks in the global properties are required to get everything working--I don't remember the details, but it didn't take me long to figure it out.
Also, on Windows, I'm aware of at least Notepad++ that has some very basic syntax highlighting for Haskell working out of the box. It's based on Scintilla, as well, so should feel comfortable to someone accustomed to SciTE.
- C. _______________________________________________ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe

On 28 Aug 2010, at 04:58, Andrew Coppin wrote:
Mathew de Detrich wrote:
There is also Leksah and GVim
The Leksah that I recently noted can't be built on Windows?
I am not familiar with that Leksah :-) Seriously though here is one way to build it...

Install Haskell Platform.

Use TakeOffGW (http://sourceforge.net/projects/takeoffgw/) to install :
* pkg-config (in the Devel section of the installer)
* gtksourceview (Libs)
* gtksourceview-devel (Libs)
* automake1.9-bin (Devel)
* msys-bash-bin (MSYS)
* msys-gawk-bin (MSYS)
* msys-grep-bin (MSYS)
* msys-sed-bin (MSYS)
* msys-tar-bin (MSYS)

(Once GHC 6.14.1 is released, only the first three should be needed, as the rest are needed to build process-leksah.)

Make sure the following are in your PATH :
* <Haskell Platform>\mingw\bin
* c:\mingw\i686-pc-mingw32\sys-root\mingw\bin (needed mainly for pkg-config)
* <Your Home Folder>\AppData\Roaming\cabal\bin (will be needed for the gtk2hs build tools once they are installed)
* c:\mingw\bin

Also add the following to your environment :
* PKG_CONFIG_PATH=/i686-pc-mingw32/sys-root/mingw/lib/pkgconfig

Then:
C:\>cabal install gtk2hs-buildtools
C:\>cabal install leksah
C:\>leksah

If you get stuck jump on IRC #leksah or email the leksah group. Hamish

On 29 August 2010 13:24, Hamish Mackenzie
Use TakeOffGW (http://sourceforge.net/projects/takeoffgw/) to install :
Hi Hamish Does TakeOffGW work well in practice? The intentions behind it are admirable but at the moment it seems rather new.

On 30 Aug 2010, at 00:55, Stephen Tetley wrote:
On 29 August 2010 13:24, Hamish Mackenzie
wrote: Use TakeOffGW (http://sourceforge.net/projects/takeoffgw/) to install :
Hi Hamish
Does TakeOffGW work well in practice? The intentions behind it are admirable but at the moment it seems rather new.
They have been able to leverage a lot of the windows cross compilation work done in SUSE. TakeoffGW packages are built on SUSE and it uses the cygwin installer to download and install them. Cygwin installer is fairly mature (has a number of quirks but it works). So far I have only used it to build Leksah, but it has a lot of packages available. Hamish

Isn't there a binary for Leksah on the main site for windows anyways? On 29/08/2010 10:24 PM, "Hamish Mackenzie" < hamish.k.mackenzie@googlemail.com> wrote: On 28 Aug 2010, at 04:58, Andrew Coppin wrote:
Mathew de Detrich wrote:
There is also Leksah a... I am not familiar with that Leksah :-)

C. McCann wrote:
On Fri, Aug 27, 2010 at 7:40 AM, Andrew Coppin
wrote: Unfortunately, I haven't found anything for Windows yet which has syntax highlighting for Haskell.
I use SciTE, which has highlighting for a bazillion languages (including XML and JSON), but not Haskell sadly.
Veering somewhat offtopic, but last time I checked, SciTE does have lexer support for Haskell, it just doesn't actually include (for unknown reasons) a language properties file to go with it.
OK. Well maybe it's just the version I've got then? Or maybe, as you say, because it's not enabled I don't know it's there.
If you give it one, syntax highlighting mostly works. You can write your own if you like--the .properties files have a pretty simple "property.name=value" syntax, which is mildly amusing in the context of this email thread--or borrow someone else's, such as this one: http://www4.in.tum.de/~haftmann/resources/haskell.properties A few tweaks in the global properties are required to get everything working--I don't remember the details, but it didn't take me long to figure it out.
Ah yes. The reason I seldom update SciTE is that it then takes hours to put all the configuration back to the way I like it. (Especially if option names have changed or defaults are different now.) SciTE is a nice editor, but not especially well documented. (And, what, they haven't made a configuration editor yet? :-P )

On Aug 27, 2010, at 11:40 PM, Andrew Coppin wrote:
What all this proves is that XML can be horrid, or it can be just fine. Personally, I would have no problem with writing
<Name>foo</Name> <Version>1.0</Version> <Synopsis>This does stuff.</Synopsis>
Now I don't have to worry about whitespace; XML has rules for all that.
Multiple conflicting sets of rules, in fact. There _is_ a notion of "element content white space", however, non-validating parsers like expat can't KNOW which white space is ecws and which is not.
Huh, I was under the impression that you could just use unix-style file paths in a relative fashion with Cabal even on Windows...
Uhuh, and how do you specify whether the files are under C: or D:?
Since early MS-DOS days, / and \ have been interchangeable except in the command language. If you want C:\FOO\BAR to pass to a system function, C:/FOO/BAR will work just as well.

Andrew Coppin wrote:
Windows has more package management facilities than most people realise.
For example, go install Office 2007. In fact, just install Excel 2007, not the whole thing. Windows Installer can automatically figure out that you *do* need to install the Spell Checker (since Excel uses that), but you do *not* need to install the Grammar Checker (since only Word and PowerPoint use that, and you haven't selected to install those). Not only does it decide what to install, but you can query it
That's a specific installer for a specific program. The whole problem with windows is that every 3rd party program is responsible for its own installation and removal, and is free to do that in its own way. It also encourages monolithic installers, installers that include everything.

However, the software you are complaining about is mostly FOSS software that had its genesis on Linux/Unix, and assumes that build dependencies can be resolved at compile time and that install dependencies can be resolved at install time. Windows of course fails these two assumptions completely. Until something like apt-get becomes popular, widespread and well supported, you are going to continue to feel pain.

I suggest that you throw your support behind something like GetIt: http://www.puchisoft.com/GetIt/ because hoping that Linux and Mac devs will fix windows problems is not going to get you anywhere.
I guess it depends on whether you think your students are going into datacenter support (probably Unix) or desktop support or application development (obviously all desktops are Windows).
Do you know the parable of the blind men and the elephant? https://secure.wikimedia.org/wikipedia/en/wiki/Blind_men_and_an_elephant Here's a funny thing. I know a large number of professional software engineers and people who mix that with sys admin work. Only a tiny fraction of those people write code for the windows platform. Do I conclude from my data that most developers develop for Linux? Erik -- ---------------------------------------------------------------------- Erik de Castro Lopo http://www.mega-nerd.com/

Erik de Castro Lopo wrote:
Andrew Coppin wrote:
Windows has more package management facilities than most people realise.
For example, go install Office 2007. In fact, just install Excel 2007, not the whole thing. Windows Installer can automatically figure out that you *do* need to install the Spell Checker (since Excel uses that), but you do *not* need to install the Grammar Checker (since only Word and PowerPoint use that, and you haven't selected to install those). Not only does it decide what to install, but you can query it
Thats a specific installer for a specific program.
It is. But my point is, it's not the program installer itself that provides all this functionality. It's the Windows operating system. A 3rd party program can, for example, ask if component X of MS Office is installed, and if so, where it's located. You can set things up so that if some 3rd party program needs to access a feature of MS Office that isn't currently installed, it gets installed (if the installation files are cached; it used to ask you to insert the install CD).

For native Windows stuff, there are ways to check what version of DirectX is installed and update it if necessary; you don't have to write that code yourself, it's provided by the OS. Just like under (say) Debian, you don't have to implement apt-get yourself, it's provided for you.

About the only thing *not* provided is the ability to automatically fetch the installer for whatever it is you need. And that's because usually, to obtain the installer, you need to *pay money* to whatever company it is that *sells* it.
The whole problem with windows is that every 3rd party program is responsible for its own installation and removal and is free to do that in its own way. It also encourgaes monolithic installers, installers that include everything.
Each application can of course install itself in its own way. For some, that's as simple as unzipping an install image and putting it somewhere convenient. For others, it means running an elaborate installation program. But the point I'm trying to get at is, there *is* a standard installation system (which you can of course choose not to package your application with), and if you use it, it gives you things like dependency resolution and telling you what stuff is installed where and so on.

I agree with your last point, however; not having a central location where software can be obtained from *does* encourage monolithic installers. The installer is essentially a mini repo that contains the package and all its dependencies; it then uses Windows to decide which [if any] of those dependencies are already installed or else need to be installed.
However, the software you are complaining about is mostly FOSS software that had its genesis on Linux/Unix and assumes that build dependencies can be resolved at compile time and that install dependencies can be resolved at install time. Windows of course fails these two assumptions completely.
On Linux, if I do, say, "cabal install zlib", it falls over and tells me it can't find the zlib headers. So I go install them, rerun the command, and it works. On Windows, I issue the same command and it falls over and says that autoconf doesn't exist. It doesn't even *get* to the part where it looks for header files!

Interestingly, even though everybody claims that it's "impossible" to support C bindings on Windows, gtk2hs has managed it somehow. If you try to build it, it complains that it can't find the GTK+ headers. Go install those, add them to the search path, and suddenly it builds just fine. No problems with it. Go figure...
Until something like apt-get becomes popular, widespread and well supported, you are going to continue to feel pain.
As I say, gtk2hs builds just fine today. (The upstream packaging of the Windows GTK+ port leaves a little to be desired, but that's not a Haskell problem.) It's as trivial as unpacking a few zip files and tweaking the search path.
hoping that Linux and Mac devs will fix windows problems is not going to get you anywhere.
How about hoping that Linux and Mac devs are going to realise that Windows doesn't have some of the problems that people claim it does? Hmm, thinking about it... nah, that's not happening anytime soon either. ;-)
I guess it depends on whether you think your students are going into datacenter support (probably Unix) or desktop support or application development (obviously all desktops are Windows).
Here's a funny thing. I know a large number of professional software engineers and people who mix that with sys admin work. Only a tiny fraction of those people write code for the windows platform. Do I conclude from my data that most developers develop for Linux?
...which leads us back to my "I guess it depends" then? My "obviously all desktops are Windows" was not meant to be entirely serious. But it's not exactly a revelation to state that Windows has much greater penetration in the desktop market than either Linux or indeed Mac OS. Linux is much more popular now than it used to be (e.g., I can remember when you had to wear open-toed sandals and eat lentil burgers in order to run Linux), but it's not yet anywhere near the level of popularity of Windows.

On 28/08/10 09:55, Andrew Coppin wrote: [...]
How about hoping that Linux and Mac devs are going to realise that Windows doesn't have some of the problems that people claim it does?
Hmm, thinking about it... nah, that's not happening anytime soon either. ;-)
Can you provide some links to further information, please? /M -- Magnus Therning (OpenPGP: 0xAB4DFBA4) magnus@therning.org Jabber: magnus@therning.org http://therning.org/magnus identi.ca|twitter: magthe

Andrew Coppin wrote:
On Linux, if I do, say, "cabal install zlib", it falls over and tells me it can't find the zlib headers. So I go install them, rerun the command, and it works. On Windows, I issue the same command and it falls over and says that autoconf doesn't exist. It doesn't even *get* to the part where it looks for header files!
You are trying to build code that is designed on and for Linux. As such it will probably work on all variants of Linux, Mac OSX and a majority of Unix variants (after installation of the required GNU tools). Unsurprisingly it does not work on windows, because windows does just about everything differently to how Linux and the rest of the world do it.
Interestingly, even though everybody claims that it's "impossible" to support C bindings on Windows, gtk2hs has managed it somehow. If you try to built it, it complains that it can't find the GTK+ headers. Go install those, add them to the search path, and suddenly it builds just fine. No problems with it. Go figure...
The reason that works is probably because whoever released it had a windows machine available and took the time to make it work. In general, code written on and for Linux/Unix is going to compile with little problem on most Unix-style OSes, but has close to zero chance of compiling without significant work on windows.
How about hoping that Linux and Mac devs are going to realise that Windows doesn't have some of the problems that people claim it does?
The problems I claim windows has with respect to compiling and installing FOSS:

a) No standard place to find C include files.
b) No standard place to find libraries.
c) No standard way to find if common open source libraries are installed and where.
d) Missing common unix tools like bash, awk, sed, grep, make, autoconf, automake, libtool, pkg-config etc.

Ideally, for installing open source libraries, the tools used should be the same as the ones used on Linux/Unix where they originated.
My "obviously all desktops are Windows" was not meant to be entirely serious. But it's not exactly a revelation to state that Windows has much greater penetration in the desktop market than either Linux or indeed Mac OS. Linux is much more popular now than it used to be (e.g., I can remember when you had to wear open-toed sandals and eat lentil burgers in order to run Linux), but it's not yet anywhere near the level of popularity of Windows.
Your assessment is valid for user desktops but highly questionable for developer desktops. Erik -- ---------------------------------------------------------------------- Erik de Castro Lopo http://www.mega-nerd.com/

Hmm, Sunday morning reply before caffeine. Erik de Castro Lopo wrote:
Andrew Coppin wrote:
On Linux, if I do, say, "cabal install zlib", it falls over and tells me it can't find the zlib headers. So I go install them, rerun the command, and it works. On Windows, I issue the same command and it falls over and says that autoconf doesn't exist. It doesn't even *get* to the part where it looks for header files!
You are trying to build code that is designed on and for Linux. As such it will probably work on all variants of Linux, Mac OSX and a majority of Unix variants (after installation of the required GNU tools).
Unsurprisingly it does work on windows because windows because windows
^not
does just about everything differently to the Linux and the rest of the world does it.
Interestingly, even though everybody claims that it's "impossible" to support C bindings on Windows, gtk2hs has managed it somehow. If you try to built it, it complains that it can't find the GTK+ headers. Go install those, add them to the search path, and suddenly it builds just fine. No problems with it. Go figure...
The reason that works is probably because whoever released it had a windows machine available and took the time to make it work.
In general, code written on and for Linux/Unix is not going compile
Remove 'not' in the line above.
with little problem on most Unix-style OSes and close to zero chance of compiling without siginficant work on windows.
Erik -- ---------------------------------------------------------------------- Erik de Castro Lopo http://www.mega-nerd.com/

Erik de Castro Lopo wrote:
Andrew Coppin wrote:
On Linux, if I do, say, "cabal install zlib", it falls over and tells me it can't find the zlib headers. So I go install them, rerun the command, and it works. On Windows, I issue the same command and it falls over and says that autoconf doesn't exist. It doesn't even *get* to the part where it looks for header files!
You are trying to build code that is designed on and for Linux. As such it will probably work on all variants of Linux, Mac OSX and a majority of Unix variants (after installation of the required GNU tools).
Unsurprisingly it does not work on windows, because windows does just about everything differently to how Linux and the rest of the world do it.
The C zlib library itself works just fine on just about every platform I'm aware of (including Windows). What doesn't work is the Haskell binding to it. (And even that does work if you can get it to build - e.g., IIRC zlib is now in HP.)
Interestingly, even though everybody claims that it's "impossible" to support C bindings on Windows, gtk2hs has managed it somehow.
The reason that works is probably because whoever released it had a windows machine available and took the time to make it work.
In general, code written on and for Linux/Unix is going to compile with little problem on most Unix-style OSes, but has close to zero chance of compiling without significant work on windows.
Much like the chance of OSS written for Windows working under Unix without a lot of work. (Yes, people write OSS for Windows too.)
The problems I claim windows has with respect to compiling and installing FOSS:
a) No standard place to find C include files. b) No standard place to find libraries. c) No standard way to find if common open source libraries are installed and where.
As best as I can tell, the Unix Standard Way(tm) to do this kind of thing is to put files into "well known" locations so that they can be easily found. (The fact that tools like autoconf need to exist tells you something about how tricky this can be.)

The Windows Standard Way(tm) is to install your stuff wherever you like, and then record its location in the Registry. Generally this applies more to run-time resources; I'll grant you that compile-time resources are more tricky. I think the expectation is that you'll use some kind of IDE, configure it to say where the header files and so forth are, and it will generate the appropriate command strings to build and link everything.

For whatever reason, developers not native to Windows tend to avoid the Registry, so libraries ported to Windows (rather than native to it) tend not to register themselves in the Registry. So that's still perhaps not much help here.

Regardless, you'd think Cabal could provide some way to make it "easy" to state where the files it needs actually are. Currently it does not.
d) Missing common unix tools like bash, awk, sed, grep, make, autoconf, automake, libtool, pkg-config etc.
If all you're trying to do is compile some Haskell files that include a few C headers, arguably you shouldn't need any of these things (except perhaps for discovering where your files are). If you were trying to build the underlying C library itself... well, *that* is another task entirely. C tends to be highly non-portable. Fortunately, most interesting C libraries have already been ported by some kind soul; it's just a question of building the Haskell bindings.
Ideally for installing open source libraries the tools used should be the same as the ones used on Linux/Unix where they originated.
Not all open source libraries originate on Unix. Some of them are actually native to Windows. But anyway, that's tangential to this discussion...
My "obviously all desktops are Windows" was not meant to be entirely serious. But it's not exactly a revelation to state that Windows has much greater penetration in the desktop market than either Linux or indeed Mac OS.
Your assessment is valid for user desktops but highly questionable for developer desktops.
I can go along with that.

On 29 August 2010 21:46, Andrew Coppin
The problems I claim windows has with respect to compiling and installing FOSS:
a) No standard place to find C include files. b) No standard place to find libraries. c) No standard way to find if common open source libraries are installed and where.
As best as I can tell, the Unix Standard Way(tm) to do this kind of thing is to put files into "well known" locations so that they can be easily found. (The fact that tools like autoconf need to exist tells you something about how tricky this can be.)
No, autoconf has nothing to do with where the "well known" locations are. It's having to deal with different versions, etc. of libraries.
The Windows Standard Way(tm) is to install your stuff wherever you like, and then record its location in the Registry. Generally this applies more to run-time resources; I'll grant you that compile-time resources are more tricky. I think the expectation is that you'll use some kind of IDE, configure it to say where the header files and so forth are, and it will generate the appropriate command strings to build and link everything.
On Unix, most things use shared libraries; on Windows you typically seem to bundle libraries a lot more often (which then annoys distro developers when they have to clean up the resulting mess; Firefox is a prime culprit of this).
Regardless, you'd think Cabal could provide some way to make it "easy" to state where the files it needs actually are. Currently it does not.
Well, it uses ghc-pkg to record where the various libraries, etc. are. Otherwise, it could be that none of the Cabal developers are really that familiar with the "best practices" of developing Windows software (and clobbering the registry whilst you're at it). By the way, is it possible to have a globally installed library in Windows (for C, etc.) that can be used no matter which IDE or editor you use? Or does each IDE manage all that on its own?
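[As an aside: Cabal does at least let a package *state* nonstandard header and library directories by hand, even if it has no way to discover them. A sketch of a hypothetical .cabal fragment; the package name and paths are invented for illustration, while the field names are the documented extra-include-dirs/extra-lib-dirs ones:]

```
name:           my-clib-binding
version:        0.1
build-type:     Simple

library
  exposed-modules:    Bindings.MyCLib
  build-depends:      base
  extra-libraries:    myclib
  -- forward slashes sidestep the backslash-escaping question on Windows
  extra-include-dirs: c:/libs/myclib/include
  extra-lib-dirs:     c:/libs/myclib/lib
```

[The same directories can also be given per build at configure time, e.g. runhaskell Setup.hs configure --extra-include-dirs=c:/libs/myclib/include --extra-lib-dirs=c:/libs/myclib/lib. Neither removes the need to know where the files are in the first place.]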
Ideally for installing open source libraries the tools used should be the same as the ones used on Linux/Unix where they originated.
Not all open source libraries originate on Unix. Some of them are actually native to Windows. But anyway, that's tangential to this discussion...
Though I know of more libraries that originate on *nix and migrate to Windows than the other way round...
My "obviously all desktops are Windows" was not meant to be entirely serious. But it's not exactly a revelation to state that Windows has much greater penetration in the desktop market than either Linux or indeed Mac OS.
Your assessment is valid for user desktops but highly questionable for developer desktops.
I can go along with that.
After all, who cares about users? :p -- Ivan Lazar Miljenovic Ivan.Miljenovic@gmail.com IvanMiljenovic.wordpress.com

On 8/29/10 08:05, Ivan Lazar Miljenovic wrote:
On 29 August 2010 21:46, Andrew Coppin
wrote:
a) No standard place to find C include files.
b) No standard place to find libraries.
c) No standard way to find if common open source libraries are installed and where.
As best as I can tell, the Unix Standard Way(tm) to do this kind of thing is to put files into "well known" locations so that they can be easily found. (The fact that tools like autoconf need to exist tells you something about how tricky this can be.)
No, autoconf has nothing to do with where the "well known" locations are. It's having to deal with different versions, etc. of libraries.
Not entirely true; it also deals (or used to deal) with the fact that you may have stuff in /opt/SUNWsft (Solaris), /opt/kde (SuSE), /usr/local, /opt/local (MacPorts), /sw (Fink), etc. Then again, that's what pkg-config deals with these days, leaving autoconf to deal with different APIs/ABIs (different versions, different build options) as long as the software you're building is up to date.
Regardless, you'd think Cabal could provide some way to make it "easy" to state where the files it needs actually are. Currently it does not.
Well, it uses ghc-pkg to record where the various libraries, etc. are. Otherwise, it could be that none of the Cabal developers are really that familiar with the "best practices" of developing Windows software (and clobbering the registry whilst you're at it).
Note that Cabal is no better at tracking location of non-Haskell resources on Unix. (In fact, isn't that what started this topic?)
By the way, is it possible to have a globally installed library in Windows (for C, etc.) that can be used no matter which IDE or editor you use? Or does each IDE manage all that on its own?
DLLs can be put into C:\WINDOWS\SYSTEM32 or equivalent (e.g. Windows NT liked to install itself in C:\WINNT instead of C:\WINDOWS). LIB files are less standard and I'm under the impression that every IDE uses its own notion of where to put them (and may not use the registry in a non-opaque way). BTW, if there *is* some standard registry tree that can be used for this, it should be possible to provide a Windows version of pkg-config that would hide most of this. Replacing autoconf is harder, though it might be possible to work from configure.in (or even configure.am when automake is involved). -- brandon s. allbery [linux,solaris,freebsd,perl] allbery@kf8nh.com system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu electrical and computer engineering, carnegie mellon university KF8NH

On 8/29/10 13:06, Brandon S Allbery KF8NH wrote:
may have stuff in /opt/SUNWsft (Solaris), /opt/kde (SuSE), /usr/local,
Wrong path for Solaris. *sigh* We don't use Sun's OSS package, in part because we don't have anything newer than Solaris 9. -- brandon s. allbery [linux,solaris,freebsd,perl] allbery@kf8nh.com system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu electrical and computer engineering, carnegie mellon university KF8NH

On 29 August 2010 18:06, Brandon S Allbery KF8NH
DLLs can be put into C:\WINDOWS\SYSTEM32 or equivalent (e.g. Windows NT liked to install itself in C:\WINNT instead of C:\WINDOWS). LIB files are less standard and I'm under the impression that every IDE uses its own notion of where to put them (and may not use the registry in a non-opaque way).
BTW, if there *is* some standard registry tree that can be used for this, it should be possible to provide a Windows version of pkg-config that would hide most of this. Replacing autoconf is harder, though it might be possible to work from configure.in (or even configure.am when automake is involved).
Windows has a standard place for header files:

  <path-to-MinGW>\MinGW\include

Similarly for .a's and .o's:

  <path-to-MinGW>\MinGW\lib

For "/usr/local" installs the path is:

  <path-to-msys>\msys\1.0\local

with bin, include, lib and share comfortably placed in local. ./configure && make && make install will do the right thing for installing source packages. Binary packages are available from MinGW's repository. It's a de facto standard, but it's still a standard. If people are using Cygwin or Microsoft's Unix compatibility layer, Visual C or even the parts of MinGW distributed with GHC, they aren't documenting their successes so no-one else can follow them; for all intents and purposes MinGW/Msys is the only game in town. [Caveat - Cygwin is fine for developing if you just want a good shell and aren't working with FFI bindings].
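[For concreteness, that flow looks something like the following from an MSYS shell. This is a sketch only; it assumes an unpacked C library source tree and the MinGW/MSYS layout described above:]

```
# inside the unpacked source of some C library, from an MSYS shell;
# --prefix=/usr/local maps onto <path-to-msys>\msys\1.0\local
./configure --prefix=/usr/local
make
make install
# headers now live under /usr/local/include and libraries under
# /usr/local/lib, where a later Haskell binding build can find them
```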

Stephen Tetley wrote:
Windows has a standard place for header files
<path-to-MinGW>\MinGW\include
Isn't that "MinGW has a standard place for header files"? I'm guessing if you use DJGPP or MS VisualStudio or Borland C++, it's not going to look there (unless you tell it to).

On 30 August 2010 11:26, Andrew Coppin
Stephen Tetley wrote:
Windows has a standard place for header files
<path-to-MinGW>\MinGW\include
Isn't that "MinGW has a standard place for header files"?
Strictly speaking it's "Haskell-on-Windows has a standard place for header files".

On 8/30/10 06:26, Andrew Coppin wrote:
Stephen Tetley wrote:
<path-to-MinGW>\MinGW\include
Isn't that "MinGW has a standard place for header files"?
I'm guessing if you use DJGPP or MS VisualStudio or Borland C++, it's not going to look there (unless you tell it to).
Presumably that's what he meant by
It's a de facto standard, but it's still a standard. If people are using Cygwin or Microsoft's Unix compatibility layer, Visual C or even the parts of MinGW distributed with GHC, they aren't documenting their successes so no-one else can follow them; for all intents and purposes MinGW/Msys is the only game in town.
Again (echoing both the above and an earlier message of mine): if you can tell us(*) what to do to make things visible to VS, please do. Or contribute patches to Cabal that interoperate with VS. (*) generic "us"; I don't currently work on Cabal -- brandon s. allbery [linux,solaris,freebsd,perl] allbery@kf8nh.com system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu electrical and computer engineering, carnegie mellon university KF8NH

Regardless, you'd think Cabal could provide some way to make it "easy" to state where the files it needs actually are. Currently it does not.
Well, it uses ghc-pkg to record where the various libraries, etc. are. Otherwise, it could be that none of the Cabal developers are really that familiar with the "best practices" of developing Windows software (and clobbering the registry whilst your at it).
Note that Cabal is no better at tracking location of non-Haskell resources on Unix. (In fact, isn't that what started this topic?)
Indeed. Windows lacks a standard place for developer files.
By the way, is it possible to have a globally installed library in Windows (for C, etc.) that can be used no matter which IDE or editor you use? Or does each IDE manage all that on its own?
DLLs can be put into C:\WINDOWS\SYSTEM32 or equivalent.
More correctly: DLLs can be put *anywhere you like*, so long as they are registered in the Registry. And yes, it's quite possible to install libraries system-wide. Take DirectX, for example. Just about every Windows computer game installer begins by testing whether DirectX is installed, what version, and installing the version from the CD if newer (which it almost never is). [Actually, the cheap games just run the installer whatever; fortunately, the MS installer checks, sees it's already installed, and doesn't reinstall it again.] The DirectX DLLs aren't installed in a standard place; they're registered in the Registry, and that's how you find them. It's all about the Registry.
LIB files are less standard and I'm under the impression that every IDE uses its own notion of where to put them (and may not use the registry in a non-opaque way).
As best as I can tell, Windows has a standard way to locate *run-time* resources (e.g., DLLs), but not for *compile-time* resources (header files, LIB files, etc.) As I say, I think the intention is that once you've built the thing, it should run on any Windows box - which means it needs to be able to find stuff. But on the development box, you're expected to do the legwork to tell it where stuff is (or rather, tell the IDE which then manages stuff for you). Windows really is GUI-oriented rather than CLI-oriented. Perhaps you've noticed...

I agree with this comment in regards to cabal building binaries, for similar reasons to John Meacham's. Cabal is fine for libraries (in fact I can classify it as pretty damn good), but for binaries it is a different matter for programs that don't use a simple build system/structure. Cabal is just too inflexible, and then you have the issues with no uninstall (if the binary also happens to create libraries), which just makes things more painful.

In all honesty, using a CMake-like system for Haskell would have had a lot more advantages than cabal for binaries.

The minute you have to install binaries on Windows using cabal that bring in C (and other dependencies), things are much more complicated than they should be and are likely to break far more often than is considered sane.

In my opinion this is one of the few things that is holding Haskell back in regards to it being adopted by Windows users (among other things).
On Fri, Aug 27, 2010 at 9:13 AM, Andrew Coppin
Simon Marlow wrote:
If you look at the original Cabal design document[1], you'll see that one of the goals of Cabal was to be the glue that lets you convert an arbitrary Haskell library into a native package for a variety of systems - including MSIs on Windows. Indeed, I must admit when we were designing Cabal I thought that native packages would be the most common way that people would install Cabal packages, specifically because many systems already have a good package manager, and trying to bypass the system package manager would be a fundamental mistake. It turned out that cabal-install would be a lot more useful than I imagined, but the two systems are complementary: native packages are for installing globally, and cabal-install is for installing packages in your home directory.
Why would you ever want to install a package per-user? I mean, if you don't have permission to do a global install, then you also don't have permission to install GHC in the first place so...? Indeed, the *only* plausible reason I can think of is if you're trying to build something that has unusual package version constraints, and you want to build it without upsetting the entire system.
Even on systems without a package manager (i.e. Windows), it would make
more sense when installing a package globally to build an MSI first, so that the system can track the installation and let you uninstall it later.
I did have a look at building a binary installer using Nullsoft NSIS. Unfortunately, I don't know of any tool in existence that can build MSI files that isn't absurdly expensive. (E.g., InstallShield is ~£4,000, which is extortionate for a program that just copies files around. Even BackupExec isn't *that* expensive, and that's mission-critical!)
Of course, *I* was looking at NSIS specifically for installing Haskell-to-C bindings. These are virtually impossible to build on Windows, and I figured if I could build such a package once, I could then make a binary installer out of it and never again have to build it from source. (Until the next GHC version, anyway.) But I utterly failed to make the building part work, so I never got to the next bit.
If you were to use binary installers for regular Haskell packages, the only real benefit would be that you can now UNinstall them again. It might be worth doing that, and it looks plausible that you could automate it...
[1] http://www.haskell.org/cabal/proposal/
Interesting. So Cabal was never intended to handle executables at all. (The entire proposal speaks only about *libraries*.) Suddenly several of Cabal's deficiencies make a lot more sense. It doesn't handle executables properly because it was never designed to. It doesn't uninstall because Cabal packages are supposed to be converted into real packages first, and real package managers provide uninstall capabilities. And so on.
It's slightly disturbing how the proposal mentions "make" every three sentences. You realise that make only exists under Unix, right? There _are_ other operating systems out there...
I also can't for the life of me work out why something *designed for* automatic processing is designed without a machine-readable syntax. Even in this early proposal, Cabal is already using that horrid ad hoc undocumented file format that only Cabal itself can actually parse and understand. Why not XML or JSON or *something* with a formal spec and a wide range of available tools? It makes no sense at all.

And in case somebody is sitting there thinking "It IS documented. It's simple, isn't it?", did you know that file paths have to be escaped like Haskell string literals? No, I bet you didn't. Where is this fact documented? It isn't. Why was this decided? I'm guessing it's an implementation accident rather than a deliberate decision. Now if this were XML or JSON, everybody would already *know* the escaping rules. And we'd have tools that know about these rules and can handle processing such files.

People seem to think that Cabal's existing format makes it easier for humans to read and write, but personally I'm always left wondering exactly which constructions are or aren't permitted. Can I put several values on a line here, or do they have to be on separate lines? Do all the field values have to be indented by the same amount? How does Cabal figure out which fields are subfields anyway?
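[The escaping rule being complained about can be demonstrated in a two-line program: if a field value is parsed like a Haskell string literal, every backslash in a Windows path has to be doubled. A sketch; nothing here is Cabal-specific, it just shows what "escaped like a Haskell string literal" means:]

```haskell
-- If a path is written the way Haskell's 'show' renders it, every
-- backslash is doubled and the whole thing is wrapped in quotes;
-- 'read' undoes it.
main :: IO ()
main = do
  let path    = "C:\\Program Files\\MyLib"  -- the actual path: C:\Program Files\MyLib
      literal = show path                   -- how it must appear in the file
  putStrLn literal                          -- prints "C:\\Program Files\\MyLib"
  print (read literal == path)              -- True: show/read round-trips
```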
In summary, I would be *so* much happier if we had a real file format rather than this ugly home-grown thing. Unfortunately, this would break everything on Hackage, so it will never be fixed.
[2] http://www.haskell.org/pipermail/cabal-devel/2007-August/000740.html
Also interesting. I've never heard of WIX before...
_______________________________________________ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe

On Aug 27, 2010, at 9:13 PM, Andrew Coppin wrote:
Why would you ever want to install a package per-user? I mean, if you don't have permission to do a global install, then you also don't have permission to install GHC in the first place so...? Indeed, the *only* plausible reason I can think of is if you're trying to build something that has unusual package version constraints, and you want to build it without upsetting the entire system.
Scenario: A University Computer Science Department. One or more laboratories bung full of machines. A shared file server with rack upon rack of discs holding all the student-accessible files, in another room. Administrators install required software before each semester. If you are lucky, this includes Haskell. A student needs a package. The student *CAN'T* do a global install. The administrators are busy. The student CAN do a per-user install. The day is saved! Much cheering. So the answer to the question is obvious: you want to install a package per-user if you are NOT THE SAME PERSON as the one who originally installed GHC.
It's slightly disturbing how the proposal meantions "make" every three sentences. You realise that make only exists under Unix, right? There _are_ other operating systems out there...
Like OpenVMS, which has MMS (DEC's clone of Make), MMK (a clone of MMS), GNU Make, and by now several other ports of make. Like Windows, which has had NMake for yonks, plus ports of other makes, e.g., GNU Make at http://unxutils.sourceforge.net/ (not Cygwin, native Win32).

Hi, on Sunday, 22.08.2010, 10:55 +0100, Andrew Coppin wrote:
Browsing around Hackage, I notice that a seemingly random subset of packages are available for something called "arch linux". Presumably some sort of automatic conversion system is involved, but does anyone know why only certain packages appear?
I've noticed that both Debian and OpenSUSE have a very tiny selection of binary Haskell packages too. I'm guessing that these packages are also auto-generated, but presumably selected by hand. (I also don't recall seeing them listed on Hackage.) Anybody know about that?
I wouldn't call almost 200 packages¹ a "very tiny selection" :-) These packages are not auto-generated, but still hand-built and hand-uploaded in every version. The Haskell Team selects the packages, decides whether a version update is required (for example, changes that only fix the buildability on win32 do not warrant an upload to Debian) and fixes bugs. This should be a very stable base with the most important libraries to build on, without any "cabal hell". More information can be found on http://wiki.debian.org/Haskell.

The distro listing on Hackage was actually implemented by me a while ago; the text file Ivan mentioned can be found at http://people.debian.org/~nomeata/cabalDebianMap.txt and is generated daily by a cron job.

Greetings, Joachim

¹ http://pkg-haskell.alioth.debian.org/cgi-bin/pet.cgi plus a few packages not maintained by the team -- Joachim "nomeata" Breitner mail: mail@joachim-breitner.de | ICQ# 74513189 | GPG-Key: 4743206C JID: nomeata@joachim-breitner.de | http://www.joachim-breitner.de/ Debian Developer: nomeata@debian.org
participants (21)
-
aditya siram
-
Alexander Solla
-
Andrew Coppin
-
Brandon S Allbery KF8NH
-
C. McCann
-
David Leimbach
-
Don Stewart
-
Erik de Castro Lopo
-
Felipe Lessa
-
Hamish Mackenzie
-
Ivan Lazar Miljenovic
-
Ivan S. Freitas
-
Joachim Breitner
-
John Millikin
-
Jonas Almström Duregård
-
Magnus Therning
-
Mathew de Detrich
-
Richard O'Keefe
-
Simon Farnsworth
-
Simon Marlow
-
Stephen Tetley