Re: Debian library packaging? or, how to get a single deb to work for hugs/ghc/nhc?

Shae Matijs Erisson Wrote:
In short, I'd like to see some libraries packaged for Debian. I'm especially interested in seeing the various GUI libs arrive. My question, though, is how to package a lib in such a way that it works for all three of the packaged Haskell compilers?
There are related discussions on various lists:
- On Haskell-cafe, we talked about packaging Debian stuff: http://haskell.org/pipermail/haskell-cafe/2002-December/003708.html
- Also on Haskell-cafe, there was a small discussion on the state of libraries and the lack of a central repository for libraries: http://haskell.org/pipermail/haskell-cafe/2002-October/003516.html
- On Glasgow-haskell-users, there was a discussion about how to create a system for authors of 3rd-party libraries independently of a particular compiler: http://haskell.org/pipermail/glasgow-haskell-users/2002-November/004322.html
From Simon Marlow's comments:
Suppose we had an infrastructure which someone could plug their library source code into, and immediately get
- a source tarball which will configure/build/install on a Unix/cygwin system with GHC installed, and provide a GHC package when installed.
- skeleton binary packages for RPM, Windows installer, *BSD ports, Gentoo ebuild, etc. The binary packages will require some programmer intervention (such as setting the dependencies correctly), but much of the work can be done automatically.
Though I haven't used it extensively yet, hmake seems like a good system to start with: it has nice flags like "-nhc" and "-ghc", it can be configured to use either, it has sane defaults, etc. FPTools does have a nice makefile system that 3rd-party developers could adopt, though at the moment it sort of relies on those libraries being inside a GHC source directory, and I don't know how well it would work with nhc or Hugs.

Several people have commented that the problems with Haskell libraries and compilers are similar to the problems with elisp. Maybe a solution would be to augment hmake with an idea of registering packages and (re)compiling them when new or different compilers are installed. I'm starting to like this idea a lot better than trying to make the packages depend on the particular compilers and versions. I notice that the Haskell packages sometimes get out of date, and it would be hard to get all the maintainers coordinated enough to release packages at the same time a new compiler comes out.

More thoughts?

peace, isaac

To introduce myself: I'm a Debian developer and a student at Göteborg University/Chalmers (I've got Haskell in my blood ;)), and I'm currently considering packaging alex and bnfc. On Tue 2003-01-28 at 16:34, Isaac Jones wrote:
Several people have commented that the problems with Haskell libraries and compilers are similar to the problems with elisp. Maybe a solution would be to augment hmake with an idea of registering packages and {re}compiling them when new / different compilers are installed.
I have no idea how backwards compatible GHC is with respect to libraries. If the ABI changes a lot between releases, that's a problem, but if it's moderately predictable we can have dependencies of the form ghc5 (>= X.Y), ghc5 (<< X.Y+1). I don't see why that wouldn't work. The same is done for Python: pure Python packages use dependencies like python (>= 2.2), python (<< 2.3).
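Martin's version-bracket scheme can be made concrete with a small sketch. The helper below is purely illustrative (no such tool exists in the Debian toolchain); it just shows how a ghc5 dependency bracket pinning a package to one minor series would be derived:

```python
def ghc_dependency(version: str) -> str:
    """Hypothetical helper: given a GHC release like '5.04', produce a
    Debian dependency bracket that pins a package to that minor series,
    in the spirit of ghc5 (>= X.Y), ghc5 (<< X.Y+1)."""
    major, minor = version.split(".")[:2]
    # X.Y+1, zero-padded to match GHC's two-digit minor numbering
    next_minor = f"{major}.{int(minor) + 1:02d}"
    return f"ghc{major} (>= {version}), ghc{major} (<< {next_minor})"

print(ghc_dependency("5.04"))
```

For "5.04" this yields "ghc5 (>= 5.04), ghc5 (<< 5.05)", analogous to the python (>= 2.2), python (<< 2.3) brackets used for pure Python packages.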
I'm starting to like this idea a lot better than trying to make the packages depend on the particular compilers and versions. I notice that the Haskell packages sometimes get out of date, and it would be hard to get all the maintainers coordinated enough to release packages at the same time a new compiler comes out.
The whole idea of binary distribution is to compile things once and let others download and install the binary and be done with it. Compiling Haskell programs of moderate size on a not-very-recent computer takes quite some time, especially if you want to optimize too, and would mean a *very slow* installation procedure. That is simply unacceptable. Byte-compiling, as e.g. Python does, is in my experience a lot faster.

/Martin

Martin Sjögren wrote:
I have no idea how backwards compatible GHC is with respect to libraries. If the ABI changes a lot between releases, that's a problem, but if it's moderately predictable we can have dependencies of the form ghc5 (>= X.Y), ghc5 (<< X.Y+1). I don't see why that wouldn't work. [...]
In a nutshell: it doesn't work, even if only the patchlevel of GHC changes and the user-visible ABI stays the same. As has been discussed several times, this is the price one has to pay for the heavy inter-module optimizations GHC does. And recompilation is not always an option, either, e.g. when the package in question has some native parts which rely on development stuff (headers, program generators, etc.) which is normally not installed on the target.

So we're basically left with two kinds of packages, IMHO:
* Pre-compiled ones, tied to a particular compiler release
* Source packages for "real" developers, not just Haskell users

Cheers, S.

Hi Sven and Martin. Thanks for your input...
Sven Panne
And recompilation is not always an option, either, e.g. when the package in question has some native parts which rely on development stuff (headers, program generators, etc.) which is normally not installed on the target.
Do you mean, for instance, that a package requires Happy to build, but Happy might not be installed? In this case, we could make the package depend on Happy. Surely this is better than tying each package to a specific compiler release.
So we're basically left with two kinds of packages, IMHO:
* Pre-compiled ones, tied to a particular compiler release
And whenever there is a new release of each compiler, all the package maintainers are going to have to recompile their packages and upload them, right? I guess this is pretty OK once stuff gets into stable, but in unstable I think packages would be broken often, and for long periods. This is actually the case with c2hs right now, if I'm not mistaken.
* Source packages for "real" developers, not just Haskell users
So for these source packages, do you envision a build system that registers and rebuilds packages depending on which version of which compiler you have?
More below...
Martin Sjögren
The whole idea with binary distribution is to compile things once and let others download and install the binary and be done with it. Compiling Haskell programs of moderate size on a not-very-recent computer takes quite some time, especially if you want to optimize too, and will mean a *very slow* installation procedure. That simply is unacceptable. Byte-compiling, like e.g. Python does is in my experience a lot faster.
I'm not married to the idea of recompiling everything for each user, but the convenience of having Debian and hmake or some build system figure things out for you seems better, since we mostly have lots of smaller libraries.

So, taking Sven's email into account, would you also prefer to see packages tied to a specific compiler release, with each package recompiled whenever there is a new release (assuming that this would be necessary; I'm not sure I'm right about that)?

But this does bring up something that I admit I hadn't really thought much about: in stable, the new compiler releases won't really be a problem, but the different compilers (ghc, nhc, hugs) would still be a problem. So am I hearing that others would prefer to have packages like hunit-ghc5.04.2, hunit-ghc4, hunit-hugs, hunit-nhc, etc.?

peace, isaac

Isaac Jones
The whole idea with binary distribution is to compile things once and let others download and install the binary and be done with it. Compiling Haskell programs of moderate size on a not-very-recent computer takes quite some time, especially if you want to optimize too, and will mean a *very slow* installation procedure. That simply is unacceptable. Byte-compiling, like e.g. Python does is in my experience a lot faster.
But this does bring up something that I admit I hadn't really thought much about: in stable, the new compiler releases won't really be a problem, but the different compilers (ghc, nhc, hugs) would still be a problem.
So am I hearing that others would prefer to have packages like: hunit-ghc5.04.2, hunit-ghc4, hunit-hugs, hunit-nhc, etc?
How long does it take to compile packages on various hardware? Does anyone have numbers? I'd prefer the emacs/python compile-upon-install solution, since that would cut down on the number of packages.

What about profiling for GHC? Each library would then require profiling versions too, right? That list of hunit packages above would get even longer.

As you suggested on IRC, what about having source packages that can compile themselves upon installation, and binary debs for the really big libs that would take a long time to build? I don't know how difficult that would be to do for a deb. That would solve the problem of binary debs being incompatible with each minor version change in GHC. Does NHC have the same compatibility?

I'll come up with some numbers for compiling libs on an Athlon 800MHz I have handy. Is that okay as an average machine?

-- Shae Matijs Erisson - 2 days older than RFC0226
#haskell on irc.freenode.net - We Put the Funk in Funktion
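The register-and-rebuild idea being discussed (compile upon install, once per installed compiler) could look something like the sketch below. Everything here is hypothetical: the library names, the compiler list, and the hmake invocation are placeholders for illustration, not an existing mechanism; the sketch only shows the bookkeeping such a hook would do.

```python
import shutil

# Hypothetical registry: libraries that asked to be rebuilt whenever a
# new or different Haskell compiler is installed (names are made up).
REGISTERED_LIBS = ["hunit", "haxml"]
KNOWN_COMPILERS = ["ghc", "nhc98", "hugs"]

def rebuild_plan(installed_compilers):
    """Return the (library, compiler) pairs a rebuild hook would process."""
    return [(lib, hc) for hc in installed_compilers for lib in REGISTERED_LIBS]

# Detect which compilers are actually on the PATH, then print what a
# compile-upon-install hook would run (the build command is a placeholder).
installed = [hc for hc in KNOWN_COMPILERS if shutil.which(hc)]
for lib, hc in rebuild_plan(installed):
    print(f"would run: hmake -{hc} {lib}")
```

The point of the plan step is that adding one new compiler triggers rebuilds of every registered library, rather than requiring every library maintainer to upload a new binary package in lockstep with the compiler release.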

Shae Matijs Erisson
That would solve the problem of binary debs being incompatible with each minor version change in GHC. Does NHC have the same compatibility?
Libs for nhc98 are, generally speaking, binary compatible across different minor version numbers, so the situation is rather easier than for ghc. You still need some version consistency, however, because the standard libraries available with the compiler (and their exact signatures) often change slightly over time.

Regards, Malcolm

G'day all. On Wed, Jan 29, 2003 at 08:47:15AM +0100, Sven Panne wrote:
In a nutshell: It doesn't work, even if only the patchlevel of GHC changes and the user-visible ABI stays the same. As has been discussed several times, this is the price one has to pay for the heavy inter-module optimizations GHC does. And recompilation is not always an option, either, e.g. when the package in question has some native parts which rely on development stuff (headers, program generators, etc.) which is normally not installed on the target.
Requiring the headers/program generators/whatever isn't a big deal, IMO. Either you could make the packages depend on the appropriate development packages, or you could package the dependencies separately.

As for the inter-module optimisation problem, as others have noted, this would not be an issue for stable packages, since the environment they depend on is stable. I have a suspicion that some compiler help may go some way towards solving the problem in the long run (e.g. if GHC told us what it actually used rather than what it theoretically depends on), but I don't want to go there yet. Or probably ever.

Cheers, Andrew Bromage
participants (6)
- Andrew J Bromage
- Isaac Jones
- Malcolm Wallace
- Martin Sjögren
- Shae Matijs Erisson
- Sven Panne