
When I first carefully read the Cabal documentation, I remember wondering why there is a limit of one on the number of libraries in a package. Reflecting on autoconf, and its AC_CONFIG_SUBDIRS macro, I also wondered why packages cannot be components of a package. If you wrote a symbolic model checker, and you want it to be used with a specific version of a binary decision diagram package, the simplest way of enforcing this restriction is by including the package within yours. This is in fact how NuSMV is distributed. (It includes CUDD.)

Recently, when I discovered there is no direct support for linking the executables in a package with the library it defines, I wondered why the advice was to separate the library and the executables into two packages even though both are meant to be used and distributed as a unit.

Does Cabal not support things like packages within a package simply because Haskell libraries currently are not complex enough to require such a feature, or is there a guiding design principle with which these features are incompatible?

John

On Tue, 23 Oct 2007, John D. Ramsdell wrote:
When I first carefully read the Cabal documentation, I remember wondering why there is a limit of one on the number of libraries in a package. Reflecting on autoconf, and its AC_CONFIG_SUBDIRS macro, I also wondered why packages cannot be components of a package. If you wrote a symbolic model checker, and you want it to be used with a specific version of a binary decision diagram package, the simplest way of enforcing this restriction is by including the package within yours. This is in fact how NuSMV is distributed. (It includes CUDD.)
Having sub-packages is also something I need, but your example is one where I think that sub-packages are the wrong approach. Uploading the imported package (the binary decision diagram one) to hackage.haskell.org and depending on it (in the symbolic model checker) with a concrete version number is the better way.
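What Henning suggests, pinning an exact version in the dependent package's description, might look like the following hypothetical .cabal fragment. The package names and the version number are illustrative only; there is no actual "cudd" package implied by the thread:

```
Name:          symbolic-model-checker
Version:       1.0
Build-Depends: base, cudd == 2.4.1
```

With an exact constraint like this, cabal-install (or a native package manager) would refuse to build against any other version of the dependency, which gives much the same guarantee as bundling it.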
Recently, when I discovered there is no direct support for linking the executables in a package with the library it defines,
You can define executables in a package which are automatically built; however, Cabal will recompile the imported modules instead of using the compiled ones from the library part.
I wondered why the advice was to separate the library and the executables into two packages even though both are meant to be used and distributed as a unit.
As far as I know the future plan is to split Cabal projects into library and executable parts and then also to allow multiple library parts. This way you could provide parts for special applications or different operating systems.
Does Cabal not support things like packages within a package simply because Haskell libraries currently are not complex enough to require such a feature, or is there a guiding design principle with which these features are incompatible?
It was requested several times, but it seems it is not so easy to design and implement.

On Tue, 2007-10-23 at 22:21 +0200, Henning Thielemann wrote:
Does Cabal not support things like packages within a package simply because Haskell libraries currently are not complex enough to require such a feature, or is there a guiding design principle with which these features are incompatible?
It was requested several times, but it seems it is not so easy to design and implement.
The design choice is that the Cabal package is the unit of distribution. Of course what one sees as a system may well consist of multiple interdependent packages.

The direction I think we're moving in is to try to improve our tools to make it easier for developers to work with systems that consist of multiple packages. However from the distribution and installation point of view, nothing needs to change, the package remains the unit of distribution.

As far as I can see that covers all the cases where we might want "distributions", "shipments" or "sub-packages". Of course if anyone has any examples where they think our model might not cover things we should bring them up and consider them.

The example I often think about is Gtk2Hs which now consists of 12 libraries, uses two different code generators, two FFI binding tools, lots of cpp and autoconf and has unified documentation, tutorials and demo code. Many of the features we have been adding to Cabal recently have been getting us closer to the stage where we can build and distribute Gtk2Hs using Cabal.

Duncan

On Wed, 24 Oct 2007, Duncan Coutts wrote:
On Tue, 2007-10-23 at 22:21 +0200, Henning Thielemann wrote:
Does Cabal not support things like packages within a package simply because Haskell libraries currently are not complex enough to require such a feature, or is there a guiding design principle with which these features are incompatible?
It was requested several times, but it seems it is not so easy to design and implement.
The design choice is that the Cabal package is the unit of distribution. Of course what one sees as a system may well consist of multiple interdependent packages.
The direction I think we're moving in is to try to improve our tools to make it easier for developers to work with systems that consist of multiple packages. However from the distribution and installation point of view, nothing needs to change, the package remains the unit of distribution.
As far as I can see that covers all the cases where we might want "distributions", "shipments" or "sub-packages". Of course if anyone has any examples where they think our model might not cover things we should bring them up and consider them.
If it becomes much easier to handle multiple packages this might work.

- Recompiling multiple packages must be simplified, because I use Cabal in the development phase. If one "sub-package" changes it must be simple to recompile the package set and to import modified "sub-packages" without installing them. (Installation might overwrite valid code with buggy code.) This includes finding the right order of package compilation according to the package dependencies. (I even have some code using FGL for this task, if someone is interested.)

- It must be simple to distribute and to download multiple packages that belong together.

It would be nice if one could maintain several packages which share the same 'src' directory in one darcs repository, which is of course duplicated on distribution. Say:

  Foo.cabal
    Hs-Source-Dirs: src
    Exposed-Modules: Data.Structure.Foo

  Bar.cabal
    Hs-Source-Dirs: src
    Exposed-Modules: Data.Structure.Bar

  FooDemo.cabal
    Hs-Source-Dirs: demo
    Main-Is: FooDemo.hs

  BarDemo.cabal
    Hs-Source-Dirs: demo
    Main-Is: BarDemo.hs

  src/Data/Structure/Foo.hs
  src/Data/Structure/Bar.hs
  demo/FooDemo.hs
  demo/BarDemo.hs
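Henning mentions having FGL code for computing the right compilation order; the same idea can be sketched with the standard containers library instead, using the hypothetical package names from his layout above (this is an illustration of the approach, not his actual code):

```haskell
import Data.Graph (graphFromEdges, topSort)

-- Hypothetical package set: (name, key, [dependency keys]).
packages :: [(String, String, [String])]
packages =
  [ ("FooDemo", "FooDemo", ["Foo"])
  , ("BarDemo", "BarDemo", ["Bar"])
  , ("Foo",     "Foo",     [])
  , ("Bar",     "Bar",     [])
  ]

-- A valid build order: every package appears after the
-- packages it depends on.  topSort lists dependents before
-- their dependencies (edges point at dependencies), so the
-- result is reversed.
buildOrder :: [String]
buildOrder = reverse [ name | v <- topSort g
                            , let (name, _, _) = fromVertex v ]
  where
    (g, fromVertex, _) = graphFromEdges packages
```

Here Foo and Bar would come before FooDemo and BarDemo respectively, which is exactly the ordering a multi-package build driver would need.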

On Wed, 2007-10-24 at 14:34 +0200, Henning Thielemann wrote:
On Wed, 24 Oct 2007, Duncan Coutts wrote:
As far as I can see that covers all the cases where we might want "distributions", "shipments" or "sub-packages". Of course if anyone has any examples where they think our model might not cover things we should bring them up and consider them.
If it becomes much easier to handle multiple packages this might work. - Recompiling multiple packages must be simplified, because I use Cabal in the development phase.
Yes. We want to be able to build and use collections of packages inplace and to handle rebuilding automatically. Currently that has to be done manually by the developer which is a pain. I'd like to see Cabal do proper dependency analysis within and between packages in a source tree and build a single module dep graph and rebuild minimally (and in parallel). If this sounds an awful lot like make then that's no coincidence.
- It must be simple to distribute and to download multiple packages that belong together.
That is what cabal-install is for. It's available from hackage now, although somewhat as a preview. We'd like feedback on it, bug reports and patches. Then there are also the tools that convert Cabal packages into native system packages, and then we can let the native package manager handle the dependencies.

For example, I would imagine distributing Gtk2Hs as several .tar.gz packages on hackage. They would of course have interdependencies, most probably with very strict version ranges. One could then cabal-install the gtk package and have it pull in all of its dependent packages, or when the distro package catches up then one could install that.

My main point here is that it's perfectly possible to distribute a system as a bunch of packages. For example, systems like Gnome or x.org's X Window System are distributed as hundreds of individual .tar.gz packages with dependencies between them. This allows them to do incremental releases of bits of the system without having to do everything in one go.
It would be nice if one could maintain several packages which share the same 'src' directory in one darcs repository, which is of course duplicated on distribution.
Right, when you do a cabal sdist it makes a tarball which contains all the files necessary to build that package. This may well duplicate shared files.

Duncan

On Wed, 2007-10-24 at 09:03 -0400, John D. Ramsdell wrote:
Duncan Coutts writes:
The design choice is that the Cabal package is the unit of distribution. Of course what one sees as a system may well consist of multiple interdependent packages.
I think developers want support for developing and distributing systems as a unit.
I think I'm claiming that developers want support for developing systems as a unit but distribution can be as a collection of components rather than as a unit. Then using a package manager or cabal-install a user can pull in the components they need, including pulling in their dependencies automatically.

Duncan

Duncan Coutts writes:
I think I'm claiming that developers want support for developing systems as a unit but distribution can be as a collection of components rather than as a unit.
Do you believe it is a bad idea to expect that Cabal be the appropriate tool for developing systems as a unit? Maybe Cabal should do just one thing, and one thing well: package components of systems into the smallest units available for distribution.

I had been thinking of Cabal as a development tool as well as a packaging tool, similar to autoconf/automake, but perhaps Cabal is really meant to be similar to rpmbuild and dpkg-buildpackage. Maybe thinking of it as a development tool was an error on my part.

For my application, I suppose I could use autoconf/automake to develop the software, install the executables, build a Cabal package description, and then use it with Cabal to make the library available as a package. Is this usage more in line with your intentions?

John

On Wed, 2007-10-24 at 13:08 -0400, John D. Ramsdell wrote:
Duncan Coutts writes:
I think I'm claiming that developers want support for developing systems as a unit but distribution can be as a collection of components rather than as a unit.
Do you believe it is a bad idea to expect that Cabal be the appropriate tool for developing systems as a unit?
No.
Maybe Cabal should do just one thing, and one thing well: package components of systems into the smallest units available for distribution.
I had been thinking of Cabal as a development tool as well as a packaging tool, similar to autoconf/automake, but perhaps Cabal is really meant to be similar to rpmbuild and dpkg-buildpackage. Maybe thinking of it as a development tool was an error on my part.
Currently it only helps with a single package at a time, so dealing with multi-package systems is cumbersome. The intention is to extend it to help developers working with multi-package systems. My claim is that all that requires is improvements in the tools, not a change in the notion of a package as the unit of distribution and dependency.
For my application, I suppose I could use autoconf/automake to develop the software, install the executables, build a Cabal package description, and then use it with Cabal to make the library available as a package. Is this usage more in line with your intentions?
We'd like to kill off autoconf as much as possible. :-) So we hope that developers will work directly with Cabal rather than generating Cabal packages from other systems.

As I said before, my personal example is Gtk2Hs which is a relatively complex multi-package system. Currently it is impossible to package it with Cabal but we're getting closer. Gtk2Hs currently uses autoconf/automake and I can build everything with:

  ./configure
  make

and I can rebuild everything with just make. And if there's nothing to do it takes a fraction of a second to work this out. I'd like to get to that stage with cabal. I'd like to say:

  cabal configure
  cabal build

Or maybe just cabal build to get default configure parameters. And if I run cabal sdist I'd expect it to generate twelve .tar.gz packages that I can upload to hackage.

With that kind of infrastructure there's lots of fun stuff we could do, like continuous build and test:

  cabal build --continuous &

To get there of course we'll need lots of help from lots of people. I'd like to encourage people to get involved.

Duncan

Henning Thielemann writes:
Having sub-packages is also something I need, but your example is one where I think that sub-packages are the wrong approach. Uploading the imported package (the binary decision diagram one) to hackage.haskell.org and depending on it (in the symbolic model checker) with a concrete version number is the better way.
The developer in my hypothetical symbolic model checker example decided to include a specific version of a binary decision diagram package within his system, just as the developers of NuSMV decided to incorporate CUDD into their distribution. You cannot second-guess his/her decision. You should assume the developer knows about hackage.haskell.org, and chose not to use it, say for legal reasons.

I cannot help but observe that I have received several answers on this list that show a distrust of the wishes of Cabal users. Instead of addressing the needs given by the question, some of the answers are of the form: you shouldn't be doing that, followed by a tortured method for cramming the problem into the existing set of abstractions. The point of the questions is to get people to reassess the set of abstractions.
Recently, when I discovered there is no direct support for linking the executables in a package with the library it defines,
You can define executables in a package which are automatically built; however, Cabal will recompile the imported modules instead of using the compiled ones from the library part.
In a previous note, I showed how to get around this deficiency in Cabal using hook functions. With more use, I have since refined the code I sent to the list. If anyone wants the improved version, I'll send it to you. By the way, I think it is a tribute to the Cabal design, and the hook design in particular, that it was so easy for me to fix the problem using public interfaces, and just a small amount of code and effort.
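The shape of the hook approach John describes might look like the following Setup.hs sketch. This is a guess at the general mechanism (Cabal's Distribution.Simple user hooks), not his actual code, and the custom step is only indicated by a comment:

```haskell
-- Setup.hs: a minimal sketch of overriding Cabal's build step.
-- Not John's actual code; only an illustration of the hook
-- mechanism he refers to.
import Distribution.Simple

main :: IO ()
main = defaultMainWithHooks simpleUserHooks
  { buildHook = \pkg lbi hooks flags -> do
      -- A custom step could run here, e.g. making the freshly
      -- built library visible in-place so that the package's
      -- executables link against it instead of recompiling
      -- its modules from source.
      buildHook simpleUserHooks pkg lbi hooks flags
  }
```

The corresponding package description would declare Build-Type: Custom so that Cabal uses this Setup.hs instead of the default simple build.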
I wondered why the advice was to separate the library and the executables into two packages even though both are meant to be used and distributed as a unit.
As far as I know the future plan is to split Cabal projects into library and executable parts and then also to allow multiple library parts. This way you could provide parts for special applications or different operating systems.
Please do not drop support for packaging a library with executables built on top of the library. The hook function hack noted above allows me to make it so that Cabal serves my needs, and one package can be used to distribute my system as a single unit.
Does Cabal not support things like packages within a package simply because Haskell libraries currently are not complex enough to require such a feature, or is there a guiding design principle with which these features are incompatible?
It was requested several times, but it seems it is not so easy to design and implement.
I bet it's easier than you think. You just have to dynamically generate package configuration files, just as I did to resolve my linking problem.

John

On Wed, 24 Oct 2007, John D. Ramsdell wrote:
Henning Thielemann writes:
It was requested several times, but it seems it is not so easy to design and implement.
I bet it's easier than you think. You just have to dynamically generate package configuration files, just as I did to resolve my linking problem.
I think that this solution was already discussed. It has the drawback that I can no longer be sure what's actually in an installed package. If I have the package dependency foobar=1.0 in my Cabal file then I must be sure that foobar-1.0 on every user machine will contain all the modules in the same state as I have installed on my machine. If Cabal packages are mapped 1:1 to Debian or RPM packages then it would be really cumbersome to have to split up a project into one Cabal package per executable. Otherwise it's ok for me to handle executables in separate units ("sub-packages" or just "packages").

Henning Thielemann writes:
On Wed, 24 Oct 2007, John D. Ramsdell wrote:
Henning Thielemann writes:
It was requested several times, but it seems it is not so easy to design and implement.
I bet it's easier than you think. You just have to dynamically generate package configuration files, just as I did to resolve my linking problem.
I think that this solution was already discussed. It has the drawback that I can no longer be sure what's actually in an installed package. ...
For the NuSMV-motivated case, subpackages are never installed. You use them only to build the libraries and executables that are exported by the top-level package, and thus one is always sure what is installed by a package, and that subpackages cannot cause installation conflicts.

Of course your point was aimed at more general cases, ones that aren't solved by the simple suggestion I made. I'll try to come up with a suggestion that handles more cases.

John
participants (4):
- Ashley Yakeley
- Duncan Coutts
- Henning Thielemann
- ramsdell@mitre.org