
On Tue, 23 Oct 2007, John D. Ramsdell wrote:
When I first carefully read the Cabal documentation, I remember wondering why there is a limit of one on the number of libraries in a package. Reflecting on autoconf, and its AC_CONFIG_SUBDIRS macro, I also wondered why packages cannot be components of a package. If you wrote a symbolic model checker, and you want it to be used with a specific version of a binary decision diagram package, the simplest way of enforcing this restriction is by including the package within yours. This is in fact how NuSMV is distributed. (It includes CUDD.)
Having sub-packages is also something I need, but your example is one where I think sub-packages are the wrong approach. Uploading the imported package (the binary decision diagram one) to hackage.haskell.org and depending on it (in the symbolic model checker) with a concrete version number is the better way.
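For instance (a hypothetical fragment; the package names are made up), the model checker's .cabal file could pin the BDD package to the exact release it was tested against:

```cabal
Name:          symbolic-model-checker
Version:       0.1
-- Depend on exactly the BDD release the checker was tested with,
-- instead of bundling its source inside this package.
Build-Depends: bdd == 1.2.3, base
```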
Recently, when I discovered there is no direct support for linking the executables in a package with the library it defines,
You can define executables in a package, and they are built automatically; however, Cabal will recompile the imported modules instead of reusing the compiled ones from the library part.
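To illustrate (a sketch in the flat .cabal style of the time; all names are made up): the executable lists a library module among its sources, and Cabal compiles that module a second time for the executable rather than linking against the library it just built.

```cabal
Name:            foo
Version:         0.1
Hs-Source-Dirs:  src
Exposed-Modules: Foo.Core

Executable:      foo-tool
Main-Is:         Main.hs
-- Foo.Core is recompiled from src for this executable,
-- not taken from the library part above.
Other-Modules:   Foo.Core
```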
I wondered why the advice was to separate the library and the executables into two packages even though both are meant to be used and distributed as a unit.
As far as I know the future plan is to split Cabal projects into library and executable parts and then to allow also multiple library parts. This way you could provide parts for special applications or different operating systems.
Does Cabal not support things like packages within a package simply because Haskell libraries currently are not complex enough to require such a feature, or is there a guiding design principle with which these features are incompatible?
It has been requested several times, but it seems it is not so easy to design and implement.

On Tue, 2007-10-23 at 22:21 +0200, Henning Thielemann wrote:
Does Cabal not support things like packages within a package simply because Haskell libraries currently are not complex enough to require such a feature, or is there a guiding design principle with which these features are incompatible?
It has been requested several times, but it seems it is not so easy to design and implement.
The design choice is that the Cabal package is the unit of distribution. Of course what one sees as a system may well consist of multiple interdependent packages.

The direction I think we're moving in is to try to improve our tools to make it easier for developers to work with systems that consist of multiple packages. However from the distribution and installation point of view, nothing needs to change; the package remains the unit of distribution.

As far as I can see that covers all the cases where we might want "distributions", "shipments" or "sub-packages". Of course if anyone has any examples where they think our model might not cover things we should bring them up and consider them.

The example I often think about is Gtk2Hs, which now consists of 12 libraries, uses two different code generators, two FFI binding tools, lots of cpp and autoconf, and has unified documentation, tutorials and demo code. Many of the features we have been adding to Cabal recently have been getting us closer to the stage where we can build and distribute Gtk2Hs using Cabal.

Duncan

On Wed, 24 Oct 2007, Duncan Coutts wrote:
On Tue, 2007-10-23 at 22:21 +0200, Henning Thielemann wrote:
Does Cabal not support things like packages within a package simply because Haskell libraries currently are not complex enough to require such a feature, or is there a guiding design principle with which these features are incompatible?
It has been requested several times, but it seems it is not so easy to design and implement.
The design choice is that the Cabal package is the unit of distribution. Of course what one sees as a system may well consist of multiple interdependent packages.
The direction I think we're moving in is to try to improve our tools to make it easier for developers to work with systems that consist of multiple packages. However from the distribution and installation point of view, nothing needs to change, the package remains the unit of distribution.
As far as I can see that covers all the cases where we might want "distributions", "shipments" or "sub-packages". Of course if anyone has any examples where they think our model might not cover things we should bring them up and consider them.
If it becomes much easier to handle multiple packages this might work.

- Recompiling multiple packages must be simplified, because I use Cabal in the development phase. If one "sub-package" changes, it must be simple to recompile the package set and to import the modified "sub-packages" without installing them. (Because installation might overwrite valid code with buggy code.) This includes finding the right order of package compilation according to the package dependencies. (I even have some code using FGL for this task, if someone is interested.)

- It must be simple to distribute and to download multiple packages that belong together.

It would be nice if one could maintain several packages which share the same 'src' directory in one darcs repository, which is of course duplicated on distribution. Say:

Foo.cabal
    Hs-Source-Dirs:  src
    Exposed-Modules: Data.Structure.Foo

Bar.cabal
    Hs-Source-Dirs:  src
    Exposed-Modules: Data.Structure.Bar

FooDemo.cabal
    Hs-Source-Dirs: demo
    Main-Is:        FooDemo.hs

BarDemo.cabal
    Hs-Source-Dirs: demo
    Main-Is:        BarDemo.hs

src/Data/Structure/Foo.hs
src/Data/Structure/Bar.hs
demo/FooDemo.hs
demo/BarDemo.hs
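The message above mentions FGL code for ordering package builds by their dependencies. As an illustration only, here is a minimal sketch of the same idea using Data.Graph from the containers library instead of FGL (the package names are made up):

```haskell
import Data.Graph (graphFromEdges, topSort)

-- Order packages so that every package is built after its dependencies.
-- Input: (package name, names of packages it depends on).
buildOrder :: [(String, [String])] -> [String]
buildOrder pkgs =
    reverse [ name | v <- topSort g, let (_, name, _) = fromVertex v ]
  where
    -- Edges point from a package to its dependencies, so topSort
    -- yields dependents first; reversing gives dependencies first.
    (g, fromVertex, _) = graphFromEdges [ ((), name, deps) | (name, deps) <- pkgs ]

main :: IO ()
main = mapM_ putStrLn (buildOrder
    [ ("foo-demo", ["foo"])  -- the demo depends on the library
    , ("bar",      ["foo"])
    , ("foo",      [])
    ])
```

Running this prints "foo" before "bar" and "foo-demo", i.e. the library is built before anything that depends on it.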

On Wed, 2007-10-24 at 14:34 +0200, Henning Thielemann wrote:
On Wed, 24 Oct 2007, Duncan Coutts wrote:
As far as I can see that covers all the cases where we might want "distributions", "shipments" or "sub-packages". Of course if anyone has any examples where they think our model might not cover things we should bring them up and consider them.
If it becomes much easier to handle multiple packages this might work.
- Recompiling multiple packages must be simplified, because I use Cabal in the development phase.
Yes. We want to be able to build and use collections of packages in place and to handle rebuilding automatically. Currently that has to be done manually by the developer, which is a pain. I'd like to see Cabal do proper dependency analysis within and between packages in a source tree, build a single module dependency graph, and rebuild minimally (and in parallel). If this sounds an awful lot like make, that's no coincidence.
- It must be simple to distribute and to download multiple packages that belong together.
That is what cabal-install is for. It's available from Hackage now, although somewhat as a preview. We'd like feedback on it, bug reports and patches. Then there are also the tools that convert Cabal packages into native system packages, so we can let the native package manager handle the dependencies.

For example, I would imagine distributing Gtk2Hs as several .tar.gz packages on Hackage. They would of course have interdependencies, most probably with very strict version ranges. One could then cabal-install the gtk package and have it pull in all of its dependent packages, or, when the distro package catches up, one could install that.

My main point here is that it's perfectly possible to distribute a system as a bunch of packages. For example, systems like Gnome or X.Org's X Window System are distributed as hundreds of individual .tar.gz packages with dependencies between them. This allows them to do incremental releases of bits of the system without having to do everything in one go.
It would be nice if one could maintain several packages which share the same 'src' directory in one darcs repository, which is of course duplicated on distribution.
Right: when you run "cabal sdist", it makes a tarball which contains all the files necessary to build that package. This may well duplicate shared files.

Duncan
participants (2)
- Duncan Coutts
- Henning Thielemann