
Hi, there's still a little work to do, but I think I'll be able to try out Cabal configurations some time next week. Do you guys have any ideas about what we could use as a good test case? The goal is to find out how well the current scheme applies to actual use cases (and whether there are serious performance problems to expect). / Thomas

Thomas Schilling wrote:
there's still a little work to do, but I think I'll be able to try out Cabal configurations some time next week. Do you guys have any ideas about what we could use as a good test case?
The goal is to find out how well the current scheme applies to actual use cases (and whether there are serious performance problems to expect).
Some suggestions:
- the base package has a lot of goop in its Setup script, I really hope that all, or at least most, of it can be done using configurations
- we have a few packages that want to do conditional dependencies. e.g. HGL wants to depend on either Win32 or X11 (a sketch follows below).
- IIRC, gtk2hs has a complex structure that will need a lot of conditional stuff in its .cabal file. Duncan will tell you more.
- take a look at the old discussion on libraries@haskell.org; there were lots of use cases discussed there.
Cheers, Simon
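
As a rough illustration of the HGL case, the either/or dependency might look something like the following under configurations. This is only a sketch: the os(windows) test, the else branch and the brace layout are assumptions modelled on the draft syntax shown later in this thread, the version number is made up, and the module list is abridged.
Name: HGL
Version: 1.0 -- hypothetical version number
library {
  build-depends: base
  exposed-modules: Graphics.HGL
  if os(windows) {
    build-depends: Win32
  } else {
    build-depends: X11
  }
}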

On Thu, 2007-06-14 at 11:18 +0100, Simon Marlow wrote:
Thomas Schilling wrote:
there's still a little work to do, but I think I'll be able to try out Cabal configurations some time next week. Do you guys have any ideas about what we could use as a good test case?
The goal is to find out how well the current scheme applies to actual use cases (and whether there are serious performance problems to expect).
Some suggestions:
- the base package has a lot of goop in its Setup script, I really hope that all, or at least most, of it can be done using configurations
- we have a few packages that want to do conditional dependencies. e.g. HGL wants to depend on either Win32 or X11.
- IIRC, gtk2hs has a complex structure that will need a lot of conditional stuff in its .cabal file. Duncan will tell you more.
Unfortunately there's still a lot of work before Gtk2Hs is ready to be cabalised. Cabal configurations is a major piece of the puzzle though. But sadly there are too many other bits before Gtk2Hs would be a suitable test case for configurations. A cut-down model of Gtk2Hs might work though, i.e. a bunch of .cabal files modelling the various bits of Gtk2Hs, just without any of the actual source code.
- take a look at the old discussion on libraries@haskell.org; there were lots of use cases discussed there.
Many related to fps/bytestring being included in the base package or not. Duncan
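
Regarding the cut-down Gtk2Hs model mentioned above, presumably each model package would be little more than a skeleton like the one below, with the real component and flag names filled in. The package names, the gnome flag and the module layout here are invented purely for illustration.
-- all names here are illustrative only
Name: gtk-model
flag gnome {
  Description: Build the Gnome-dependent parts
  Default: False
}
library {
  build-depends: base, glib-model
  exposed-modules: Graphics.UI.Gtk
  if flag(gnome) {
    build-depends: gconf-model
    exposed-modules: Graphics.UI.Gtk.Gnome
  }
}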

On 14 jun 2007, at 19.27, Duncan Coutts wrote:
On Thu, 2007-06-14 at 11:18 +0100, Simon Marlow wrote:
Thomas Schilling wrote:
there's still a little work to do, but I think I'll be able to try out Cabal configurations some time next week. Do you guys have any ideas about what we could use as a good test case?
The goal is to find out how well the current scheme applies to actual use cases (and whether there are serious performance problems to expect).
Some suggestions:
- the base package has a lot of goop in its Setup script, I really hope that all, or at least most, of it can be done using configurations
- we have a few packages that want to do conditional dependencies. e.g. HGL wants to depend on either Win32 or X11.
- IIRC, gtk2hs has a complex structure that will need a lot of conditional stuff in its .cabal file. Duncan will tell you more.
Unfortunately there's still a lot of work before Gtk2Hs is ready to be cabalised. Cabal configurations is a major piece of the puzzle though. But sadly there are too many other bits before Gtk2Hs would be a suitable test case for configurations.
A cut-down model of Gtk2Hs might work though, i.e. a bunch of .cabal files modelling the various bits of Gtk2Hs, just without any of the actual source code.
- take a look at the old discussion on libraries@haskell.org; there were lots of use cases discussed there.
Many related to fps/bytestring being included in the base package or not.
Is there some centralized documentation of these changes? Or should I just browse through the mailing lists? E.g., it looks like GHC doesn't have a .cabal file yet, so I'd have to manually translate to the Makefile, which I presume would be a major undertaking. The same applies to gtk2hs, whose ugliness I had to experience earlier. / Thomas

On Wed, Jun 13, 2007 at 12:48:17AM +0200, Thomas Schilling wrote:
there's still a little work to do, but I think I'll be able to try out Cabal configurations some time next week. Do you guys have any ideas about what we could use as a good test case?
The goal is to find out how well the current scheme applies to actual use cases (and whether there are serious performance problems to expect).
As Simon mentioned, there's HGL depending on either unix or Win32.
The fps package was incorporated into base-2.0, so packages like binary, bzlib, zlib, darcs-graph or hmp3 could depend on base >= 2.0 or (base < 2.0 and fps).
The html package was split off from base-2.0. For this reason HAppS has two variants of its Cabal file, which could be combined under configurations. Similarly lambdaFeed could depend on base < 2.0 or (base >= 2.0 and html).
The above are all in HackageDB. The HEAD has a few more examples:
Several packages are split off from base (but its version number hasn't been incremented yet, so you can't use that).
The process package has a Setup.hs that exists only to drop the System.Process module for implementations other than GHC.
The time package has a Setup.hs that adds a dependency on Win32 if the platform is Windows.
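
To make the base/fps case concrete, a package like binary could presumably express the choice with a flag that the configuration machinery (or the user) resolves. This is only a sketch: the old-base flag name, the else branch and the exact syntax are assumptions modelled on the draft shown later in this thread, and the module list is abridged.
flag old-base {
  Description: Build against base < 2.0 plus the separate fps package
  Default: False
}
library {
  exposed-modules: Data.Binary
  if flag(old-base) {
    build-depends: base < 2.0, fps
  } else {
    build-depends: base >= 2.0
  }
}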

On Jun 15, 2007, at 1:26 , Ross Paterson wrote:
On Wed, Jun 13, 2007 at 12:48:17AM +0200, Thomas Schilling wrote:
there's still a little work to do, but I think I'll be able to try out Cabal configurations some time next week. Do you guys have any ideas about what we could use as a good test case?
The goal is to find out how well the current scheme applies to actual use cases (and whether there are serious performance problems to expect).
As Simon mentioned, there's HGL depending on either unix or Win32.
The fps package was incorporated into base-2.0, so packages like binary, bzlib, zlib, darcs-graph or hmp3 could depend on base >= 2.0 or (base < 2.0 and fps).
The html package was split off from base-2.0. For this reason HAppS has two variants of its Cabal file, which could be combined under configurations. Similarly lambdaFeed could depend on base < 2.0 or (base >= 2.0 and html).
The above are all in HackageDB. The HEAD has a few more examples:
Several packages are split off from base (but its version number hasn't been incremented yet, so you can't use that).
The process package has a Setup.hs that exists only to drop the System.Process module for implementations other than GHC.
The time package has a Setup.hs that adds a dependency on Win32 if the platform is Windows.
The unix-compat package depends on the unix package when not compiled on Windows, using Setup.lhs and CPP hacks. More packages with the usual base/fps thing: cgi, fastcgi, tar, htar, hope. /Björn
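
The unix-compat and time cases look like they would reduce to a single conditional once a platform test is available, along the lines of the sketch below for unix-compat. The os(windows) test and the ! negation are assumptions about the eventual syntax, and the module list is abridged.
library {
  build-depends: base
  exposed-modules: System.PosixCompat.Files
  if !os(windows) {
    build-depends: unix
  }
}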

On Fri, Jun 15, 2007 at 12:26:36AM +0100, Ross Paterson wrote:
The process package has a Setup.hs that exists only to drop the System.Process module for implementations other than GHC.
That's not something that we want to be able to fix with configurations, though. We need to either implement all the functionality for all the implementations or to split the package up. Thanks Ian

Hi Ian,
That's not something that we want to be able to fix with configurations, though. We need to either implement all the functionality for all the implementations or to split the package up.
I disagree. I have a library which provides two modules, one for compilers without multi-parameter type classes, and both for those with them. I want this to be one library, but I want the modules exported to vary based on the compiler being used. Since the library is small, splitting it up would be a big pain. Thanks Neil

On Fri, Jun 15, 2007 at 12:56:41PM +0100, Neil Mitchell wrote:
That's not something that we want to be able to fix with configurations, though. We need to either implement all the functionality for all the implementations or to split the package up.
I disagree. I have a library which provides two modules, one for compilers without multi-parameter type classes, and both for those with them. I want this to be one library, but I want the modules exported to vary based on the compiler being used. Since the library is small, splitting it up would be a big pain.
I thought we'd all agreed that a library package should always export the same modules (and classes, functions, type signatures, etc.), so that if foo depends on bar and you have bar installed then you know that you can build foo. In fact, I thought you were one of the people arguing in favour of this for the base package! Thanks Ian

Hi
I thought we'd all agreed that a library package should always export the same modules (and classes, functions, type signatures, etc.), so that if foo depends on bar and you have bar installed then you know that you can build foo.
In fact, I thought you were one of the people arguing in favour of this for the base package!
I was, for the base package. I want other people to obey these rules, but occasionally I want to violate them :)
I'm not sure if in general it should be possible to change the export list. Perhaps we can rely on a large number of evil stares to stop this being common practice, and yet permit it occasionally.
One thing I would like: given a package _data, which provides a data type, and a package _class, which provides a class and various instances, I'd like to write in the _class cabal file:
#if has _data
module Class.InstanceForData
#endif
Perhaps configurations can support that? Thanks Neil

On 15 jun 2007, at 14.09, Neil Mitchell wrote:
Hi
I thought we'd all agreed that a library package should always export the same modules (and classes, functions, type signatures, etc.), so that if foo depends on bar and you have bar installed then you know that you can build foo.
In fact, I thought you were one of the people arguing in favour of this for the base package!
I was, for the base package. I want other people to obey these rules, but occasionally I want to violate them :)
I'm not sure if in general it should be possible to change the export list. Perhaps we can rely on a large number of evil stares to stop this being common practice, and yet permit it occasionally.
One thing I would like is given a package _data, which provides a data type, and a package _class which provides a class and various instances, I'd like to write in the _class cabal file:
#if has _data
module Class.InstanceForData
#endif
Perhaps configurations can support that?
You mean like this?
Name: demo
Cabal-version: >= 1.3
Description: This is a test file with a description longer than two lines.
flag Debug {
  Description: Enable debug information
  Default: False
}
library {
  build-depends: blub
  exposed-modules: Demo.Main, Demo
  if flag(debug) {
    build-depends: hunit
    ghc-options: -DDEBUG
    exposed-modules: Demo.Internal
  }
}
executable foo-bar {
  Main-is: Foo.hs
}
Of course in this case we'd need to add some flags to the version number, to indicate that the installed version has a certain feature enabled. This could be solved with a different package name of course, but I think it should be easy to add tags to a package version number:
if flag(debug) {
  build-depends: hunit
  ghc-options: -DDEBUG
  tag: debug
  exposed-modules: Demo.Internal
}
Testing for the tag is (almost) already supported:
build-depends: demo >= 1.1-debug,
It shouldn't be used often, but it could if necessary.
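
Presumably a user (or a tool such as cabal-install) would then choose non-default flags at configure time, with something along the lines of
  runhaskell Setup.hs configure --flags="debug"
though the exact command-line interface for setting flags is an assumption here rather than anything that has been agreed.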

Hi
Yes, that's much better than what I was thinking! It also allows me to write my library and provide (or not) the flag when using GHC - so I can explicitly choose whether to require the extra features or not.
Thanks Neil

On Fri, Jun 15, 2007 at 03:06:07PM +0200, Thomas Schilling wrote:
On 15 jun 2007, at 14.09, Neil Mitchell wrote:
this being common practice, and yet permit it occasionally.
One thing I would like is given a package _data, which provides a data type, and a package _class which provides a class and various instances, I'd like to write in the _class cabal file:
#if has _data
module Class.InstanceForData
#endif
Perhaps configurations can support that?
This has the same problem, in that different versions of the same package provide different things. Also, if you install _class first and _data second then you don't get Class.InstanceForData, whereas installing them the other way round you do.
Of course in this case we'd need to add some flags to the version number, to indicate that the installed version has a certain feature enabled.
This could be solved with a different package name of course, but I think it should be easy to add tags to a package version number:
if flag(debug) {
  build-depends: hunit
  ghc-options: -DDEBUG
  tag: debug
  exposed-modules: Demo.Internal
}
One thing I have considered proposing is rather than just
  exposed-modules: ...
  other-modules: ...
we could have (I'm not particularly advocating this syntax):
  modules: ...
  modules[testing]: ...
  modules[internal]: ...
(you can put any string inside [...]), and you can then, in another package, say
  depends: foo[testing,internal]
Testing for the tag is (almost) already supported:
build-depends: demo >= 1.1-debug,
IIRC tags are designed to be ignored when comparing version numbers, so this also has the same problem. Thanks Ian

Ian Lynagh wrote:
On Fri, Jun 15, 2007 at 03:06:07PM +0200, Thomas Schilling wrote:
On 15 jun 2007, at 14.09, Neil Mitchell wrote:
this being common practice, and yet permit it occasionally.
One thing I would like is given a package _data, which provides a data type, and a package _class which provides a class and various instances, I'd like to write in the _class cabal file:
#if has _data
module Class.InstanceForData
#endif
Perhaps configurations can support that?
This has the same problem, in that different versions of the same package provide different things.
Also, if you install _class first and _data second then you don't get Class.InstanceForData, whereas installing them the other way round you do.
Of course in this case we'd need to add some flags to the version number, to indicate that the installed version has a certain feature enabled.
This could be solved with a different package name of course, but I think it should be easy to add tags to a package version number:
if flag(debug) {
  build-depends: hunit
  ghc-options: -DDEBUG
  tag: debug
  exposed-modules: Demo.Internal
}
One thing I have considered proposing is rather than just
  exposed-modules: ...
  other-modules: ...
we could have (I'm not particularly advocating this syntax):
  modules: ...
  modules[testing]: ...
  modules[internal]: ...
A simpler way to solve this problem is to have two packages, with the first package (foo-internal) exporting all the modules, and the second (foo) re-exposing just the non-internal modules. We don't currently have support for re-exposing, but it has lots of uses and it shouldn't be too hard to add (to GHC, at least). Perhaps it's not quite as nice, though: the foo-internal package shows up in your ghc-pkg list, and you need two separate Cabal packages (although good support for working with multiple packages is something we should have too). Cheers, Simon
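
A sketch of the two-package arrangement, using a hypothetical reexposed-modules field (no such field exists; the field name and package layout here are made up purely for illustration):
-- foo-internal.cabal: exports everything, internal modules included
Name: foo-internal
library {
  build-depends: base
  exposed-modules: Foo, Foo.Internal
}
-- foo.cabal: depends on foo-internal and re-exposes only the public module
Name: foo
library {
  build-depends: foo-internal
  reexposed-modules: Foo   -- hypothetical field
}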

On Mon, Jun 18, 2007 at 09:16:10AM +0100, Simon Marlow wrote:
modules: ...
modules[testing]: ...
modules[internal]: ...
A simpler way to solve this problem is to have two packages, with the first package (foo-internal) exporting all the modules, and the second (foo) re-exposing just the non-internal modules. We don't currently have support for re-exposing, but it has lots of uses and it shouldn't be too hard to add (to GHC, at least).
Perhaps it's not quite as nice, though: the foo-internal package shows up in your ghc-pkg list, and you need two separate Cabal packages (although good support for working with multiple packages is something we should have too).
It also means that none of your internal modules can depend on any of your normal modules. (I'm not really convinced it's simpler either) Thanks Ian

Ian Lynagh wrote:
On Mon, Jun 18, 2007 at 09:16:10AM +0100, Simon Marlow wrote:
modules: ...
modules[testing]: ...
modules[internal]: ...
A simpler way to solve this problem is to have two packages, with the first package (foo-internal) exporting all the modules, and the second (foo) re-exposing just the non-internal modules. We don't currently have support for re-exposing, but it has lots of uses and it shouldn't be too hard to add (to GHC, at least).
Perhaps it's not quite as nice, though: the foo-internal package shows up in your ghc-pkg list, and you need two separate Cabal packages (although good support for working with multiple packages is something we should have too).
It also means that none of your internal modules can depend on any of your normal modules.
Sure they can - the foo-internal package would contain all the modules (internal + external), but the foo package would only re-expose some of them.
(I'm not really convinced it's simpler either)
Simpler in the sense of not adding new syntax and functionality to Cabal (well ok, it does add some functionality, because Cabal needs to know which modules are being re-exposed). Cheers, Simon

On Mon, Jun 18, 2007 at 01:51:06PM +0100, Simon Marlow wrote:
It also means that none of your internal modules can depend on any of your normal modules.
Sure they can - the foo-internal package would contain all the modules (internal + external), but the foo package would only re-expose some of them.
Sorry, yes, my brain broke. Thanks Ian
participants (7)
- Björn Bringert
- Duncan Coutts
- Ian Lynagh
- Neil Mitchell
- Ross Paterson
- Simon Marlow
- Thomas Schilling