Re: [Haskell] Re: Trying to install binary-0.4

Duncan and I have started a wiki page to collect proposals for ways to avoid or alleviate the pain from future package reorganisations: http://hackage.haskell.org/trac/ghc/wiki/PackageCompatibility

It's been helpful for me to write all this down; the issues seem much clearer. However, I don't see an obviously best solution. For me proposal 4.2 (see the wiki page) looks the most promising, but it doesn't provide complete backwards compatibility, so I imagine there will be people who disagree.

Please read the page. Fix problems, add rationale, add proposals if you have any that are substantially different from those already there. For general discussion just reply to this message (replies directed to libraries@haskell.org).

Cheers, Simon

Simon Marlow
http://hackage.haskell.org/trac/ghc/wiki/PackageCompatibility
It's been helpful for me to write all this down, the issues seem much clearer. However, I don't see an obviously best solution. For me proposal 4.2 (see the wiki page) looks the most promising, but it doesn't provide complete backwards compatibility, so I imagine there will be people who disagree.
The section "The problem of lax version dependencies" refers to "Solution 3", which doesn't seem to exist. Presumably "Solution 2" is meant? Wouldn't keeping "base-2.0" and rebranding "base-3.0" to "foundation-1.0" (which might be listed as "Solution 2.1") solve this issue? -k -- If I haven't seen further, it is by standing in the footprints of giants

On Thu, Oct 25, 2007 at 12:31:05PM +0200, Ketil Malde wrote:
Wouldn't keeping "base-2.0" and rebranding "base-3.0" to "foundation-1.0" (which might be listed as "Solution 2.1") solve this issue?
We don't want to have to invent a new name for base every release. Thanks Ian

Ian Lynagh
On Thu, Oct 25, 2007 at 12:31:05PM +0200, Ketil Malde wrote:
Wouldn't keeping "base-2.0" and rebranding "base-3.0" to "foundation-1.0" (which might be listed as "Solution 2.1") solve this issue?
We don't want to have to invent a new name for base every release.
But you do want to keep reshuffling the contents of base every release? I'm not sure it is stated plainly anywhere, but *I* think one goal of having a system of packages and versions is that it should be possible to compile old code on a new system. As soon as major (i.e. API) version numbers are mandatory for the dependencies in cabal files, bumping the version will be sufficient. The current situation is unfortunately that most libraries just specify 'base' and hope that will just continue to work. -k -- If I haven't seen further, it is by standing in the footprints of giants
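Ketil's contrast can be sketched as a .cabal fragment (the package name "mylib" is hypothetical; the version bounds are illustrative):

```
name:          mylib
version:       0.1
-- Lax dependency: accepts any base, and silently breaks mid-build
-- when a new major release reshuffles base's contents:
--   build-depends: base
-- Precise dependency: pins the major (API) version, so a reorganised
-- base is rejected cleanly at dependency-resolution time instead:
build-depends: base >= 2.0 && < 3.0
```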

On Thu, Oct 25, 2007 at 03:33:31PM +0200, Ketil Malde wrote:
Ian Lynagh
writes: On Thu, Oct 25, 2007 at 12:31:05PM +0200, Ketil Malde wrote:
Wouldn't keeping "base-2.0" and rebranding "base-3.0" to "foundation-1.0" (which might be listed as "Solution 2.1") solve this issue?
We don't want to have to invent a new name for base every release.
But you do want to keep reshuffling the contents of base every release?
Certainly the next one; I'd be surprised if not the one after. Thanks Ian

Ketil Malde wrote:
Simon Marlow
writes: http://hackage.haskell.org/trac/ghc/wiki/PackageCompatibility
It's been helpful for me to write all this down, the issues seem much clearer. However, I don't see an obviously best solution. For me proposal 4.2 (see the wiki page) looks the most promising, but it doesn't provide complete backwards compatibility, so I imagine there will be people who disagree.
The section "The problem of lax version dependencies" refers to "Solution 3", which doesn't seem to exists. Presumably "Solution 2" is meant?
Wouldn't keeping "base-2.0" and rebranding "base-3.0" to "foundation-1.0" (which might be listed as "Solution 2.1") solve this issue?
This is actually what proposal 4.2 is about. But note that in order to keep base-2.0 around you need to either compile up a complete copy of it and all the packages that depend on it (proposal 2, a non-starter IMO), or allow package re-exports (proposal 4). Maintaining an exact replica of the base-2.0 API is impractical (proposal 4.1), so what we'd actually have is base-3.0 (proposal 4.2). Cheers, Simon

Simon Marlow
Wouldn't keeping "base-2.0" and rebranding "base-3.0" to "foundation-1.0" (which might be listed as "Solution 2.1") solve this issue?
This is actually what proposal 4.2 is about.
Oh, I didn't catch that - I guess I didn't (and don't) see the connection to 4 (re-exporting of modules from other packages), nor does 4.2 seem to say anything about naming base differently. Looks like an orthogonal issue to me. Am I missing something?
But note that in order to keep base-2.0 around you need to either compile up a complete copy of it and all the packages that depend on it (proposal 2, a non-starter IMO),
Perhaps the document could elaborate why it is a non-starter? -k -- If I haven't seen further, it is by standing in the footprints of giants

Ketil Malde wrote:
Simon Marlow
writes: Wouldn't keeping "base-2.0" and rebranding "base-3.0" to "foundation-1.0" (which might be listed as "Solution 2.1") solve this issue?
This is actually what proposal 4.2 is about.
Oh, I didn't catch that - I guess I didn't (and don't) see the connection to 4 (re-exporting of modules from other packages), nor does 4.2 seem to say anything about naming base differently.
Looks like an orthogonal issue to me. Am I missing something?
Probably it could be made clearer. In 4.2 the idea is that instead of replacing

base-2.0 ==> base-3.0 + directory-1.0 + array-1.0 + ...

you would replace

base-2.0 ==> newbase-1.0 + directory-1.0 + array-1.0 + ...

and additionally have a package base-3.0 that re-exports the whole of (newbase + directory + array + ...). An alternative to this last step, and I think what you had in mind, is to provide the original base-2.0; this is what proposal (2) talks about.
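A compatibility base-3.0 under proposal 4 might be described roughly as below. This is only a sketch: Cabal had no package re-export support at the time of this thread; the reexported-modules field shown here only appeared much later (Cabal 1.22), and the module list is illustrative, not complete.

```
name:               base
version:            3.0
build-depends:      newbase == 1.0, directory == 1.0, array == 1.0
-- Re-export the modules of the underlying packages unchanged, so
-- code saying "build-depends: base == 3.0.*" keeps compiling:
reexported-modules: Data.Maybe, Data.List,
                    System.Directory,
                    Data.Array
```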
But note that in order to keep base-2.0 around you need to either compile up a complete copy of it and all the packages that depend on it (proposal 2, a non-starter IMO),
Perhaps the document could elaborate why it is a non-starter?
This is the "functional" solution: keep old versions of everything and GC them later. It's feasible for Nix to do this; indeed it requires a system like Nix to even make this work, because most OS packaging systems have no notion of the difference between "process-2.0 compiled against base-2.0" as distinct from "process-2.0 compiled against base-3.0", and neither does GHC's packaging system or Cabal.

Also, this doesn't solve all the problems - if you need to use two packages, one of which only compiles against base-2.0 and the other only compiles against base-3.0, then in order to use them both in a program you almost certainly still have to modify one of them. The reason is that the types provided by base-2.0 (e.g. Bool) would be incompatible with those provided by base-3.0.

Thinking about this is really making my head hurt, I need to go back to doing something easy like writing Haskell :-)

Cheers, Simon

Simon Marlow wrote:
Probably it could be made clearer. In 4.2 the idea is that instead of replacing
base-2.0 ==> base-3.0 + directory-1.0 + array-1.0 + ...
you would replace
base-2.0 ==> newbase-1.0 + directory-1.0 + array-1.0 + ...
and additionally have a package base-3.0 that re-exports the whole of (newbase + directory + array + ...).
"Macros" in cabal: Why not just say that depending on base-3.0 actually means that you have to depend on newbase-1.0, and directory-1.0, etc...? Why is compiler support needed? Is it really possible that I still don't understand? Isaac

On Sun, 2007-10-28 at 13:49 -0400, Isaac Dupree wrote:
Simon Marlow wrote:
Probably it could be made clearer. In 4.2 the idea is that instead of replacing
base-2.0 ==> base-3.0 + directory-1.0 + array-1.0 + ...
you would replace
base-2.0 ==> newbase-1.0 + directory-1.0 + array-1.0 + ...
and additionally have a package base-3.0 that re-exports the whole of (newbase + directory + array + ...).
"Macros" in cabal: Why not just say that depending on base-3.0 actually means that you have to depend on newbase-1.0, and directory-1.0, etc...? Why is compiler support needed? Is it really possible that I still don't understand?
You'd still have the problem that every package has to specify this "macro" for itself. You'd want some global macro-database to avoid this. The better solution would be to just have a package that re-exports everything. I.e., the definition of package base-2.0 would look something like this:

if the system has base-3.0, directory, array, ...
    re-export
else
    exposed-modules: Data.Maybe, Data.List, ...
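Thomas's sketch is not valid Cabal syntax. The closest real mechanism is a conditional on an automatic flag, which is the idiom client packages actually used after the base-3 split (the flag name "split-base" is conventional, not built in, and the module list is illustrative):

```
flag split-base
  description: Build against the split base-3.0 + directory + array
  default:     True

library
  exposed-modules: Data.Maybe, Data.List
  if flag(split-base)
    build-depends: base >= 3.0, directory, array
  else
    build-depends: base < 3.0
```

When the split packages are unavailable, the solver turns the flag off and falls back to the monolithic base, which is how one .cabal file can cover both worlds.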

Thomas Schilling wrote:
On Sun, 2007-10-28 at 13:49 -0400, Isaac Dupree wrote:
Simon Marlow wrote:
Probably it could be made clearer. In 4.2 the idea is that instead of replacing
base-2.0 ==> base-3.0 + directory-1.0 + array-1.0 + ...
you would replace
base-2.0 ==> newbase-1.0 + directory-1.0 + array-1.0 + ...
and additionally have a package base-3.0 that re-exports the whole of (newbase + directory + array + ...). "Macros" in cabal: Why not just say that depending on base-3.0 actually means that you have to depend on newbase-1.0, and directory-1.0, etc...? Why is compiler support needed? Is it really possible that I still don't understand?
You'd still have the problem that every package has to specify this "macro" for itself. You'd want some global macro-database to avoid this. The better solution would be to just have a package that re-exports everything. I.e., the definition of package base-2.0 would look something like this:
if the system has base-3.0, directory, array, ...
    re-export
else
    exposed-modules: Data.Maybe, Data.List, ...
Of course it would be defined in the same place that base-3.0 would normally be defined if it existed (hypothetically supposing base was renamed to newbase in this case). Instead of Haskell code, a cabal file that just says, in order to depend on this "package", you (meaning the cabal mechanism, not the users) need to depend on this set of packages instead - which is repeatedly expanded until only ghc-packages (or the equivalent for whatever compiler it is) are in the list. Isaac

Thomas Schilling wrote:
On Sun, 2007-10-28 at 13:49 -0400, Isaac Dupree wrote:
Simon Marlow wrote:
Probably it could be made clearer. In 4.2 the idea is that instead of replacing
base-2.0 ==> base-3.0 + directory-1.0 + array-1.0 + ...
you would replace
base-2.0 ==> newbase-1.0 + directory-1.0 + array-1.0 + ...
and additionally have a package base-3.0 that re-exports the whole of (newbase + directory + array + ...). "Macros" in cabal: Why not just say that depending on base-3.0 actually means that you have to depend on newbase-1.0, and directory-1.0, etc...? Why is compiler support needed? Is it really possible that I still don't understand?
You'd still have the problem that every package has to specify this "macro" for itself. You'd want some global macro-database to avoid this. The better solution would be to just have a package that re-exports everything. I.e., the definition of package base-2.0 would look something like this:
if the system has base-3.0, directory, array, ...
    re-export
else
    exposed-modules: Data.Maybe, Data.List, ...
And this runs into trouble when base-3.0 has changes to datatypes and/or classes relative to base-2.0 (see 4.1 of http://hackage.haskell.org/trac/ghc/wiki/PackageCompatibility). This approach doesn't scale; it may be useful for providing backwards-compatible versions of smaller packages, but it is impractical for base. Cheers, Simon

Simon Marlow
Looks like an orthogonal issue to me. Am I missing something?
Probably it could be made clearer. In 4.2 the idea is that instead of replacing
base-2.0 ==> base-3.0 + directory-1.0 + array-1.0 + ...
you would replace
base-2.0 ==> newbase-1.0 + directory-1.0 + array-1.0 + ...
That's pretty clear, thanks. Couldn't you still do this by recompiling the libraries, *if* the re-exporting feature turns out to be a stumbling block?
Thinking about this is really making my head hurt, I need to go back to doing something easy like writing Haskell :-)
If it leads to better/more robust libraries, I'm happy to pick up the bill for that bottle of aspirin. :-) -k -- If I haven't seen further, it is by standing in the footprints of giants

Simon Marlow wrote:
Probably it could be made clearer. In 4.2 the idea is that instead of replacing
base-2.0 ==> base-3.0 + directory-1.0 + array-1.0 + ...
you would replace
base-2.0 ==> newbase-1.0 + directory-1.0 + array-1.0 + ...
and additionally have a package base-3.0 that re-exports the whole of (newbase + directory + array + ...).
There's another alternative, that Simon PJ pointed out to me this morning. Actually I think this is rather nice, as it solves the problem of having to rename base for each split.

Let's suppose that in GHC 6.10 we want to split some modules from base again (highly likely). So we make this change, for example:

base-3.0 ===> base-4.0 + concurrent-1.0 + generics-1.0

note that the new, smaller, base is called base-4.0. Then, we provide a wrapper package base-3.1, that re-exports (base-4.0 + concurrent-1.0 + generics-1.0). Why is it called 3.1, not 3.0? Because almost certainly by the time GHC 6.10 is released there will be some API changes in these packages too, and we aren't trying to reproduce the base-3.0 API exactly (that's proposal 4.1).

This amounts to adopting a convention for the base package: changes in the first component of the version indicate splits or removal of modules, and changes to the second component indicate other API changes only.

The idea is that all packages on Hackage will be using precise dependencies by this point, so they will all have something like

build-depends: base-3.0.*

Now, when GHC 6.10 is released, the majority of these packages will build again if the .cabal file is changed to say

build-depends: base >= 3.0 && <= 3.1

rather than introducing conditional dependencies as was necessary with the recent base split. In due course, packages can be upgraded to depend on the new base-4.0.

I'll call this proposal 4.3 and add it to the wiki. It seems like a pretty good compromise to me.

Cheers, Simon
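Under this convention the dependency evolution of a Hackage package might look like the following hypothetical stanzas (version bounds taken from Simon's message; only one build-depends would be live at a time):

```
-- Before GHC 6.10: pin the base-3.0 API (Simon's "base-3.0.*"):
build-depends: base == 3.0.*

-- After GHC 6.10: also accept the base-3.1 wrapper, which re-exports
-- base-4.0 + concurrent-1.0 + generics-1.0:
build-depends: base >= 3.0 && <= 3.1

-- In due course: port to the new, smaller base:
build-depends: base == 4.0.*
```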

Simon Marlow wrote:
Now, when GHC 6.10 is released, the majority of these packages will build again if the .cabal file is changed to say
build-depends: base >= 3.0 && <= 3.1
rather than introducing conditional dependencies as was necessary with the recent base split. In due course, packages can be upgraded to depend on the new base-4.0.
I'll call this proposal 4.3 and add it to the wiki. It seems like a pretty good compromise to me.
Yay, a good naming proposal! This explains what I meant by the thing with macros - of course it cannot provide base-2.0, but it provides something close to that base API, such as you would expect from a version number increase.

What if the GHC 6.8 branch's base has to change enough that it needs to be incremented to 3.1 by the PVP? Or maybe that should never happen. But the .cabal file should be changed _after_ the release of 6.10 (so that it will be accurate; otherwise it might be >= 3.0 && < 4), so we'll know what the version numbers are - as long as we never do a point release of a previous GHC branch once "3.1" has been taken by 6.10's base. It could be "3.5"... but then that theoretical 3.1 would be considered a valid dependency even if it contained different API changes that broke things... so, okay.

Isaac

Simon Marlow
http://hackage.haskell.org/trac/ghc/wiki/PackageCompatibility
Good summary. One minor point: the section at the end, on "The problem of lax version dependencies", refers to solution 3, but there _is_ no solution 3: the list jumps directly from 2 to 4. So I couldn't follow the reasoning there. Regards, Malcolm

I've forwarded this message from the libraries list because I think it's a very important topic for Haskell growth. Please answer only to libraries@haskell.org
This is a forwarded message
From: Simon Marlow

Duncan and I have started a wiki page to collect proposals for ways to avoid or alleviate the pain from future package reorganisations.
http://hackage.haskell.org/trac/ghc/wiki/PackageCompatibility
I've filled in the section on provides/requires with what I remember from our discussion. Claus

On Sat, 2007-10-27 at 01:14 +0100, Claus Reinke wrote:
Duncan and I have started a wiki page to collect proposals for ways to avoid or alleviate the pain from future package reorganisations.
http://hackage.haskell.org/trac/ghc/wiki/PackageCompatibility
I've filled in the section on provides/requires with what I remember from our discussion.
Thanks. I quite like the idea of inferring in much more detail what a package actually needs from its imports. Having Cabal do the module dependency analysis will make this a bit easier, but doing it at a finer granularity than the module list will require more complicated language processing. I suppose there is the question of how much should be in the language (like ML functors) and how much in the packaging system. Duncan
participants (9)
- Bulat Ziganshin
- Claus Reinke
- Duncan Coutts
- Ian Lynagh
- Isaac Dupree
- Ketil Malde
- Malcolm Wallace
- Simon Marlow
- Thomas Schilling