Delay between hackage upload and cabal-install seeing package

I've uploaded Takusen-0.8.5 to hackage, and it's visible at http://hackage.haskell.org/cgi-bin/hackage-scripts/package/Takusen but "cabal install Takusen" still tries to get 0.8.4. Is there a separate database/file for cabal install which is rebuilt periodically, and if so, how often? I.e. how long is the delay between upload and being able to test with cabal-install? Thanks, Alistair

On Thu, May 14, 2009 at 10:52:36AM +0100, Alistair Bayley wrote:
I've uploaded Takusen-0.8.5 to hackage, and it's visible at http://hackage.haskell.org/cgi-bin/hackage-scripts/package/Takusen
but "cabal install Takusen" still tries to get 0.8.4. Is there a separate database/file for cabal install which is rebuilt periodically, and if so, how often? I.e. how long is the delay between upload and being able to test with cabal-install?
The database is http://hackage.haskell.org/packages/archive/00-index.tar.gz It's updated before the upload returns, and indeed Takusen-0.8.5 is in there. A web caching glitch?

2009/5/14 Ross Paterson
On Thu, May 14, 2009 at 10:52:36AM +0100, Alistair Bayley wrote:
I've uploaded Takusen-0.8.5 to hackage, and it's visible at http://hackage.haskell.org/cgi-bin/hackage-scripts/package/Takusen
but "cabal install Takusen" still tries to get 0.8.4. Is there a separate database/file for cabal install which is rebuilt periodically, and if so, how often? I.e. how long is the delay between upload and being able to test with cabal-install?
The database is
http://hackage.haskell.org/packages/archive/00-index.tar.gz
It's updated before the upload returns, and indeed Takusen-0.8.5 is in there. A web caching glitch?
Possibly, within cabal? I ran "cabal update" and that seemed to fix it. Does cabal-install check that its local database is up to date? Alistair

On Thu, May 14, 2009 at 10:52:36AM +0100, Alistair Bayley wrote:
I've uploaded Takusen-0.8.5 to hackage... but "cabal install Takusen" still tries to get 0.8.4. Is there a separate database/file for cabal install ...
Yes, and it is stored locally on your machine. You must run "cabal update" to refresh your local cache. Personally, I would prefer "cabal install" to automatically refresh its own local cache (when appropriate), without this extra step. If cabal is going to reach out over the net to download package sources, it might as well ensure that it collects the latest version of the index first. Regards, Malcolm

2009/5/14 Malcolm Wallace
Yes, and it is stored locally on your machine. You must do cabal update to refresh your local machine cache.
Personally, I would prefer "cabal install" to automatically refresh its own local cache (when appropriate), without this extra step. If cabal is going to reach out over the net to download package sources, it might as well ensure that it collects the latest version of the index first.
Ah. I assumed that it already did that. Thanks, Alistair

On Thu, 2009-05-14 at 12:09 +0100, Malcolm Wallace wrote:
On Thu, May 14, 2009 at 10:52:36AM +0100, Alistair Bayley wrote:
I've uploaded Takusen-0.8.5 to hackage... but "cabal install Takusen" still tries to get 0.8.4. Is there a separate database/file for cabal install ...
Yes, and it is stored locally on your machine. You must do cabal update to refresh your local machine cache.
Personally, I would prefer "cabal install" to automatically refresh its own local cache (when appropriate), without this extra step. If cabal is going to reach out over the net to download package sources, it might as well ensure that it collects the latest version of the index first.
I accept that the current behaviour is not perfect, but I don't see an obvious perfect solution.

The difficulty with this is that we use cabal install for things we expect to be local. We don't know that we will need to go to the network until we've planned what to do, and that involves already having the info. We also support offline operation; we have cabal fetch for just this reason. I added a check for the package db being very old, and we could make the error message you get when a package is not found mention that you might want to update.

The other thing is that sometimes you don't want the package database to be silently updated. You want to stick with a particular snapshot. It's important that network access be fairly clear to users.

Duncan
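The age check mentioned above could look roughly like this. A minimal sketch, assuming a fixed threshold; the names and the fifteen-day cutoff are invented for illustration, not cabal-install's actual code:

```haskell
import Data.Time

-- Hypothetical threshold: warn if the cached index is older than this.
maxIndexAge :: NominalDiffTime
maxIndexAge = 15 * 24 * 60 * 60  -- fifteen days, in seconds

-- True when the cached 00-index.tar.gz is old enough that the error
-- message should suggest running "cabal update".
indexIsStale :: UTCTime -> UTCTime -> Bool
indexIsStale now lastUpdated = diffUTCTime now lastUpdated > maxIndexAge
```

A check like this costs nothing at plan time, since it only compares timestamps and never touches the network.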

Personally, I would prefer "cabal install" to automatically refresh its own local cache (when appropriate), without this extra step. If cabal is going to reach out over the net to download package sources, it might as well ensure that it collects the latest version of the index first.
The difficulty with this is that we use cabal install for things we expect to be local. We also support offline operation; we have cabal fetch for just this reason.
Yes, I think it is important that cabal install can be used off-line. Probably the design should be thus:

* construct a build plan using the local cache
* decide whether network downloads are required
* if not, proceed with the plan and local resources
* if yes, then download the new index, re-do the plan, and proceed.

I don't know how soon during the construction of the build plan cabal can detect that downloads are required, i.e. only after the full plan is constructed, or as soon as any non-local package is mentioned. However, in any case, surely the planning phase is not so computationally expensive that doing it twice would give a noticeable delay?
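That flow can be sketched as follows. The types and names here are invented for illustration; cabal-install's real planner is of course far more involved:

```haskell
-- A trivial stand-in for a build plan: which of the requested packages
-- are satisfied locally, and which would need downloading.
data Plan = Plan { localOnly :: [String], toDownload :: [String] }

needsNetwork :: Plan -> Bool
needsNetwork = not . null . toDownload

-- The proposed steps: plan against the cached index; only if the plan
-- requires downloads, fetch a fresh index and plan once more.
installFlow :: (index -> Plan) -> index -> IO index -> IO Plan
installFlow planWith cachedIndex fetchFreshIndex = do
  let firstTry = planWith cachedIndex
  if needsNetwork firstTry
    then fmap planWith fetchFreshIndex
    else return firstTry
```

In the purely local case no network action ever runs, so offline use keeps working; only a plan that already requires downloads triggers the index refresh and the re-plan.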
The other thing is that sometimes you don't want the package database to be silently updated. You want to stick with a particular snapshot. It's important that network access be fairly clear to users.
If users want to stick with a snapshot index, that could be a cmdline option, e.g. cabal install --not-latest (or --no-update?). I do think that the current behaviour is occasionally less than transparent to users, and a sensible default would be to auto-update. Regards, Malcolm

On Fri, 2009-05-15 at 15:23 +0100, Malcolm Wallace wrote:
Personally, I would prefer "cabal install" to automatically refresh its own local cache (when appropriate), without this extra step. If cabal is going to reach out over the net to download package sources, it might as well ensure that it collects the latest version of the index first.
The difficulty with this is that we use cabal install for things we expect to be local. We also support offline operation; we have cabal fetch for just this reason.
Yes, I think it is important that cabal install can be used off-line. Probably the design should be thus:
* construct a build plan using local cache
* decide whether network downloads are required
* if not, proceed with plan and local resources
* if yes, then download new index, re-do the plan, and proceed.
I don't know how soon during the construction of the build plan cabal can detect that downloads are required, i.e. only after the full plan is constructed, or as soon as any non-local package is mentioned. However, in any case, surely the planning phase is not so computationally expensive that doing it twice would give a noticeable delay?
Probably not noticeable. Getting the index, however, is quite noticeable. It's currently getting rather large (much bigger than most packages). I think we'd want to implement a scheme to trickle updates before we start to grab updates more frequently/implicitly.
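The essence of a trickle-update scheme is that the client only needs the index entries it does not already have. A purely illustrative sketch, not how Hackage's index was actually served:

```haskell
import qualified Data.Set as Set

-- Given the (package, version) entries the client already has and the
-- server's full entry list, the delta is just the entries the client
-- is missing -- usually far smaller than the whole 00-index.tar.gz.
indexDelta :: [(String, String)] -> [(String, String)] -> [(String, String)]
indexDelta clientHas serverHas =
  let have = Set.fromList clientHas
  in filter (`Set.notMember` have) serverHas
```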
The other thing is that sometimes you don't want the package database to be silently updated. You want to stick with a particular snapshot. It's important that network access be fairly clear to users.
If users want to stick with a snapshot index, that could be a cmdline option, e.g. cabal install --not-latest (or --no-update?). I do think that the current behaviour is occasionally less than transparent to users, and a sensible default would be to auto-update.
Yes, the first thing to do would be to add a mode switch that says whether the user wants offline or online behaviour. Then we can think about the circumstances in which the default should be one or the other, and let the user override it when they want the non-default. Duncan
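Such a mode switch could be as simple as the following. The flag names here are made up for the sake of the example, not actual cabal-install flags:

```haskell
data NetMode = Offline | Online deriving (Eq, Show)

-- An explicit flag wins; otherwise fall back to whichever default is
-- chosen. (A real implementation might let later flags override
-- earlier ones; this sketch simply prefers --offline.)
netMode :: NetMode -> [String] -> NetMode
netMode def args
  | "--offline" `elem` args = Offline
  | "--online"  `elem` args = Online
  | otherwise               = def
```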
participants (4)
- Alistair Bayley
- Duncan Coutts
- Malcolm Wallace
- Ross Paterson