
Personally, I would prefer "cabal install" to automatically refresh its own local cache (when appropriate), without this extra step. If cabal is going to reach out over the net to download package sources, it might as well ensure that it collects the latest version of the index first.
The difficulty with this is that we use cabal install for things we expect to be local. We also support offline operation; we have cabal fetch for just this reason.
Yes, I think it is important that cabal install can be used off-line. Probably the design should be thus:
* construct a build plan using the local cache
* decide whether network downloads are required
* if not, proceed with the plan and local resources
* if yes, then download a new index, re-do the plan, and proceed.
I don't know how soon during the construction of the build plan cabal can detect that downloads are required, i.e. only after the full plan is constructed, or as soon as any non-local package is mentioned. However, in any case, surely the planning phase is not so computationally expensive that doing it twice would give a noticeable delay?
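The proposed two-pass flow above could be sketched roughly as follows. This is only an illustration of the control flow, not real cabal-install code: every name here (Index, Plan, planWith, plannedDownloads, refreshIndex) is hypothetical.

```haskell
-- Hypothetical types standing in for cabal's package index and build plan.
data Index = LocalIndex | FreshIndex deriving (Eq, Show)

data Plan = Plan { usedIndex :: Index, plannedDownloads :: Bool }

-- Pretend planner: downloads are needed iff any non-local target is requested.
planWith :: Index -> [String] -> Plan
planWith idx remoteTargets = Plan idx (not (null remoteTargets))

-- Stand-in for fetching the latest index over the network.
refreshIndex :: IO Index
refreshIndex = return FreshIndex

-- The two-pass install: plan against the local cache first; only if that
-- plan needs the network, refresh the index and re-plan before proceeding.
install :: [String] -> IO Plan
install remoteTargets = do
  let firstPass = planWith LocalIndex remoteTargets
  if not (plannedDownloads firstPass)
    then return firstPass                       -- purely local: stay offline
    else do
      idx <- refreshIndex                       -- network needed anyway
      return (planWith idx remoteTargets)       -- re-plan with fresh index
```

The point of the sketch is that the offline case never touches the network, while the online case pays only the cost of one extra (cheap) planning pass.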
The other thing is that sometimes you don't want the package database to be silently updated. You want to stick with a particular snapshot. It's important that network access be fairly clear to users.
If users want to stick with a snapshot index, that could be a command-line option, e.g. cabal install --not-latest (or --no-update?). I do think that the current behaviour is occasionally less than transparent to users, and that auto-updating would be a sensible default. Regards, Malcolm