
On Tue, Feb 25, 2014 at 4:51 PM, Vincent Hanquez wrote:
> I'm not saying this is not painful, but I've done it in the past, and by using bisection ("dichotomy") and educated guesses (for example, not using library versions released after a certain date), you converge on a solution pretty quickly.
> But the bottom line is that it's not the common use case. I rarely have to dig up old unused code.
And I have code that I would like to have working today, but it's too expensive to go through this process. The code has significant value to me and other people, but not enough to justify the large cost of getting it working again.
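For concreteness, the date-based bisection described above can be approximated with hard version constraints, ruling out everything released after a chosen date. This is only an illustrative sketch; the package names and bounds below are hypothetical, not taken from any real project:

```
-- cabal.config (hypothetical upper bounds chosen by release date)
constraints: text <= 0.11.2.0,
             bytestring <= 0.9.2.1,
             network <= 2.3.1.0
```

Tightening or loosening these bounds and rebuilding is the "dichotomy" step: each attempt halves the range of candidate versions.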
> This is moot IMHO. A large organisation would *not* rely on cabal, nor on the PVP, to actually download packages properly:
Sorry, let me rephrase: s/large organizations/organizations/. Not everyone is big enough to devote the kind of resources it would take to set up their own system; I've personally worked at two such companies. Building tools that can serve the needs of these organizations will help the Haskell community as a whole.
> Not only is this insecure but, as Michael mentioned, you would not get the guarantees you need anyway.
In many cases security doesn't matter, because the code doesn't interact with the outside world. And we're not talking about guaranteeing that building with a later version is buggy; we're talking about guaranteeing that the package will work the way it has always worked. It's a kind of package-level purity/immutability.
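That package-level immutability is essentially what a freeze file captures: every transitive dependency pinned to an exact version. A minimal sketch, assuming output in the style of `cabal freeze` (the versions shown are illustrative only):

```
-- cabal.config, as generated by `cabal freeze` (illustrative versions)
constraints: base ==4.6.0.1,
             containers ==0.5.0.0,
             text ==0.11.3.1
```

With exact `==` constraints checked into the repository, rebuilding later resolves to the same install plan, independent of what has since been uploaded to Hackage.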
> Even if the above weren't an issue, Haskell doesn't run in a bubble. I don't expect old GHC versions and old packages to work with newer operating systems and newer libraries forever.
I don't expect that either. I expect old packages to work the way they always worked, with the packages they always worked with.