
On 4/24/2012 11:49 PM, wren ng thornton wrote:
On 4/24/12 9:59 AM, Gregg Lebovitz wrote:
The question of how to support rapid innovation and stable deployment is not an us-versus-them problem; it is a matter of staging releases. The Linux kernel is a really good example: the Linux development team innovates faster than the community can absorb it. The same was true of the GNU team. Distributions addressed the gap by staging releases.
In that case, what you are interested in is not Hackage (the too-fast torrent of development) but rather the Haskell Platform (a policed set of stable/core libraries with staged releases).
No, that was not what I was thinking, because a stable, policed set of core libraries is at the opposite end of the spectrum from how you describe Hackage. What I am suggesting is a way of creating an upstream that feeds increasingly stable code into an ever-increasing set of stable and useful components. In the current open-source model, the core gcc team releases the compiler along with libgcc and libstdc++, the GNU folks release more useful libraries, and then projects like GNOME build on those components.

Right now we have Hackage, which moves too fast, and the Haskell core, which rightfully moves more slowly. Maybe the answer is to add a rating system to Hackage and mark packages as experimental, unsupported, or supported, or to use a five-star rating system like an app store's. Later on, when we have appropriate testing tools, we can also include a rating derived from automated tests.
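Just to make the idea concrete enough to argue about, here is a rough Haskell sketch of what I mean by combining a manually assigned tier with a score from automated tests. Everything in it (the PackageRating module, the Tier names, the overallRating function) is hypothetical, not anything Hackage actually has today:

    -- A minimal sketch, not a design: hypothetical names throughout.
    module PackageRating where

    -- Manually assigned stability tier, as proposed above.
    data Tier = Experimental | Unsupported | Supported
      deriving (Show, Eq, Ord)

    -- A one-to-five star rating, like an app store's.
    newtype Stars = Stars Int
      deriving (Show, Eq, Ord)

    -- Clamp a raw score into the 1..5 range.
    mkStars :: Int -> Stars
    mkStars = Stars . max 1 . min 5

    -- Combine the manual tier with the fraction of automated tests
    -- passed (0.0 to 1.0): supported packages start from a higher
    -- floor, experimental ones end up capped lower.
    overallRating :: Tier -> Double -> Stars
    overallRating tier passRate = mkStars (base + round (passRate * 2))
      where
        base = case tier of
          Experimental -> 1
          Unsupported  -> 2
          Supported    -> 3

Under that toy scoring, a supported package passing 90% of its tests comes out at five stars, while an experimental package with no passing tests sits at one star. The exact weighting doesn't matter; the point is that the manual tier and the automated score can feed one visible rating.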
I forget who the best person to contact is these days if you want to get involved with helping the HP, but I'm sure someone on the list will say shortly :)