Re: Breaking Changes and Long Term Support Haskell

On Wed, Oct 21, 2015 at 8:42 PM, Gregory Collins wrote:
On Wed, Oct 21, 2015 at 3:18 PM, Geoffrey Mainland wrote:

My original email stated my underlying concern: we are losing valuable members of the community not because of the technical decisions that are being made, but because of the process by which they are being made.
[If] you're doing research you're on the treadmill, almost by definition, and you're delighted that we're finally making some rapid progress on fixing up some of the longstanding warts.
If you're a practitioner, you are interested in using Haskell for, y'know, writing programs. You're probably in one of two camps: you're in "green field" mode writing a lot of new code (early stage startups, prototype work, etc.), or you're maintaining/extending programs you've already written that are out "in the field" doing useful work for you. Laura Wingerd calls this the "annealing temperature" of software, and I think this is a nice metaphor to describe it. How tolerant you are of ecosystem churn depends on what your temperature is. And I think it should be obvious to everyone that Haskell having "success" for programming work would mean that lots of useful and correct programs get written, so everyone who is in the former camp will cool over time to join the latter.
I've made the point before and I don't really want to belabor it: our de facto collective posture towards breaking stuff, especially in the past few years, has been extremely permissive, and this alienates people who are maintaining working programs.
Even among people who reported that they teach Haskell or use it in industry today, the margin of preference for the concrete FTP proposal was ~79%. This was considerably higher than I expected, in two senses. One: far more people claimed to be in one of those two roles than I expected. Two: their appetite for change was higher than I expected. I initially expected to see a stronger "academic vs. industry" split in the poll, but the groups were distinguishable only by a delta of a few percentage points. So while I expected roughly the final percentage of the poll, based on the year I'd spent running around the planet to user group meetings and the like, I expected it mostly because I expected more hobbyists and less support among industrialists.
I'm actually firmly of the belief that the existing committee doesn't really have process issues, and in fact, that it has often been pretty careful to minimize the impact of the changes it wants to make. As others have pointed out, much of the churn actually comes from the platform libraries, which are outside the purview of this group.
Historically we've had a bit of a split personality on this front. Nothing that touched the Prelude had changed in 17 years. On the other hand, the platform libraries have sustained a pretty heavy rolling wave of breakage the entire time I've been around in the community. On the more experimental feature front, I've lost count of the number of different things we've done to Typeable or template-haskell.
All I'm saying is that if we want to appeal to or cater to working software engineers, we have to be a lot less cavalier about causing more work for them, and we need to prize stability of the core infrastructure more highly. That'd be a broader cultural change, and that goes beyond process: it's policy.
The way things are shaping up, we've had 17 years of rock-solid stability, then one release that incorporated changes designed to minimize impact, to the point that the majority of the objections against them are of the form where people would prefer that we broke _more_ code to get to a more sensible state. Going forward, it looks like the next 2 GHC releases will have basically nothing affecting the Prelude, and there will be another punctuation in the equilibrium around 8.4 as the next set of changes kicks in over 8.4 and 8.6. That gives 2 years' worth of advance notice of pending changes, and a pretty strong guarantee from the committee that you should be able to maintain code within a 3 release window without running afoul of warnings or needing CPP (a sketch of the kind of CPP this avoids follows below).

So, out of curiosity, what additional stability policy is it that you seek?

-Edward
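To make that CPP point concrete, here is a minimal sketch of the sort of conditional compilation a sliding compatibility window is meant to make unnecessary. The module and function names here are invented for illustration; MIN_VERSION_base is the macro Cabal defines for every build:

    {-# LANGUAGE CPP #-}
    module Compat where

    -- Before base 4.8 (GHC 7.10), Applicative and (<$>) were not
    -- exported from the Prelude, so portable code imported them
    -- explicitly behind a version guard.
    #if !MIN_VERSION_base(4,8,0)
    import Control.Applicative (Applicative(..), (<$>))
    #endif

    pairUp :: Applicative f => f a -> f (a, a)
    pairUp fa = (,) <$> fa <*> fa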

On 10/22/2015 02:40 AM, Edward Kmett wrote:
...
Thanks to you and Dan [1], I now have a greater understanding of, and appreciation for, where the committee has been coming from. My new understanding is that the changes formalized in AMP, FTP, and MRP were the basis for the committee's creation. It also seems that there are more changes in the pipeline that have not yet been made into proposals, e.g., pulling (>>) out of Control.Monad [2].

Part of "stability" is signaling change as far ahead as possible. The committee has put a lot of effort into this, which I appreciate! However, as each of these proposals has come down the pipeline, I never realized that they were part of a larger master plan.

1) What is the master plan, and where is it documented, even if this document is not up to the standard of a proposal? What is the final target, and when might we expect it to be reached? What is in the pipeline after MRP?

Relatedly, guidance on how to write code now so that it will be compatible with future changes helps mitigate the stability issue.

2) How can I write code that makes use of the Prelude so that it will work with every new GHC release over the next 3 years? 5 years? For example, how can I write a Monad instance now, knowing the changes that are coming, so that the instance will work with every new GHC release for the next 3 years? 5 years? If the answer is "you can't," then when might I be able to do such a thing? As of 8.4? 8.6? I'm embarrassed to say I don't know the answer!

Finally, if none of these changes broke Prelude backwards compatibility, far fewer people would be complaining :) Of course, we can't always make progress without breaking things, but a more deliberative process might offer an opportunity to make progress while still preserving backwards compatibility. Take AMP for example. There were at least two proposals [3] [4] for preserving backwards compatibility. Investigating them would have taken time and delayed AMP, yes, but why the rush?

3) Can we have a process that allows more deliberation over, and wider publicity for, changes that break backwards compatibility? The goal of such a process would not be to prevent change, but to allow more time to find possible solutions to the issue of backwards compatibility. My proposal for a low-traffic mailing list where all proposals were announced was meant to provide wider publicity.

Personally, I think these proposals do indeed fix a lot of warts in the language. As a researcher who actively uses Haskell every day, these warts have had approximately zero impact on me for the past (almost) decade, and I would be perfectly content if they were never fixed. The only pain I can recall enduring is having to occasionally write an orphan Applicative instance. I have been importing Prelude hiding (mapM) for years. I have been importing Control.Applicative for years. Neither has been painful. Dealing with AMP? I'm working on a collaborative research project that is stuck on 7.8 because of AMP. I agree, that seems silly, but whether or not it is silly, it is an impact I feel.

One way to look at these proposals is to ask the question "Wouldn't the language be nicer if all these changes were made?" Another is to ask the question "Does the fact that these changes have not been made make your life as a Haskell programmer more difficult in any significant way?" I answer "yes" to the former and "no" to the latter. Is our stance that answering "yes" to the former question is enough to motivate breaking change? Shouldn't an answer of "no" to the latter question cause some hesitation?

Maybe there are a lot of people who answer "yes" to both questions. I would like to know! But does having return in the Monad class really cause anyone anything other than existential pain?

Cheers,
Geoff

[1] https://mail.haskell.org/pipermail/libraries/2015-October/026390.html
[2] https://mail.haskell.org/pipermail/libraries/2015-September/026158.html
[3] https://ghc.haskell.org/trac/ghc/wiki/InstanceTemplates
[4] https://ghc.haskell.org/trac/ghc/wiki/IntrinsicSuperclasses
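The import pattern Geoff mentions looks like this in practice, as a sketch of the pre-AMP idiom (hiding the old monomorphic mapM in favor of the generalized one from Data.Traversable):

    import Prelude hiding (mapM)
    import Data.Traversable (mapM)

    -- Pre-AMP, Applicative and its operators came from here rather
    -- than from the Prelude.
    import Control.Applicative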

On 15-10-22 09:29 AM, Geoffrey Mainland wrote:
...
1) What is the master plan, and where is it documented, even if this document is not up to the standard of a proposal? What is the final target, and when might we expect it to be reached? What is in the pipeline after MRP?
Relatedly, guidance on how to write code now so that it will be compatible with future changes helps mitigate the stability issue.
I have been fully in favour of all the proposals implemented so far, and I think that having an explicit master plan would be a great idea. It would address some of the process-related objections that have been raised, and it would provide a fixed long-term target that would be much easier to make the whole community aware of and contribute to. For that purpose, the master plan should be advertised directly on the front page of haskell.org. Once we have it settled and agreed, the purpose of the base-library committee would essentially become to figure out details like the timeline and code migration path. One thing they wouldn't need to worry about is whether anybody disagrees with their goals.
2) How can I write code that makes use of the Prelude so that it will work with every new GHC release over the next 3 years? 5 years? For example, how can I write a Monad instance now, knowing the changes that are coming, so that the instance will work with every new GHC release for the next 3 years? 5 years? If the answer is "you can't," then when might I be able to do such a thing? As of 8.4? 8.6? I'm embarrassed to say I don't know the answer!
From the discussions so far, it appears that the answer for 3 years (or at least the next 3 GHC releases) is to write code that works with the current GHC and base, but this policy has not been codified anywhere yet. Knowing the upcoming changes doesn't help with making your code any more robust, and I think that's a shame. We could have a two-pronged policy:

- code that works and compiles with the latest GHC with no *warnings* will continue to work and compile with no *errors* with the following 2 releases, and
- code that also follows the forward-compatibility recommendations current for that version of GHC will continue to work and compile with no *errors* with the following 4 releases.

The forward-compatibility recommendations would become a part of the online GHC documentation, so nobody complains they didn't know about them. Personally, I'd prefer the recommendations to be built into the compiler itself as a new class of warnings, but then (a) some people would insist on turning them on together with -Werror and then complain when their builds break, and (b) this would increase the pressure on GHC implementors.
Finally, if none of these changes broke Prelude backwards compatibility, far fewer people would be complaining :) Of course, we can't always make progress without breaking things, but a more deliberative process might offer an opportunity to make progress while still preserving backwards compatibility. Take AMP for example. There were at least two proposals [3] [4] for preserving backwards compatibility. Investigating them would have taken time and delayed AMP, yes, but why the rush?
Because they have been investigated for years with no effect.
3) Can we have a process that allows more deliberation over, and wider publicity for, changes that break backwards compatibility? The goal of such a process would not be to prevent change, but to allow more time to find possible solutions to the issue of backwards compatibility.
I doubt we can, but this question has already been answered by others.

On Thu, Oct 22, 2015 at 12:20 PM, Mario Blažević wrote:
On 15-10-22 09:29 AM, Geoffrey Mainland wrote:
...
1) What is the master plan, and where is it documented, even if this document is not up to the standard of a proposal? What is the final target, and when might we expect it to be reached? What is in the pipeline after MRP?
Relatedly, guidance on how to write code now so that it will be compatible with future changes helps mitigate the stability issue.
I have been fully in favour of all the proposals implemented so far, and I think that having an explicit master plan would be a great idea. It would address some of the process-related objections that have been raised, and it would provide a fixed long-term target that would be much easier to make the whole community aware of and contribute to.
For that purpose, the master plan should be advertised directly on the front page of haskell.org. Once we have it settled and agreed, the purpose of the base-library committee would essentially become to figure out details like the timeline and code migration path. One thing they wouldn't need to worry about is whether anybody disagrees with their goals.
2) How can I write code that makes use of the Prelude so that it will work with every new GHC release over the next 3 years? 5 years? For example, how can I write a Monad instance now, knowing the changes that are coming, so that the instance will work with every new GHC release for the next 3 years? 5 years? If the answer is "you can't," then when might I be able to do such a thing? As of 8.4? 8.6? I'm embarrassed to say I don't know the answer!
From the discussions so far, it appears that the answer for 3 years (or at least the next 3 GHC releases) is to write code that works with the current GHC and base, but this policy has not been codified anywhere yet. Knowing the upcoming changes doesn't help with making your code any more robust, and I think that's a shame. We could have a two-pronged policy:

- code that works and compiles with the latest GHC with no *warnings* will continue to work and compile with no *errors* with the following 2 releases, and
- code that also follows the forward-compatibility recommendations current for that version of GHC will continue to work and compile with no *errors* with the following 4 releases.
We have adopted a "3 release policy" facing backwards, not forwards. However, all proposals currently under discussion actually meet a stronger condition: a 3 release policy that you can slide both forwards and backwards, picking the 3 releases you want to be compatible with, without using CPP. It also appears that all of the changes we happen to have in the wings https://ghc.haskell.org/trac/ghc/wiki/Status/BaseLibrary comply with both of your goals here.

However, I hesitate to say that we can simultaneously meet this goal, the 3 release policy facing backwards, _and_ sufficient notification in all situations, even ones we can't foresee today. As a guideline? Sure. If we have two plans that can reach the same end-goal and one complies while the other doesn't, I'd say we should favor the plan that gives more notice and assurance. However, this also needs to be tempered against the number of years folks suffer the pain of being in an inconsistent intermediate state (e.g. having generalized combinators in Data.List today).

The forward-compatibility recommendations would become a part of the online GHC documentation, so nobody complains they didn't know about them. Personally, I'd prefer the recommendations to be built into the compiler itself as a new class of warnings, but then (a) some people would insist on turning them on together with -Werror and then complain when their builds break, and (b) this would increase the pressure on GHC implementors.
The current discussion is centering around adding a -Wcompat flag that warns about changes you may not yet be able to accommodate in a way that stays backwards compatible within a 3 release backwards-facing window, but which will eventually cause issues (a sketch follows below).

-Edward
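As a rough sketch of the kind of definition such a flag could catch, assuming it covers canonicity of return and pure ahead of the MRP-style changes discussed in this thread (the type M is invented for illustration):

    module M where

    newtype M a = M a

    instance Functor M where
      fmap f (M a) = M (f a)

    instance Applicative M where
      pure = M
      M f <*> M a = M (f a)

    instance Monad M where
      M a >>= f = f a
      -- Legal today, but once return moves out of the class this
      -- definition stops compiling. A compatibility warning can flag
      -- it years in advance; the forward-compatible spelling is to
      -- omit it and let return default to pure.
      return = M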

On Thu, Oct 22, 2015 at 9:29 AM, Geoffrey Mainland wrote:
Thanks to you and Dan [1], I now have a greater understanding of, and appreciation for, where the committee has been coming from. My new understanding is that the changes formalized in AMP, FTP, and MRP were the basis for the committee's creation. It also seems that there are more changes in the pipeline that have not yet been made into proposals, e.g., pulling (>>) out of Control.Monad [2]. Part of "stability" is signaling change as far ahead as possible. The committee has put a lot of effort into this, which I appreciate! However, as each of these proposals has come down the pipeline, I never realized that they were part of a larger master plan.
The "master plan" where (>>) is concerned is that it'd be nice to get Traversable down to a minimal state and to eliminate unnecessary distinctions in the Prelude between things like mapM and traverse. Right now they have different type constraints, but this is entirely a historical artifact, and it causes problems: we have a situation where folks have commonly optimized (>>) but left (*>) unfixed. This yields different performance for mapM_ and traverse_.

A consequence of the AMP is that neither one of those could be defined in terms of the other: (*>) has a default definition in terms of (<*>), and (>>) has a default definition in terms of (>>=). With two places where optimizations can happen, and two different definitions for operations that are logically required to be the same thing, we can and do see rather radically different performance between these two things (see the sketch below).

This proposal is something that was put out as a sort of addendum to the Monad of No Return proposal for discussion, but unlike MRP it has no particular impact on a sacred cow like return. We have yet to put together a timeline that incorporates the (>>) changes from MRP.
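Here is a small sketch of that situation; the Count type is invented for illustration:

    -- A toy counting monad.
    newtype Count a = Count (Int -> (a, Int))

    instance Functor Count where
      fmap f (Count g) = Count (\n -> let (a, n') = g n in (f a, n'))

    instance Applicative Count where
      pure a = Count (\n -> (a, n))
      Count mf <*> Count ma =
        Count (\n -> let (f, n')  = mf n
                         (a, n'') = ma n'
                     in (f a, n''))
      -- (*>) is left at its default, (id <$ a) <*> b, which
      -- needlessly builds and applies a throwaway function.

    instance Monad Count where
      Count ma >>= f =
        Count (\n -> let (a, n') = ma n; Count mb = f a in mb n')
      -- A hand-optimized (>>) that skips the result entirely.
      Count ma >> Count mb =
        Count (\n -> let (_, n') = ma n in mb n')

Now mapM_ (which chains with >>) takes the fast path, while traverse_ (which chains with *>) goes through the slower default, even though the two operators are logically required to agree.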
1) What is the master plan, and where is it documented, even if this document is not up to the standard of a proposal? What is the final target, and when might we expect it to be reached? What is in the pipeline after MRP?

Relatedly, guidance on how to write code now so that it will be compatible with future changes helps mitigate the stability issue.
The current plans more or less stop with finishing the MonadFail proposal, getting Semigroup in as a superclass of Monoid, and incorporating some additional members into Floating. The working document for the timeline going forward is available here: https://ghc.haskell.org/trac/ghc/wiki/Status/BaseLibrary
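For the Semigroup change in particular, instances can already be written forward-compatibly by routing mappend through (<>). A sketch, where Trace is an invented type (before base 4.9, Data.Semigroup comes from the semigroups package):

    import Data.Semigroup (Semigroup(..))

    newtype Trace = Trace [String]

    instance Semigroup Trace where
      Trace xs <> Trace ys = Trace (xs ++ ys)

    -- When Semigroup becomes a superclass of Monoid, this instance
    -- keeps compiling unchanged: mappend already delegates to (<>).
    instance Monoid Trace where
      mempty  = Trace []
      mappend = (<>)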
2) How can I write code that makes use of the Prelude so that it will work with every new GHC release over the next 3 years? 5 years? For example, how can I write a Monad instance now, knowing the changes that are coming, so that the instance will work with every new GHC release for the next 3 years? 5 years? If the answer is "you can't," then when might I be able to do such a thing? As of 8.4? 8.6? I'm embarrassed to say I don't know the answer!
We have a backwards-facing "3 release policy" that says it should always be possible to write code that works backwards for 3 releases. This means that changes like moving fail out of Monad will take 5 years. However, maintaining both that and a _forward-facing_ 3 release policy would mean that any change that introduced a superclass would take something like 9 years of intermediate states that make no sense to complete. *9 years to move one method.*

Now looking forward: you can write code today with 7.10 that will work without warnings until 8.2. That happens to be 3 releases. In 8.4 you'll start to get warnings about the Semigroup and MonadFail changes, but looking at it as 3 releases going forward, in 8.0 you can just write the instances and your code would be warning-free going forward for 3 releases. In 8.6 those changes go into effect, but you will have been able to make the code changes needed to accommodate 8.6 since 8.0. The current roadmap happens to give you a 3 year sliding window (a sketch of such an instance follows below).
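Concretely, an instance written against 7.10 can already be spelled so that none of the pending changes touch it. A sketch, with Box as an invented type:

    newtype Box a = Box a

    instance Functor Box where
      fmap f (Box a) = Box (f a)

    instance Applicative Box where
      pure = Box
      Box f <*> Box a = Box (f a)

    instance Monad Box where
      Box a >>= f = f a
      -- No return (it defaults to pure), no fail, and no custom
      -- (>>): exactly the methods the pending proposals touch.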
Finally, if none of these changes broke Prelude backwards compatibility, far fewer people would be complaining :)
If none of our changes were ever able to break Prelude backwards compatibility, then the same people who had been complaining about the utter lack of progress for the previous 17 years, a frustration that nearly exploded the community 2 years ago, would still be complaining, and based on polling and discussions that is actually a much larger group. The AMP passed nearly unanimously.
Of course, we can't always make progress without breaking things, but a more deliberative process might offer an opportunity to make progress while still preserving backwards compatibility. Take AMP for example. There were at least two [3] [4] proposals for preserving backwards compatibility. Investigating them would have taken time and delayed AMP, yes, but why the rush?
We've been talking about various superclass defaulting proposals for the better part of a decade, and no progress has been made. The rush was that we'd been letting them block every previous discussion, and that the concrete plan with an actual implementation on hand was a very popular proposal even without that mitigation strategy.
3) Can we have a process that allows more deliberation over, and wider publicity for, changes that break backwards compatibility? The goal of such a process would not be to prevent change, but to allow more time to find possible solutions to the issue of backwards compatibility.
My proposal for a low-traffic mailing list where all proposals were announced was meant to provide wider publicity.
I don't think anybody has an objection to wider visibility of proposals that affect things mentioned in the Haskell Report.
Personally, I think these proposals do indeed fix a lot of warts in the language. As a researcher who actively uses Haskell every day, these warts have had approximately zero impact on me for the past (almost) decade, and I would be perfectly content if they were never fixed. The only pain I can recall enduring is having to occasionally write an orphan Applicative instance. I have been importing Prelude hiding (mapM) for years. I have been importing Control.Applicative for years. Neither has been painful.
And yet the vast preponderance of public opinion lies in the other camp. The "change nothing" policy had an iron grip on the state of affairs for 17 years, and there were serious cracks starting to form from the appearance that nothing could ever be fixed if the Prelude was affected in any way. The only thing that broke with that pattern was when Ian Lynagh unilaterally removed Eq and Show as superclasses of Num. That was more or less the first glimmer that the world wouldn't end if deliberated changes were made to the Prelude.
Dealing with AMP? I'm working on a collaborative research project that is stuck on 7.8 because of AMP. I agree, that seems silly, but whether or not it is silly, it is an impact I feel.
What changes did you face, beyond writing

    import Control.Monad (liftM, ap)

    instance Functor Foo where
      fmap = liftM

    instance Applicative Foo where
      pure  = return
      (<*>) = ap

that are AMP related?
Maybe there are a lot of people who answer "yes" to both questions. I would like to know! But does having return in the Monad class really cause anyone anything other than existential pain?
The MRP is by far the most marginal proposal on the table. This is why it remains *just a proposal* and not part of the roadmap. That said, moving return to a top-level definition will mean that more code that is compiled will be able to infer an Applicative constraint. The other proposals that are on the roadmap, on the other hand, are much easier to defend.

The (>>) fragment of MRP fixes rampant performance regressions. We went to generalize the implementation of mapM_ to use (*>) internally and found performance regressions within base itself due to instances that are optimized inconsistently. This informed the design here. More code will infer with weaker Applicative constraints, Traversable can eventually be simplified, and folks like Simon Marlow, who have people internally at Facebook using mapM, will just have their code "work" in Haxl. I can answer "yes" to both of your questions here.

The continued existence of fail in Monad, on the other hand, has caused a great deal of pain in instances for things like `Either a` for years. To supply `fail`, we used to incur a needless Error a constraint. We can be more precise and remove a potential source of partiality from a lot of code. I can answer "yes" to both of your questions here.

The lack of Semigroup as a superclass of Monoid has meant that the Monoid instance for Maybe adds a unit to something that already has a unit. It means that First and Last, etc. all uselessly tack on an extra case that everyone has to handle. It has dozens of knock-on consequences. Much code that currently needs only a semigroup falls back on a monoid, because of the lack of a proper class relationship, or gets duplicated. I can answer "yes" to both of your questions here.

The numerics changes to Floating mean that Haskell numerics currently just have awful precision. Adding expm1, etc. to Floating means that people will be able to write decent numerical code without having to choose between generality (using exp from Floating, which works everywhere) and accuracy (a sketch of the precision problem follows below). I can answer "yes" to both of your questions here.

-Edward
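To make that precision point concrete, here is a small sketch of the cancellation problem that a dedicated expm1 addresses (values in the comments are approximate):

    -- For x near 0, exp x is within one ulp of 1, so subtracting 1
    -- cancels nearly every significant digit.
    naive :: Double -> Double
    naive x = exp x - 1

    -- For tiny x, the Taylor expansion x + x^2/2 is already exact to
    -- Double precision, which is the accuracy a dedicated expm1
    -- provides across the whole range.
    viaTaylor :: Double -> Double
    viaTaylor x = x + x*x/2

    main :: IO ()
    main = do
      let x = 1.0e-12 :: Double
      print (naive x)      -- only ~4 accurate significant digits
      print (viaTaylor x)  -- accurate to full Double precision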