Comment on "The Historical Futurism of Haskell" by Andrew Boardman

Hi!

I am referring to the impressive talk/show/thought-provoking message about "The Historical Futurism of Haskell" delivered by Andrew Boardman at the Haskell Love Conference. I am not sure if I am allowed to link to the talk myself, since it is password protected and was part of a conference. Maybe this can be done by the conference organizers, or by Andrew himself, if they please.

In summary, and please add your comments or correct me, Andrew argues that we truly believe that Haskell -- as a statically typed, lazy, pure, functional language -- is an incredibly powerful tool that should be made available to everybody, now and for the next thirty years. However, he argues that this power is hard to access because we have a lot of "adequate" tooling for a fantastic language. He adds that we need superb tooling in order for "beginners to learn faster, and experienced programmers to accelerate productivity". He refers to overhauling programming environments to reflect data flow across and within functions, and to "setting higher standards" for ourselves so that we finally "lift ourselves out of the pit of adequacy". He also asks whether it is really necessary to be in "crisis mode" before we invest in fundamental changes to the ecosystem (escalation of commitment).

I deeply agree, and wanted to thank Andrew for his talk. Personally, it seemed to me that Andrew was referring more to the programming environment than to the set of available libraries, although this may not have been his intention. I wanted to express that, in my opinion, in order to drive Haskell forward we need a high-quality, reliable, performant, and well-maintained "standard" library. Haskell has many "adequate" libraries but few superb ones. I understand that the immense advantages of being an open-source community also come at a cost: a lot of the work is done in our free time, because we have limited funding for improving libraries or development environments. Many libraries are maintained by a single individual who may or may not have the time and resources to look at open issues or possible improvements.

In particular, I am a mathematician/statistician working in evolutionary biology. I work with multivariate distributions (hardly any of those are readily available on Hackage). I work with a lot of random numbers (the support for random sampling is mediocre at best; 'splitmix' is the standard by now, but it is not supported by Haskell's most important statistics library). I work with numerical optimization (I envy Pythonians for their libraries, although I still prefer Haskell because what I do achieve, I at least get right). I work with Markov chains (yes, I had to write my own MCMC library in order to run proper Markov chains). I need to plot my data (there is no superb standard plotting library available in Haskell). By now, I do maintain library packages providing answers to some of these problems, but it was (and is) a lot of work.

Finally, I want to thank all library developers for their impressive work, thank you! And still, I think it is not enough. In my opinion, these are all examples where Haskell needs to improve if we want to broaden adoption among the general public. Do we have the resources?

Thank you!
Dominik
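P.S. To make the point about 'splitmix' concrete: since random-1.2, the default StdGen is backed by splitmix, so pure generators can be split into independent, reproducible streams; that is the kind of foundation I would like the statistics libraries to build on. Below is a minimal sketch, nothing more than an illustration with names of my own choosing:

  import System.Random

  -- Draw n uniform doubles from a pure generator.
  uniforms :: Int -> StdGen -> [Double]
  uniforms n g = take n (randomRs (0, 1) g)

  main :: IO ()
  main = do
    let g        = mkStdGen 2021
        (g1, g2) = split g  -- two independent streams
    print (uniforms 3 g1)
    print (uniforms 3 g2)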

Let me second this. I've worked on a few different supply chain problems over the last five years, and Haskell *as a language* would have been perfect, if only it had more libraries. I did do some work in Haskell (which was great), but in other areas it just didn't make sense :/

Modeling problems have a combination of mathematical and domain-specific complexity that lets Haskell's types and expressiveness really shine. Functional programming is a natural fit for this area; most of the Numpy/Python code I see at work sticks to immutable operations. It would not be much of an exaggeration to call Pandas a purely functional DSL embedded in Python. I taught Haskell to one of my colleagues with a PhD in operations research, and he said it was the first time a programming language matched how he thought about his work.

Tooling wouldn't necessarily be a bottleneck either. I still believe that friendly and stable tooling is important, but the state of Python tooling (somehow less consistent and more confusing than Haskell's tools) shows that people will work around tooling issues if everything else falls into place. Missing foundational libraries? Not so much.

On Sun, 12 Sep 2021, Dominik Schrempf wrote:
I work with Markov chains (yes, I had to write my own MCMC library in order to run proper Markov chains),
I don't know what "proper" Markov chains are; in any case, I wrote my own simple lazy Markov chain generator years ago that fulfilled my needs: https://hackage.haskell.org/package/markov-chain
I have also written a package for Hidden Markov models: https://hackage.haskell.org/package/hmm-lapack
I need to plot my data (there is no superb standard plotting library available in Haskell). By now, I do maintain library packages providing answers to some of these problems, but it was (and is) a lot of work.
If you write packages that fulfill the needs of your applications, that's certainly better than writing impressive frameworks where no one knows whether they are usable at all. :-)

Henning Thielemann writes:
On Sun, 12 Sep 2021, Dominik Schrempf wrote:
I work with Markov chains (yes, I had to write my own MCMC library in order to run proper Markov chains),
I don't know what "proper" Markov chains are; in any case, I wrote my own simple lazy Markov chain generator years ago that fulfilled my needs: https://hackage.haskell.org/package/markov-chain
I have also written a package for Hidden Markov models: https://hackage.haskell.org/package/hmm-lapack
Thank you for mentioning this. I was playing around with 'markov-chain' some time ago! I was looking for something more generic, for example something where the transition rate/probability matrices can be defined directly. I am not so familiar with hidden Markov models. With respect to the term "proper", thanks for picking that up :); I should have been more specific: I was referring to Markov chain Monte Carlo samplers. For a reference implementation, please see the 'mcmc' library in R [1]; for a feature-rich one, see, e.g., 'PyMC3' [2].
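To make concrete what I mean by a sampler, below is a minimal sketch of a one-dimensional Metropolis sampler written against plain System.Random. It is only an illustration with made-up names; a real library has to provide tuning, adaptive proposals, burn-in, convergence diagnostics, and so on.

  import System.Random

  -- One Metropolis step: propose x' = x + eps with eps uniform in
  -- (-width, width) and accept with probability min(1, p x' / p x),
  -- where p is an unnormalized target density.
  step :: RandomGen g => (Double -> Double) -> Double -> Double -> g -> (Double, g)
  step p width x g0 = (if u < p x' / p x then x' else x, g2)
    where
      (eps, g1) = randomR (-width, width) g0
      x'        = x + eps
      (u, g2)   = randomR (0, 1) g1

  -- A chain of n samples started at x0.
  chain :: RandomGen g => Int -> (Double -> Double) -> Double -> Double -> g -> [Double]
  chain n p width x0 g0 = take n (go x0 g0)
    where
      go x g = let (x1, g') = step p width x g in x1 : go x1 g'

  -- Example: sample from a standard normal density (up to a constant).
  main :: IO ()
  main = print (chain 10 (\x -> exp (-x * x / 2)) 1.0 0.0 (mkStdGen 42))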
I need to plot my data (there is no superb standard plotting library available in Haskell). By now, I do maintain library packages providing answers to some of these problems, but it was (and is) a lot of work.
If you write packages that fulfill the needs of your applications, that's certainly better than writing impressive frameworks where no one knows whether they are usable at all. :-)
You are right, we want to avoid impressive but unusable frameworks. I was referring to the following question: "For a given goal, which features do I have to implement myself in order to achieve the goal?" With the existing set of Haskell libraries, I make the following observation too often (or too early in my stack of requirements): "This feature is not available (or not readily available), so I need to implement it myself."

For example, at the moment I am working on a library for estimating covariance matrices from sample data. This is not at all my area of expertise. In Python and R, well-maintained, state-of-the-art methods are readily available (e.g., shrinkage-based estimators such as the Ledoit-Wolf estimator or oracle approximating shrinkage, as well as estimators based on Gaussian graphical models such as the graphical lasso; there actually is a 'glasso' library on Hackage, but it seems unmaintained, lacks documentation, and misses the improvements from the last ten years of research).

I hope I made myself clear. I don't want to naysay. I really enjoy Haskell per se! I am a big fan.

Dominik

[1] https://mran.microsoft.com/snapshot/2015-02-27/web/packages/mcmc/index.html
[2] https://docs.pymc.io/
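P.S. To illustrate the kind of estimator I mean, here is a minimal sketch of a shrinkage estimator in the spirit of Ledoit-Wolf, written against hmatrix (my choice for the sketch, not necessarily what a real library would use): it shrinks the sample covariance matrix towards a scaled identity. The shrinkage intensity 'lambda' is passed in by hand here; estimating it optimally from the data is exactly the part a good library should provide.

  import Numeric.LinearAlgebra

  -- Sample covariance of a data matrix with one observation per row.
  sampleCov :: Matrix Double -> Matrix Double
  sampleCov x = scale (1 / (n - 1)) (tr xc <> xc)
    where
      n  = fromIntegral (rows x)
      mu = scale (1 / n) (konst 1 (rows x) <# x)  -- column means
      xc = x - fromRows (replicate (rows x) mu)   -- centered data

  -- Shrink S towards (average variance) * identity:
  -- (1 - lambda) * S + lambda * m * I.
  shrinkCov :: Double -> Matrix Double -> Matrix Double
  shrinkCov lambda x = scale (1 - lambda) s + scale (lambda * m) (ident p)
    where
      s = sampleCov x
      p = cols s
      m = sumElements (takeDiag s) / fromIntegral p

  -- Example use with a made-up 4x2 data matrix:
  --   shrinkCov 0.2 (fromLists [[1,2],[2,1],[3,0],[0,3]])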

I love Haskell, as probably does anybody who has finally understood "how to cook it". It took me three attempts to really get it, though, and my experience is quite similar to that of a lot of developers with an imperative background.

I've worked with large teams of developers both in enterprise environments and in some 20 startups I've invested in. I've tried to push Haskell on all of them, half-seriously, to try to understand why they wouldn't use it. I would say the most popular reason by far is the very steep learning curve for any imperative programmer. Not the libraries (even if we could use more for some specific domains), not the tooling (even though I can envision an IDE for Haskell that would make the development experience such a delight: imagine visualizing monad transformer stacks and function composition with types?!), not that the GUI story sucks (we really need a good GUI library), but the difficulty of learning the concepts well enough to become proficient. The somewhat arrogant approach of library authors to documentation, the whole "the type signature *is* the documentation" attitude, doesn't help either.

So, again, I love Haskell. I went down a difficult road understanding how it should be taught by mastering it myself, which actually led me to write a book on teaching Haskell from first principles, focusing on types, but gently and in pictures so that it's accessible: https://leanpub.com/magicalhaskell - for which I'd love feedback, by the way :)

But we need to teach FP much better in CS schools: there are a zillion courses on Python, and how many on Haskell? We have to write more tutorials and much friendlier documentation. We have to explain real-world patterns much better and with more examples; currently all Haskell docs and even examples are way too abstract, which is solid science but doesn't help with mass adoption. And then, yes, we need libraries and tooling.

Best wishes,
- A.
participants (4)
- Anton Antich
- Dominik Schrempf
- Henning Thielemann
- Tikhon Jelvis