
On Sat, Mar 22, 2014 at 8:58 PM, Bart Massey wrote:
* The lack of implicit conversions (except for the weird defaulting of literals), which means that I am constantly writing `fromIntegral` and `realToFrac` in places where there is only one reasonable choice of type conversion, and occasionally having things just malfunction because I didn't quite understand what these conversion functions would give me as a result.
Prelude> 3 + 3.5
6.5
Prelude> let x = 3
Prelude> x + 3.5
<interactive>:4:5:
    No instance for (Fractional Integer) arising from the literal `3.5'
    Possible fix: add an instance declaration for (Fractional Integer)
    In the second argument of `(+)', namely `3.5'
    In the expression: x + 3.5
    In an equation for `it': it = x + 3.5
Prelude>
I mean, seriously? We expect newbies to just roll with this kind of thing?
Actually, we don't expect them to roll with it any more.
Prelude> let x = 3
Prelude> x + 3.5
6.5
ghci turns on NoMonomorphismRestriction on newer GHCs. To me that issue is largely independent of changes to the numeric hierarchy, though I confess I'd largely tuned out this thread and am just now skimming backwards a bit. That said, implicit conversions are something where personally I feel Haskell does take the right approach. It is the only language I have access to where I can reason about the semantics of (+) sanely and extend the set of types.
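On older GHCs you could get the same result by hand, either with :set -XNoMonomorphismRestriction or by giving the binding a type signature, since bindings with signatures are exempt from the restriction:

Prelude> let x :: Num a => a; x = 3
Prelude> x + 3.5
6.5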
Even worse, the same sort of thing happens when trying to add a `Data.Word.Word` to an `Integer`. This is a totally safe conversion if you just let the result be `Integer`.
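Spelled out, the widening that has to be written by hand looks like this (a minimal sketch; `total` is a made-up name):

    import Data.Word (Word)

    w :: Word
    w = 5

    i :: Integer
    i = 7

    -- w + i is rejected outright: (+) :: Num a => a -> a -> a needs both
    -- operands at a single type. The (always lossless) widening from Word
    -- to Integer has to be spelled out:
    total :: Integer
    total = fromIntegral w + i   -- 12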
The problem arises when you allow users to extend the set of numeric types, as Haskell does. We have a richer menagerie of exotic numerical types than any other language, explicitly because of our adherence to a stricter discipline and moving the coercion down to the literal rather than up to every function application. Because of that, type inference works better: it can flow both forward and backward through (+), whereas the approach you advocate is strictly less powerful. You have to give up overloading of numeric literals, and in essence this gives up on the flexibility of the numerical tower to handle open sets of new numerical types. As someone who works with compensated arithmetic, automatic differentiation, arbitrary-precision floating point, interval arithmetic, Taylor models, and all sorts of other numerical types in Haskell, you're basically asking me to give up all the things that work in this language and go back to a Scheme-style fixed numerical tower.
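To make the literal-handling point concrete, here is a toy interval type (a sketch invented for illustration, not code from any of the libraries above). Because a literal elaborates through fromInteger, it participates in the new type without any conversion rules being baked into the language:

    -- A toy interval type, invented for illustration.
    data Interval = I Double Double deriving Show

    instance Num Interval where
      I a b + I c d  = I (a + c) (b + d)
      I a b * I c d  = I (minimum ps) (maximum ps)
        where ps = [a * c, a * d, b * c, b * d]
      negate (I a b) = I (negate b) (negate a)
      abs (I a b)    = I (max 0 (max a (negate b))) (max (abs a) (abs b))
      signum         = error "signum: not meaningful for intervals"
      fromInteger n  = I (fromInteger n) (fromInteger n)

    -- The literal 1 here is just fromInteger 1 at type Interval, so the
    -- numeric syntax extends to the new type with no implicit conversions:
    example :: Interval
    example = 1 + I 2 3   -- I 3.0 4.0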
* The inability of Haskell to handle unary negation sanely, which means that I and newbies alike are constantly having to figure things out and parenthesize. From my observations of students, this is a huge barrier to Haskell adoption: people who can't write 3 + -5 just give up on a language. (I love the current error message here, "cannot mix `+' [infixl 6] and prefix `-' [infixl 6] in the same infix expression", which is about as self-diagnosing of a language failing as any error message I've ever seen.)
That is probably fixable by getting creative in the language grammar. I note it particularly because our Haskell-like language Ermine here at work gets it right. ;)
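In the meantime the workarounds are parentheses or negate:

Prelude> 3 + (-5)
-2
Prelude> 3 + negate 5
-2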
* The multiplicity of exponentiation functions, one of which looks exactly like C's XOR operator, which I've watched trip up newbies a bunch of times. (Indeed, NumericPrelude seems to have added more of these, including the IMHO poorly-named (^-) which has nothing to do with numeric negation as far as I can tell. See "unary negation" above.)
It is unfortunate, but there really is a distinction being made.
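The standard Prelude types make the different contracts explicit:

    (^)  :: (Num a, Integral b)        => a -> b -> a  -- non-negative integral exponent
    (^^) :: (Fractional a, Integral b) => a -> b -> a  -- integral exponent, possibly negative
    (**) :: Floating a                 => a -> a -> a  -- arbitrary floating-point exponent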
* The incredible awkwardness of hex/octal/binary input handling, which requires digging a function with an odd and awkward return convention (`readHex`) out of an oddly-chosen module (or rolling my own) in order to read a hex value. (Output would be just as bad except for `Text.Printf` as a safety hatch.) Lord knows what you're supposed to do if your number might have a C-style base specifier on the front, other than the obvious ugly brute-force thing?
A lot of people these days turn to lens for that:
ghci> :m + Numeric.Lens Control.Lens
ghci> "7b" ^? hex
Just 123
ghci> hex # 123
"7b"
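For comparison, the readHex convention being complained about returns a list of (value, remaining input) pairs:

ghci> import Numeric (readHex)
ghci> readHex "7b" :: [(Int, String)]
[(123,"")]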
* Defaulting numeric types with `-Wall` on, producing scary warnings.
Prelude> 3 + 3
<interactive>:2:3: Warning:
    Defaulting the following constraint(s) to type `Integer'
      (Num a0) arising from a use of `+'
    In the expression: 3 + 3
    In an equation for `it': it = 3 + 3

<interactive>:2:3: Warning:
    Defaulting the following constraint(s) to type `Integer'
      (Num a0) arising from a use of `+' at <interactive>:2:3
      (Show a0) arising from a use of `print' at <interactive>:2:1-5
    In the expression: 3 + 3
    In an equation for `it': it = 3 + 3
6
and similarly for 3.0 + 3.0. If you can't even write simple addition without turning off or ignoring warnings, well, I dunno. Something. Oh, and try to get rid of those warnings. The only ways I know are `3 + 3 :: Integer` or `(3 :: Integer) + 3`, both of which make code read like a bad joke.
This is no longer a problem in ghci due to NoMonomorphismRestriction:
Prelude> 3 + 3
6
You can of course use

:set -Wall -fno-warn-type-defaults

instead of -Wall for cases like `3 == 3` where the type doesn't get picked.
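The same flag also works per module via an OPTIONS_GHC pragma, e.g. (a minimal sketch):

    {-# OPTIONS_GHC -Wall -fno-warn-type-defaults #-}
    module Main where

    -- With the flag in place, the Integer defaulting of these literals no
    -- longer triggers a warning under -Wall.
    main :: IO ()
    main = print (3 == 3)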
-Edward