* The lack of implicit conversions (except for the weird defaulting of
literals), which means that I am constantly writing `fromIntegral` and
`realToFrac` in places where there is only one reasonable choice of
type conversion, and occasionally having things just malfunction
because I didn't quite understand what these conversion functions
would give me as a result. For example:
Prelude> 3 + 3.5
6.5
Prelude> let x = 3
Prelude> x + 3.5
<interactive>:4:5:
No instance for (Fractional Integer) arising from the literal `3.5'
Possible fix: add an instance declaration for (Fractional Integer)
In the second argument of `(+)', namely `3.5'
In the expression: x + 3.5
In an equation for `it': it = x + 3.5
Prelude>
I mean, seriously? We expect newbies to just roll with this kind of thing?
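The workaround, of course, is to write the conversion out by hand --
which is exactly the `fromIntegral` noise I was complaining about:
Prelude> let x = 3
Prelude> fromIntegral x + 3.5
6.5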
ghci turns on NoMonomorphismRestriction on newer GHCs.
To me that issue is largely independent of changes to the numeric hierarchy, though I confess I'd largely tuned out this thread and am just now skimming backwards a bit.
That said, implicit conversions are an area where I personally feel Haskell does take the right approach. It is the only language I have access to where I can reason about the semantics of (+) sanely and extend the set of types it works over.
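For example, here is a toy sketch with a made-up Vec2 type; the point is just that (+) stays at one predictable type and nothing is silently coerced:
-- A made-up 2-D vector type: (+) here is Vec2 -> Vec2 -> Vec2, nothing more.
data Vec2 = Vec2 Double Double deriving Show

instance Num Vec2 where
  Vec2 a b + Vec2 c d = Vec2 (a + c) (b + d)
  Vec2 a b * Vec2 c d = Vec2 (a * c) (b * d)
  negate (Vec2 a b)   = Vec2 (negate a) (negate b)
  abs    (Vec2 a b)   = Vec2 (abs a) (abs b)
  signum (Vec2 a b)   = Vec2 (signum a) (signum b)
  fromInteger n       = Vec2 (fromInteger n) (fromInteger n)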
Even worse, the same sort of thing happens when trying to add a
`Data.Word.Word` to an `Integer`. This is a totally safe conversion if
you just let the result be `Integer`.
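A minimal sketch of what I mean (variable names made up; the commented-out line is the one GHC rejects):
import Data.Word (Word)

w :: Word
w = 5

n :: Integer
n = 2 ^ (70 :: Int)       -- far too big for a 64-bit Word

-- bad = w + n            -- rejected: Word and Integer don't unify
ok :: Integer
ok = fromIntegral w + n   -- widening Word -> Integer is always lossless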
* The inability of Haskell to handle unary negation sanely, which
means that I and newbies alike are constantly having to figure things
out and parenthesize. From my observations of students, this is a huge
barrier to Haskell adoption: people who can't write 3 + -5 just give
up on a language. (I love the current error message here, "cannot mix
`+' [infixl 6] and prefix `-' [infixl 6] in the same infix
expression", which is about as self-diagnosing of a language failing
as any error message I've ever seen.)
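The fix is trivial once you see it, though in my experience nobody guesses it on their own:
Prelude> 3 + (-5)
-2
Prelude> 3 + negate 5
-2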
* The multiplicity of exponentiation functions, one of which looks
exactly like C's XOR operator, which I've watched trip up newbies a
bunch of times. (Indeed, NumericPrelude seems to have added more of
these, including the IMHO poorly-named (^-) which has nothing to do
with numeric negation as far as I can tell. See "unary negation"
above.)
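For anyone keeping score, the three Prelude operators are (^) for non-negative integral exponents, (^^) for integral exponents over Fractional bases, and (**) for Floating bases and exponents:
Prelude> 2 ^ 10
1024
Prelude> 2 ^^ (-1)
0.5
Prelude> 2 ** 0.5
1.4142135623730951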
* The incredible awkwardness of hex/octal/binary input handling, which
requires digging a function with an odd and awkward return convention
(`readHex`) out of an oddly-chosen module (or rolling my own) in order
to read a hex value. (Output would be just as bad except for
`Text.Printf` as a safety hatch.) Lord knows what you're supposed to
do if your number might have a C-style base specifier on the front,
other than the obvious ugly brute-force thing.
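For the record, `readHex` lives in `Numeric` and returns a list of
(value, leftover-input) pairs, so the hand-rolled wrapper looks
something like this (the name `parseHex` is mine):
import Numeric (readHex)

parseHex :: String -> Maybe Integer
parseHex s = case readHex s of
               [(n, "")] -> Just n     -- whole string was hex digits
               _         -> Nothing    -- no parse, or trailing junk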
* The scary warnings produced by numeric type defaulting when "-Wall" is on.
Prelude> 3 + 3
<interactive>:2:3: Warning:
Defaulting the following constraint(s) to type `Integer'
(Num a0) arising from a use of `+'
In the expression: 3 + 3
In an equation for `it': it = 3 + 3
<interactive>:2:3: Warning:
Defaulting the following constraint(s) to type `Integer'
(Num a0) arising from a use of `+' at <interactive>:2:3
(Show a0) arising from a use of `print' at <interactive>:2:1-5
In the expression: 3 + 3
In an equation for `it': it = 3 + 3
6
and similarly for 3.0 + 3.0. If you can't even write simple addition
without turning off or ignoring warnings, well, I dunno. Something.
Oh, and try to get rid of those warnings. The only ways I know are `3
+ 3 :: Integer` or `(3 :: Integer) + 3`, both of which make code read
like a bad joke.
>>> :set -Wall -fno-warn-type-defaults
>>> 3 + 3
6
>>> 3 == 3
True