
On 2017-09-21 at 14:24:14 +0200, Niklas Hambüchen wrote: [...]
> I argue that `fromIntegral` causes unnoticed data corruption without any warning, and thus is worse than partial functions.
[...]
> So I propose that we add a deprecation pragma to `fromIntegral`
Tbh, given how ubiquitous the use of `fromIntegral` is throughout the Haskell ecosystem, having `fromIntegral` suddenly emit warnings when `-Wall` is active is not realistic IMHO. You'd have to introduce a separate (opt-in) category of WARNING pragmas for that (something some of us would also like to see for the category of partial functions), which isn't implied by `-Wall`.
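(For context, the pragma mechanism under discussion looks roughly like the sketch below; the module and function names are made up purely for illustration. The point is that such a warning fires at every use site of the annotated name, with no finer-grained opt-in category:)

    module Legacy (oldConvert) where

    -- Attaching a DEPRECATED (or WARNING) pragma to a top-level name makes
    -- every use site in importing modules emit a diagnostic with this message.
    -- 'Legacy' and 'oldConvert' are hypothetical names, not a real proposal.
    {-# DEPRECATED oldConvert "Prefer a bounds-checked conversion" #-}
    oldConvert :: Integer -> Integer
    oldConvert = id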
> (however, keeping it forever, like Java does, as there's no benefit to removing it; we just want people to use safer functions over time), and instead provide a `safeFromIntegral` that can only widen ranges, and one or more `unsafeFromIntegral` or appropriately named functions that can error, wrap, return Maybe, or have whatever desired behaviour when the input value doesn't fit into the output type.
[...]
> It avoids real problems in real production systems that are difficult to debug when they happen (who writes a unit test checking that a timeout of 50 days still works? This is the kind of stuff where you have to rely on -- very basic -- types).
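To make that failure mode concrete, here is a minimal sketch: a 50-day timeout expressed in microseconds no longer fits into 32 bits, so pushing it through `fromIntegral` towards an Int32-taking API silently wraps it into a negative value (the Int32 target here is just an assumed stand-in for whatever the underlying API actually uses):

    import Data.Int (Int32, Int64)

    -- 50 days in microseconds: 4,320,000,000,000, well outside the
    -- Int32 range of roughly +/- 2.1e9.
    timeoutMicros :: Int64
    timeoutMicros = 50 * 24 * 60 * 60 * 1000 * 1000

    main :: IO ()
    main = do
      print timeoutMicros                          -- 4320000000000
      print (fromIntegral timeoutMicros :: Int32)  -- -737099776: silently wrapped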
I ran into this very problem early on. I'll use the opportunity to shamelessly plug an older package of mine that probably only a few people know about, which solved this very problem for me in mission-critical systems while avoiding having to enumerate O(n^2) conversions explicitly: http://hackage.haskell.org/package/int-cast-0.1.2.0/docs/Data-IntCast.html But it isn't plain Haskell 2010...

Another related package which may be useful in this context is http://hackage.haskell.org/package/safeint which helps avoid silent integer overflows during arithmetic operations.
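For illustration, a rough usage sketch of Data.IntCast (assuming the `intCast` and `intCastMaybe` entry points as documented): `intCast` only type-checks for conversions that are guaranteed to fit, i.e. widening, while `intCastMaybe` covers the remaining cases by checking at runtime and returning Nothing on overflow.

    import Data.Int     (Int32, Int64)
    import Data.IntCast (intCast, intCastMaybe)

    -- Widening Int32 -> Int64 always fits, so the statically checked
    -- intCast is accepted by the type checker here.
    widened :: Int64
    widened = intCast (maxBound :: Int32)

    -- Narrowing Int64 -> Int32 is rejected by intCast at compile time;
    -- the runtime-checked variant returns Nothing when the value
    -- (e.g. the 50-day timeout from above) doesn't fit.
    narrowed :: Maybe Int32
    narrowed = intCastMaybe (4320000000000 :: Int64)

    main :: IO ()
    main = print (widened, narrowed)  -- prints (2147483647,Nothing)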