
There's one part of this alternative proposal I don't understand:
On Mon, Apr 21, 2014 at 5:04 AM, Edward Kmett wrote:
* If you can compile sans warnings you have nothing to fear. If you do get warnings, you can know precisely what types will have degraded back to the old precision at *compile* time, not runtime.
I don't understand the mechanism by which this happens (maybe I'm misunderstanding the MINIMAL pragma?). If a module has e.g.
import DodgyFloat (DodgyFloat) -- defined in a 3rd-party package, doesn't implement log1p etc.
x = log1p 1e-10 :: DodgyFloat
I don't understand why this would generate a warning (i.e. I don't believe it will generate a warning). So the user is in the same situation as with the original proposal. John L.
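For context, the warning mechanism being debated is the {-# MINIMAL #-} pragma. The sketch below is a toy illustration, not the actual proposal: the class, method, and module names are made up, and the default shown is only a stand-in. It shows where GHC actually reports the missing-method warning.

module ThirdParty where  -- stands in for the package that defines DodgyFloat

-- Toy stand-in for a Floating class extended with log1p.
class MyFloating a where
  myLog   :: a -> a
  myLog1p :: a -> a
  myLog1p x = myLog x  -- placeholder default; the real one would be log (1 + x)
  {-# MINIMAL myLog, myLog1p #-}

newtype DodgyFloat = DodgyFloat Double

-- The MINIMAL warning ("No explicit implementation for 'myLog1p'") is emitted
-- here, when this third-party module is compiled -- not in a downstream module
-- that merely calls myLog1p at type DodgyFloat.
instance MyFloating DodgyFloat where
  myLog (DodgyFloat x) = DodgyFloat (log x)

In other words, the warning shows up when the instance is compiled in the third-party package, which is the situation the message above is describing.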
On Mon, Apr 21, 2014 at 5:24 AM, Aleksey Khudyakov <alexey.skladnoy@gmail.com> wrote:
On 21 April 2014 09:38, John Lato wrote:
I was just wondering, why not simply use numerically robust algorithms as defaults for these functions? No crashes, no errors, no loss of precision, everything would just work. They aren't particularly complicated, so the performance should even be reasonable.
I think it's the best option. log1p and expm1 come with guarantees about precision. A log(1+p) default makes it impossible to depend on such guarantees. They will silently give wrong answers.
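For concreteness, the "numerically robust algorithms" being suggested are along the lines of the well-known Kahan-style formulations sketched below. This is a sketch rather than text from the thread; the module name, type signatures, and the Eq constraint are my own choices.

module RobustDefaults where

-- Robust log1p: the branch guards against the case where 1 + x rounds to 1,
-- and the log u / (u - 1) factor corrects the rounding error in 1 + x.
log1p :: (Floating a, Eq a) => a -> a
log1p x
  | u == 1    = x
  | otherwise = log u * (x / (u - 1))
  where u = 1 + x

-- Robust expm1, using the same idea with exp.
expm1 :: (Floating a, Eq a) => a -> a
expm1 x
  | u == 1      = x       -- exp x rounded to 1, so exp x - 1 ~= x
  | u - 1 == -1 = -1      -- exp x underflowed to 0
  | otherwise   = (u - 1) * (x / log u)
  where u = exp x

Unlike a naive log (1 + x) at Double, these avoid the catastrophic cancellation for small arguments (for example, x = 1e-10 as in the snippet earlier in the thread), which is the precision guarantee referred to above.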