
On Thu, Apr 24, 2014 at 10:17 AM, Casey McCann wrote:
> For default implementations, I would prefer the idea suggested earlier of a
> (probably slower) algorithm that does preserve precision rather than a naive
> version, if that's feasible. This lives up to the claimed benefit of higher
> precision, and as a general rule I feel that any pitfalls left for the unwary
> should at worst provide the correct result more slowly.
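To make the tradeoff concrete, here is a minimal sketch of what a precision-preserving default could look like. The class and method names (VectorLike, toDoubles, vsum) are invented for illustration, since this message doesn't spell out the actual API under discussion. The default uses Kahan compensated summation, which is slower than a naive fold but tracks the low-order bits lost at each addition:

{-# LANGUAGE BangPatterns #-}

-- Hypothetical class; the real proposal's API may differ.
class VectorLike v where
  toDoubles :: v -> [Double]

  -- A naive default would be:  vsum = sum . toDoubles
  -- Instead, default to the slower but more accurate Kahan sum,
  -- which instances can override with a faster specialized version.
  vsum :: v -> Double
  vsum = kahanSum . toDoubles

-- Kahan compensated summation: c accumulates the rounding error
-- lost from s at each step and feeds it back into the next addition.
kahanSum :: [Double] -> Double
kahanSum = go 0 0
  where
    go !s !_ []     = s
    go !s !c (x:xs) =
      let y  = x - c        -- apply the stored compensation
          t  = s + y        -- s is big, y small: low bits of y are lost
          c' = (t - s) - y  -- recover the lost low-order bits
      in go t c' xs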
Now I'm wondering about a low-rent solution: Debug.Trace around the default, so users of the default get a runtime warning about the loss of precision. (Since you can't always resolve the instance at compile time, it can't be done with a compile-time pragma, which would otherwise be my preferred solution.)

--
brandon s allbery kf8nh
sine nomine associates
allbery.b@gmail.com
ballbery@sinenomine.net
unix, openafs, kerberos, infrastructure, xmonad
http://sinenomine.net
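As a rough sketch of that low-rent approach, reusing the same hypothetical VectorLike class as above: trace from Debug.Trace (in base) wraps the naive default, so any instance that doesn't override the method prints a warning to stderr whenever a result computed via the default is forced. A compile-time {-# WARNING #-} pragma on the method wouldn't distinguish here, since it fires at every use site whether or not the instance actually relies on the default, which is presumably the resolution problem being referred to.

import Debug.Trace (trace)

class VectorLike v where
  toDoubles :: v -> [Double]

  -- Naive default wrapped in trace: evaluating a result that went
  -- through this default emits a runtime warning on stderr.
  vsum :: v -> Double
  vsum = trace "WARNING: naive default vsum in use; precision may be lost"
       . sum . toDoubles

Note that trace fires on every evaluation of the default, not once per program run, which is probably acceptable for a debugging aid like this.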