
On Wed, Aug 30, 2006 at 06:04:47PM -0400, Lennart Augustsson wrote:
On Aug 30, 2006, at 14:58 , David Roundy wrote:
The trouble here is that ghci is printing more digits than it really ought to be printing.
No, I don't think it is. Ghci is printing the number that, of all numbers in decimal notation, is closest to the Double in question (i.e., 0.1+0.2). Printing it with fewer decimals would yield a different number if it were read back.
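[As an illustration of the round-trip behaviour being described here, a minimal sketch, assuming IEEE 754 double precision and GHC's standard Show/Read instances; the digits in the comments are what GHCi prints on such a system:

-- Round-tripping 0.1 + 0.2 through show and read.
main :: IO ()
main = do
  let x = 0.1 + 0.2 :: Double
  print x                      -- 0.30000000000000004
  print (x == 0.3)             -- False: the sum is not the Double nearest to 0.3
  print (read (show x) == x)   -- True: show emits enough digits to read back exactly
]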
Then I guess the problem is that the output of show isn't appropriate for human interaction? In many cases it's nice for (read . show) to be the identity, but I'd rather it weren't for floating point numbers. If I want a bitwise-accurate copy of a Double, I'll dump a binary. When I output decimal, I generally want something friendly to the human who's reading it, which to me means outputting only significant digits (which is admittedly an ill-defined concept).
-- David Roundy
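[For the kind of reader-friendly decimal output being asked for here, one common route, offered as a sketch of a general technique rather than anything proposed in the thread, is to format explicitly with Numeric.showFFloat or Text.Printf and choose the precision yourself:

import Numeric (showFFloat)
import Text.Printf (printf)

-- Format a Double with a fixed number of decimal places instead of relying on show.
friendly :: Int -> Double -> String
friendly digits x = showFFloat (Just digits) x ""

main :: IO ()
main = do
  let x = 0.1 + 0.2 :: Double
  putStrLn (friendly 2 x)   -- "0.30"
  printf "%.2f\n" x         -- also prints 0.30

Note that this fixes the number of decimal places rather than significant digits; choosing the "significant" digits automatically is exactly the ill-defined part mentioned above.]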