
#9304: Floating point woes; different behavior on Mac vs Linux
------------------------------------+-------------------------------------
        Reporter:  lerkok           |             Owner:
            Type:  bug              |            Status:  new
        Priority:  high             |         Milestone:
       Component:  Compiler         |           Version:  7.8.3
        Keywords:  floating point   |  Operating System:  Unknown/Multiple
    Architecture:  Unknown/Multiple |   Type of failure:  None/Unknown
      Difficulty:  Unknown          |         Test Case:
      Blocked By:                   |          Blocking:
 Related Tickets:                   |
------------------------------------+-------------------------------------
 I have the following snippet:

 {{{
 x, y, r :: Double
 x = -4.4
 y = 2.4999999999999956
 r = x * y
 }}}

 Using GHC 7.8.3 on a Mac, I get the following response from ghci:

 {{{
 *Main> decodeFloat r
 (-6192449487634421,-49)
 }}}

 Using GHC 7.8.3 on a Linux machine, I get the following response:

 {{{
 *Main> decodeFloat r
 (-6192449487634422,-49)
 }}}

 Note the off-by-one difference in the first component of the output.

 I'm not 100% sure which one is actually correct, but the point is that
 these are IEEE floating-point numbers running on the same architecture
 (Intel x86), and thus should decode in precisely the same way.

 While I observed this with 7.8.3, I don't think this is a new regression;
 I suspect it will show up in older versions as well.

 Also, for full disclosure: I ran the Mac version natively, but the Linux
 version on top of a VirtualBox image. I very much doubt that would make a
 difference, but there might be 32/64-bit concerns. So if someone can
 validate the Linux output on a true 64-bit Linux machine, that would
 really help track down the issue further.

--
Ticket URL: http://ghc.haskell.org/trac/ghc/ticket/9304
GHC http://www.haskell.org/ghc/
The Glasgow Haskell Compiler
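
A follow-up diagnostic (not part of the original report): one way to decide which platform's answer is correct is to compute the product of the two `Double` values exactly in `Rational` arithmetic (where `toRational` on a `Double` is exact) and round it back to `Double` with `fromRational`, which GHC implements as correctly rounded. Whichever platform's `decodeFloat` output matches this reference is the properly rounded IEEE result. A minimal sketch:

{{{
-- Diagnostic sketch, not from the ticket itself.
x, y, r :: Double
x = -4.4
y = 2.4999999999999956
r = x * y

-- toRational on a Double is exact, so this is the true mathematical
-- product of the two floating-point values.
exact :: Rational
exact = toRational x * toRational y

main :: IO ()
main = do
  print (decodeFloat r)                               -- platform-dependent
  print (decodeFloat (fromRational exact :: Double))  -- reference value
  -- Sanity check: decodeFloat/encodeFloat must round-trip exactly.
  print (uncurry encodeFloat (decodeFloat r) == r)
}}}

If the two printed decodings differ, the hardware multiply on that platform was not correctly rounded (e.g. due to x87 extended-precision double rounding on 32-bit targets), rather than `decodeFloat` itself misbehaving.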