
On Tue, Apr 17, 2007 at 10:32:02AM -0700, David Roundy wrote:
> I'm wondering what exactly inspired the decode/encodeFloat implementation
I kind of wondered the same thing when I first saw it. Looks like it was just the quickest way to get it going.
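I don't have the source in front of me, but presumably it just serialises the (Integer, Int) pair that decodeFloat hands back and rebuilds it with encodeFloat on the way in; something like the sketch below (wrapped in a made-up newtype so it compiles alongside the real instance). Portable, but it goes through Integer and doesn't give you a fixed-width external format.

import Data.Binary (Binary(..))

newtype ViaDecodeFloat = ViaDecodeFloat Double

instance Binary ViaDecodeFloat where
    -- decodeFloat gives back (mantissa :: Integer, exponent :: Int)
    put (ViaDecodeFloat d) = put (decodeFloat d)
    get = do (m, e) <- get
             return (ViaDecodeFloat (encodeFloat m e))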
> Are there any suggestions how I could use Data.Binary to actually read a binary file full of Doubles? Should I just use the Array interface, and forget laziness and hopes of handling different-endian machines? Or is there some way to reasonably do this using Data.Binary?
I threw together a somewhat portable "longBitsToDouble" function a while ago for another project:

http://darcs.brianweb.net/hsutils/src/Brianweb/Data/Float.lhs

It doesn't depend on any unsafe operations or external FFI functions, but it will only work on IEEE 754 machines (though that includes every machine GHC runs on). It might not be fast enough for you, though, as it still goes via Integer in the conversion.

-Brian
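P.S. In case a sketch helps: the via-Integer trick is basically to pull the sign, exponent, and fraction fields out of the Word64 and feed them to encodeFloat, and Data.Binary's getWord64be handles reading big-endian words regardless of the host's endianness. This isn't the code at that URL, just the general shape of the idea (word64ToDouble, getDoubles, and readDoubles are made-up names):

import Control.Monad (replicateM)
import Data.Binary.Get (Get, getWord64be, runGet)
import Data.Bits ((.&.), shiftR, testBit)
import Data.Word (Word64)
import qualified Data.ByteString.Lazy as L

-- Reassemble a Double from an IEEE 754 bit pattern, going via Integer.
word64ToDouble :: Word64 -> Double
word64ToDouble w
  | e == 0x7ff = if f == 0 then inf else nan        -- infinities and NaNs
  | e == 0     = sign (encodeFloat f (-1074))       -- zero and subnormals
  | otherwise  = sign (encodeFloat (f + bit52) (fromIntegral e - 1075))
  where
    negative = testBit w 63                         -- sign bit
    e        = (w `shiftR` 52) .&. 0x7ff            -- 11-bit biased exponent
    f        = toInteger (w .&. 0xfffffffffffff)    -- 52-bit fraction
    bit52    = 2 ^ (52 :: Int)                      -- implicit leading significand bit
    sign x   = if negative then negate x else x
    inf      = sign (1 / 0)
    nan      = 0 / 0

-- Read n big-endian Doubles with Data.Binary's Get monad.
getDoubles :: Int -> Get [Double]
getDoubles n = map word64ToDouble `fmap` replicateM n getWord64be

readDoubles :: FilePath -> Int -> IO [Double]
readDoubles path n = runGet (getDoubles n) `fmap` L.readFile path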