
On a related matter, I am using Data.Binary to serialise data from
Haskell for use from other languages. The Data.Binary encoding of a
Double is a long integer for the mantissa and an int for the exponent.
This doesn't work well for interacting with other languages, as I'd
need an arbitrary-precision int type there to decode/encode. The CORBA
CDR standard encodes doubles in a big-endian fashion like this (excuse
my possibly incorrect ASCII art):
| byte | msb                   lsb |
|------+---------------------------|
|    0 | S   E6  ...  E0           |
|    1 | E10 E9  E8  E7 F3 F2 F1 F0|
|    2 | F11 ...  F4               |
|    3 | F19 ... F12               |
|    4 | F27 ... F20               |
|    5 | F35 ... F28               |
|    6 | F43 ... F36               |
|    7 | F51 ... F44               |
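(For context, here is the sort of workaround I've been sketching on my
side. It assumes the host represents Double in IEEE 754 format, and
uses Foreign.Storable to reinterpret the bits in place rather than
calling into C; the names doubleToBits and doubleToBytesBE are just my
own.)

```haskell
import Data.Bits (shiftR)
import Data.Word (Word8, Word64)
import Foreign.Marshal.Alloc (alloca)
import Foreign.Ptr (castPtr)
import Foreign.Storable (peek, poke)
import System.IO.Unsafe (unsafePerformIO)

-- Reinterpret the bits of a Double as a Word64 by writing the Double
-- into a temporary buffer and reading the same memory back as a Word64.
-- Assumes the host stores Double in IEEE 754 double format.
doubleToBits :: Double -> Word64
doubleToBits d = unsafePerformIO $ alloca $ \p -> do
  poke p d
  peek (castPtr p)

-- Emit the eight bytes most-significant-byte first (big-endian),
-- independent of the host's byte order, matching the CDR wire layout.
doubleToBytesBE :: Double -> [Word8]
doubleToBytesBE d = [ fromIntegral (w `shiftR` s) | s <- [56,48..0] ]
  where
    w = doubleToBits d
```

The resulting Word64 could then be written with Data.Binary's
putWord64be instead of serialising the decodeFloat pair.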
Up until now, my code has been pure Haskell. Is it possible to get at
the internal bits of a Double/CDouble in GHC? Or should I use the FFI
and write C to encode something like the above?
Tim
________________________________
From: haskell-cafe-bounces@haskell.org
[mailto:haskell-cafe-bounces@haskell.org] On Behalf Of David Leimbach
Sent: Friday, 15 May 2009 1:58 PM
To: Don Stewart
Cc: Haskell Cafe
Subject: Re: [Haskell-cafe] Data.Binary and little endian encoding
On Thu, May 14, 2009 at 8:54 PM, Don Stewart