
Hello. I am a Computer Science student attempting to write an emulator using Haskell. One of my main design choices is how to deal with machine code. Clearly it is possible to represent 0s and 1s as ASCII characters, but it strikes me that it would be much nicer to do the I/O using raw binary. I don't seem to be able to find much documentation on this. Does anybody know how it's done, or can you point me in the direction of some resources?

Many thanks,
Daniel

On Mon, Apr 02, 2007 at 03:26:05PM +0100, Daniel Brownridge wrote:
Hello.
I am a Computer Science student attempting to write an emulator using Haskell. One of my main design choices is how to deal with machine code. Clearly it is possible to represent 0s and 1s as ASCII characters, but it strikes me that it would be much nicer to do the I/O using raw binary. I don't seem to be able to find much documentation on this. Does anybody know how it's done, or can you point me in the direction of some resources?
The current Big Name in Haskell's binary support is the aptly named 'binary' library, available from HackageDB (http://hackage.haskell.org/cgi-bin/hackage-scripts/package/binary-0.3). binary works using two sets of functions: one for very efficiently building binary bytestrings (in the difference-list style of [Char] -> [Char]):

  data Builder  -- abstract
  empty, append                        -- monoid ops
  singleton        :: Word8  -> Builder
  putWord16be      :: Word16 -> Builder
  ...
  toLazyByteString :: Builder -> ByteString

and a monad for parsing binary data:

  data Get a  -- abstract
  getWord8    :: Get Word8
  getWord16be :: Get Word16
  ...
  runGet :: Get a -> ByteString -> a

(There is also a higher-level interface patterned on Read/Show, but I don't think that's applicable here.)

Stefan
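For illustration, here is a minimal round-trip sketch using those two interfaces; the instruction layout (an 8-bit opcode followed by a 16-bit big-endian operand) is made up purely for the example and is not taken from any real machine:

  -- Sketch only: encode and decode a small instruction word with the
  -- 'binary' package's Builder and Get interfaces.
  import Data.Binary.Builder (singleton, putWord16be, append, toLazyByteString)
  import Data.Binary.Get (getWord8, getWord16be, runGet)
  import qualified Data.ByteString.Lazy as L
  import Data.Word (Word8, Word16)

  -- Build: an 8-bit opcode followed by a 16-bit big-endian operand.
  encodeInstr :: Word8 -> Word16 -> L.ByteString
  encodeInstr op arg = toLazyByteString (singleton op `append` putWord16be arg)

  -- Parse it back in the Get monad.
  decodeInstr :: L.ByteString -> (Word8, Word16)
  decodeInstr = runGet $ do
    op  <- getWord8
    arg <- getWord16be
    return (op, arg)

Note that runGet will fail with an error on truncated input, so a real decoder loop would want to keep track of how many bytes remain.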

On 02/04/2007, at 16:26, Daniel Brownridge wrote:
Hello.
I am a Computer Science student attempting to write an emulator using Haskell. One of my main design choices is how to deal with machine code. Clearly it is possible to represent 0s and 1s as ASCII characters, but it strikes me that it would be much nicer to do the I/O using raw binary. I don't seem to be able to find much documentation on this. Does anybody know how it's done, or can you point me in the direction of some resources?
IMHO, just read the file directly into an Array and work with that. You probably want to look at the OmegaGB Game Boy emulator project for examples:
http://www.mutantlemon.com/omegagb

The code for loading ROM images into an Array of Words:
http://darcs.mutantlemon.com/omegagb/src/RomImage.hs

After that, opcodes are easily parsed by pattern matching on the hexadecimal values; see, e.g., the mcti function in the Cpu module:
http://darcs.mutantlemon.com/omegagb/src/Cpu.hs

It would be nicer to write a Data.Binary instance for the Instruction datatype and use that to do the parsing, but I don't think loading ROM files is a major speed concern here.

Another interesting resource you may want to look at for your emulator code is the set of ICFPC'06 Universal Machine implementations. Don Stewart has a page with a few highly performant implementations (and there are benchmarks too, yay!):
http://www.cse.unsw.edu.au/~dons/um.html

Cheers
pepe
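A minimal sketch of that approach, assuming a toy Instruction type and a couple of Game Boy opcode values chosen only for illustration (this is not OmegaGB's actual code):

  import qualified Data.ByteString as B
  import Data.Array (Array, listArray, (!))
  import Data.Word (Word8)

  -- Illustrative instruction type; a real emulator's will be much larger.
  data Instruction = Nop | Halt | LoadImmA Word8 | Unknown Word8
    deriving Show

  -- Read a ROM image into an immutable array of bytes, indexed from 0.
  loadRomImage :: FilePath -> IO (Array Int Word8)
  loadRomImage path = do
    bytes <- B.readFile path
    return (listArray (0, B.length bytes - 1) (B.unpack bytes))

  -- Decode the opcode at a given address by pattern matching on hex values.
  decodeAt :: Array Int Word8 -> Int -> Instruction
  decodeAt rom pc = case rom ! pc of
    0x00 -> Nop
    0x76 -> Halt
    0x3E -> LoadImmA (rom ! (pc + 1))  -- Game Boy: LD A, n
    op   -> Unknown op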

Hello Daniel,

Monday, April 2, 2007, 6:26:05 PM, you wrote:
however it strikes me that it would be much nicer to do the I/O using raw binary. I don't seem to be able to find much documentation on this.
It's our secret weapon ;)

http://haskell.org/haskellwiki/Library/Streams
http://haskell.org/haskellwiki/Library/AltBinary

I'm biased, though, because it's my creation ;) The Binary library and hGetArray/hPutArray/hGetBuf/hPutBuf are available, too.

--
Best regards,
Bulat                          mailto:Bulat.Ziganshin@gmail.com
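As a rough sketch of the hGetArray route, this reads a whole file into a mutable unboxed array of bytes; the readBinaryFile helper and its shape are assumptions made for the example, not part of any library:

  import System.IO (withBinaryFile, IOMode(ReadMode), hFileSize)
  import Data.Array.IO (IOUArray, newArray, hGetArray)
  import Data.Word (Word8)

  -- Slurp a file's raw bytes into an IOUArray, returning the array and
  -- the number of bytes actually read.
  readBinaryFile :: FilePath -> IO (IOUArray Int Word8, Int)
  readBinaryFile path =
    withBinaryFile path ReadMode $ \h -> do
      size <- fromIntegral <$> hFileSize h
      arr  <- newArray (0, size - 1) 0   -- zero-initialised buffer
      n    <- hGetArray h arr size       -- fill it from the handle
      return (arr, n)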
participants (4)

- Bulat Ziganshin
- Daniel Brownridge
- Pepe Iborra
- Stefan O'Rear