
On 02/04/2007, at 16:26, Daniel Brownridge wrote:
Hello.
I am a Computer Science student attempting to write an emulator in Haskell. One of my main design choices is how to deal with machine code. Clearly it is possible to represent 0's and 1's as ASCII characters, but it strikes me that it would be much nicer to do the I/O using raw binary. I don't seem to be able to find much documentation on this. Does anybody know how it's done, or can you point me in the direction of some resources?
IMHO, just read the file directly into an Array and work with that. You may want to look at the OmegaGB Game Boy emulator project for examples:

http://www.mutantlemon.com/omegagb

The code for loading ROM images into an Array of Words is here:

http://darcs.mutantlemon.com/omegagb/src/RomImage.hs

After that, opcodes are easily parsed by pattern matching on their hexadecimal values; see, e.g., the mcti function in the Cpu module:

http://darcs.mutantlemon.com/omegagb/src/Cpu.hs

It would be nicer to write a Data.Binary instance for the Instruction datatype and use that to do the parsing, but I don't think loading ROM files is a major speed concern here.

Another interesting resource for your emulator code is the set of ICFPC'06 Universal Machine implementations. Don Stewart has a page with a few highly performant ones (and there are benchmarks too, yay!):

http://www.cse.unsw.edu.au/~dons/um.html

Cheers,
pepe
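The approach described above can be sketched in a minimal, self-contained example: raw binary I/O into an immutable Array of bytes, then opcode decoding by pattern matching on hexadecimal values. The Instruction constructors and the opcode table below are illustrative assumptions (loosely modelled on common 8-bit encodings), not OmegaGB's actual definitions:

```haskell
module Main where

import qualified Data.ByteString as B
import Data.Array (Array, listArray, (!))
import Data.Word (Word8)

-- Read a ROM image as raw binary (no ASCII involved) and
-- freeze it into an immutable array of bytes.
loadRom :: FilePath -> IO (Array Int Word8)
loadRom path = do
  bytes <- B.readFile path
  let ws = B.unpack bytes
  return (listArray (0, length ws - 1) ws)

-- A tiny, hypothetical instruction set; a real emulator
-- would have many more constructors.
data Instruction = Nop | Halt | LoadImm Word8 | Unknown Word8
  deriving (Show, Eq)

-- Decode the opcode at the given program counter by pattern
-- matching on its byte value, fetching any operand bytes.
decode :: Array Int Word8 -> Int -> Instruction
decode rom pc = case rom ! pc of
  0x00 -> Nop
  0x76 -> Halt
  0x3E -> LoadImm (rom ! (pc + 1))  -- operand is the next byte
  op   -> Unknown op

main :: IO ()
main = do
  -- A hard-coded ROM for demonstration; with a real file you
  -- would use: rom <- loadRom "game.rom"
  let rom = listArray (0, 2) [0x3E, 0x42, 0x76] :: Array Int Word8
  print (decode rom 0)  -- LoadImm 66
  print (decode rom 2)  -- Halt
```

Using an immutable Array keeps the decoder pure; only loadRom touches IO, which makes the CPU step function easy to test in isolation.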