
2 Apr 2007, 10:26 a.m.
Hello. I am a Computer Science student attempting to write an emulator in Haskell. One of my main design choices is how to deal with machine code. Clearly it is possible to represent 0s and 1s as ASCII characters, but it strikes me that it would be much nicer to do the I/O using raw binary. I don't seem to be able to find much documentation on this. Does anybody know how it's done, or can you point me in the direction of some resources?

Many thanks,
Daniel
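
P.S. To make it concrete, the sketch below is roughly what I am imagining: reading a ROM image as raw bytes rather than as text. The file name "rom.bin" is just a placeholder, and I am assuming Data.ByteString is the right library for this.

    import qualified Data.ByteString as B

    main :: IO ()
    main = do
      bytes <- B.readFile "rom.bin"      -- read the file as raw bytes, no text decoding
      print (B.length bytes)             -- total size of the image in bytes
      print (B.unpack (B.take 4 bytes))  -- first four bytes as Word8 values

Is something like this the idiomatic approach, or is there a better way?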