On Mon, Apr 02, 2007 at 03:26:05PM +0100, Daniel Brownridge wrote:
> Hello.
>
> I am a Computer Science student attempting to write an emulator using
> Haskell. One of my main design choices is how to deal with machine code.
> Clearly it is possible to represent 0's and 1's as ASCII characters;
> however, it strikes me that it would be much nicer to do the I/O using
> raw binary. I don't seem to be able to find much documentation on this.
> Does anybody know how it's done, or can you point me in the direction of
> some resources?
The current Big Name in Haskell binary support is the aptly named 'binary'
library, available from Hackage
(http://hackage.haskell.org/cgi-bin/hackage-scripts/package/binary-0.3).

binary works using two sets of functions: one for very efficiently building
binary ByteStrings (analogous to [Char] -> [Char] difference lists for text):

    data Builder                  -- abstract
    empty, append                 -- monoid operations
    singleton   :: Word8  -> Builder
    putWord16be :: Word16 -> Builder
    ...
    toLazyByteString :: Builder -> ByteString

and a monad for parsing binary data:

    data Get a                    -- abstract
    getWord8    :: Get Word8
    getWord16be :: Get Word16
    ...
    runGet :: Get a -> ByteString -> a

(There's also a higher-level interface patterned on Read/Show, but I don't
think that's applicable here.)

Stefan

_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
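As a concrete illustration of the Get/Put style above, here is a minimal
round-trip sketch using Data.Binary.Put and Data.Binary.Get from the binary
package. The Insn type (a one-byte opcode plus a big-endian 16-bit operand)
is a made-up stand-in for whatever instruction encoding the emulator uses,
purely for illustration:

    -- Round-trip a tiny made-up "instruction" through the binary package.
    import Data.Binary.Get (Get, getWord8, getWord16be, runGet)
    import Data.Binary.Put (Put, putWord8, putWord16be, runPut)
    import Data.Word (Word8, Word16)
    import qualified Data.ByteString.Lazy as L

    -- Hypothetical instruction format: 1-byte opcode, 16-bit big-endian operand.
    data Insn = Insn { opcode :: Word8, operand :: Word16 }
      deriving (Show, Eq)

    putInsn :: Insn -> Put
    putInsn (Insn op arg) = putWord8 op >> putWord16be arg

    getInsn :: Get Insn
    getInsn = do
      op  <- getWord8
      arg <- getWord16be
      return (Insn op arg)

    main :: IO ()
    main = do
      let bytes = runPut (putInsn (Insn 0x01 0xBEEF))
      print (L.unpack bytes)        -- [1,190,239]
      print (runGet getInsn bytes)  -- Insn {opcode = 1, operand = 48879}

Reading a whole ROM image is then just L.readFile followed by runGet with a
parser that loops until input is exhausted.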