
7 Aug 2009, 7:49 a.m.
Krasimir Angelov wrote:
I know that I can use compression to reduce the size of the output, but that will only make deserialization slower, not faster.
Couldn't the binary package try to create the ByteString with maximal sharing directly? That is, every ByteString gets an index, and that index is written out in place of the ByteString whenever the index is shorter (as a ByteString) than the ByteString it stands for. (A similar approach is used for shared ATerms; see http://www.haskell.org/pipermail/glasgow-haskell-users/2005-December/009485....)

Wouldn't that be faster than separate compression and decompression phases, and faster than reading and writing such large files?

Cheers,
Christian
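
To make the idea concrete, here is a minimal sketch of such index-based sharing, written on top of the binary package's Put and Get primitives. The wire format (one tag byte, 32-bit lengths and indices) and the names putShared and getShared are assumptions made up for illustration; they are not part of the binary API, and a real implementation inside binary (or the ATerm machinery) would look different.

{-# LANGUAGE OverloadedStrings #-}

import           Data.Binary.Get
import           Data.Binary.Put
import qualified Data.ByteString       as B
import qualified Data.Map.Strict       as M

-- Write each distinct ByteString once; later occurrences are replaced
-- by a back-reference to the index of the first occurrence.
-- Tag byte 0 = literal (length + bytes), tag byte 1 = back-reference.
putShared :: [B.ByteString] -> Put
putShared xs = do
  putWord32be (fromIntegral (length xs))
  go M.empty xs
  where
    go _ [] = return ()
    go seen (b : bs) = case M.lookup b seen of
      Just i  -> do
        putWord8 1
        putWord32be i
        go seen bs
      Nothing -> do
        putWord8 0
        putWord32be (fromIntegral (B.length b))
        putByteString b
        go (M.insert b (fromIntegral (M.size seen)) seen) bs

-- Read the list back, resolving back-references against the strings
-- decoded so far; each shared string is decoded and allocated only once.
getShared :: Get [B.ByteString]
getShared = do
  n <- getWord32be
  go M.empty (fromIntegral n :: Int)
  where
    go _ 0 = return []
    go seen k = do
      tag <- getWord8
      (b, seen') <- case tag of
        1 -> do
          i <- getWord32be
          case M.lookup i seen of
            Just b  -> return (b, seen)
            Nothing -> fail "dangling back-reference"
        _ -> do
          len <- getWord32be
          b   <- getByteString (fromIntegral len)
          return (b, M.insert (fromIntegral (M.size seen)) b seen)
      rest <- go seen' (k - 1)
      return (b : rest)

-- Round-trip example: "foo" and "bar" are each stored only once.
main :: IO ()
main = do
  let xs    = ["foo", "bar", "foo", "foo", "bar"] :: [B.ByteString]
      bytes = runPut (putShared xs)
  print (runGet getShared bytes == xs)

The Map over whole ByteStrings is only a placeholder for whatever hashing scheme a real implementation would use, but the sketch shows the point of the suggestion: repeated strings cost a few bytes on disk and a single allocation in memory, without a separate compression pass.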