
Hello Andrew, Saturday, June 30, 2007, 11:48:19 AM, you wrote:
me too :) but I never used Haskell for compression itself, only for managing archives. the fast compression routines are written in C++
What, you're telling me that "fast" software cannot be written in Haskell? :-P
in my program, managing archives isn't the time-critical part; compression takes 90-99% of the total execution time
Well anyway, speed is not my aim. My aim is to play with various textbook algorithms and examine how well they work for various types of data. As long as the code isn't *absurdly* slow, that'll be just fine.
yes, for studying algorithms Haskell is a perfect tool
(I forget who it was, but a while back somebody pointed out the wisdom of using Data.Map. It certainly makes the LZW implementation about 400% faster! To say nothing of Huffman...)
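For concreteness, here's a minimal sketch of an LZW encoder that uses Data.Map as the dictionary. (This is illustrative only, not the actual code from my program; all the names are made up.)

  import qualified Data.Map as M

  -- A minimal LZW encoder with Data.Map as the dictionary: the map sends
  -- each known string to its output code, starting from the 256 one-byte
  -- strings, and grows by one entry per emitted code.
  lzwEncode :: String -> [Int]
  lzwEncode = go initial 256 ""
    where
      initial = M.fromList [([toEnum c], c) | c <- [0 .. 255]]
      go dict _ prefix [] =
        [dict M.! prefix | not (null prefix)]   -- flush the last match
      go dict next prefix (c:cs)
        | M.member candidate dict = go dict next candidate cs
        | otherwise =
            dict M.! prefix
              : go (M.insert candidate next dict) (next + 1) [c] cs
        where candidate = prefix ++ [c]

  -- ghci> lzwEncode "abababab"  ==>  [97,98,256,258,98]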
it seems that you've done the first implementation with lists or immutable arrays? :)
BTW, how do the pros do Huffman coding? Presumably not by traversing trees of pointer-linked nodes to process things 1 bit at a time...
encoding is simple: do this traversal once and store the bit sequence for each symbol. for decoding, you can find the length of the longest code, MaxBits, and build a table of 2^MaxBits elements which lets you find the next symbol by a direct lookup on the next MaxBits bits of input. faster algorithms use a two-level lookup; a first lookup of 9-10 bits is best when your encoded symbols are bytes

-- 
Best regards,
 Bulat                            mailto:Bulat.Ziganshin@gmail.com
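P.S. a minimal Haskell sketch of this one-level decode table, assuming a complete prefix code and a bit stream that ends exactly on a code boundary (all names here are illustrative):

  import Data.Array
  import Data.List (foldl')

  type Bits = [Bool]

  -- One-level decode table: for every possible maxBits-wide input chunk,
  -- precompute which symbol its leading bits select and how many bits
  -- that code actually consumes.
  buildTable :: [(Char, Bits)] -> (Int, Array Int (Char, Int))
  buildTable codes = (maxBits, table)
    where
      maxBits = maximum [length bits | (_, bits) <- codes]
      table = array (0, 2 ^ maxBits - 1)
        [ (base + i, (sym, len))
        | (sym, bits) <- codes
        , let len  = length bits
              pad  = maxBits - len
              base = bitsToInt bits * 2 ^ pad
        , i <- [0 .. 2 ^ pad - 1] ]

  bitsToInt :: Bits -> Int
  bitsToInt = foldl' (\acc b -> 2 * acc + fromEnum b) 0

  -- Decode: read the next maxBits bits (zero-padded at the very end);
  -- one array lookup gives the symbol and the real code length.
  decode :: (Int, Array Int (Char, Int)) -> Bits -> String
  decode (maxBits, table) = go
    where
      go [] = []
      go bs = sym : go (drop len bs)
        where
          chunk      = take maxBits (bs ++ replicate maxBits False)
          (sym, len) = table ! bitsToInt chunk

  -- ghci> let t = buildTable [('a',[False]),('b',[True,False]),('c',[True,True])]
  -- ghci> decode t [False,True,False,True,True]  ==>  "abc"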