
Maurício wrote:
But why would you want that? I understand the only situation when talking about number of bytes makes sense is when you are using Foreign and Ptr. (...)
Because I'm using both Ptr and Foreign? ;)
See my recent announcement for bytestring-trie. One of the optimizations I'm working on is to read off a full natural word at a time, (...)
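(For illustration only: a minimal sketch of the word-at-a-time idea, not bytestring-trie's actual code. It reads a byte buffer one machine `Word` at a time through a `Ptr`, assuming the buffer length is a multiple of `sizeOf (undefined :: Word)` and the pointer is suitably aligned.)

```haskell
import Data.Word (Word, Word8)
import Foreign.Ptr (Ptr, plusPtr)
import Foreign.Storable (peek, sizeOf)
import Foreign.Marshal.Array (withArray)

-- Read the buffer one natural Word at a time instead of byte by byte.
-- Hypothetical helper; assumes len is a multiple of the Word size and
-- that the pointer is aligned for Word access.
readWords :: Ptr Word8 -> Int -> IO [Word]
readWords p len =
    mapM (\i -> peek (p `plusPtr` (i * w))) [0 .. len `div` w - 1]
  where
    w = sizeOf (undefined :: Word)

main :: IO ()
main = withArray (replicate 16 (0 :: Word8)) $ \p ->
  readWords p 16 >>= print   -- a list of zero Words, 16 bytes' worth
```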
I see, you mean the size of a machine word, not of Data.Word.
AFAIK, Data.Word.Word is defined to be "the same size as Prelude.Int" (which it isn't on GHC 6.8.2 on Intel OS X: 32 bits vs 31 bits), and Int is defined to be at least 31 bits but may be more. My interpretation is that Int and Word will generally be implemented at the architecture's natural word size in order to optimize performance, much like C's "int" and "unsigned int" but with better-defined size guarantees. This seems to be supported by the existence of the definitely-sized variants Word8, Word16, Word32...

So yes, I mean the machine word, but I think Word is intended as a proxy for it. Maybe I'm wrong, but provided that Word contains (or can be persuaded to contain) a whole number of Word8s, and that operations on Word are cheaper than the analogous sequence of operations on the Word8 representation, that's good enough for my needs.

-- 
Live well,
~wren
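(A postscript for the curious, not part of the original mail: the "whole number of Word8s" assumption is easy to check on any given platform with Foreign.Storable.sizeOf.)

```haskell
import Data.Word (Word, Word8)
import Foreign.Storable (sizeOf)

main :: IO ()
main = do
  let wordBytes = sizeOf (undefined :: Word)   -- size of Word on this platform
      byteBytes = sizeOf (undefined :: Word8)  -- always 1
  putStrLn $ "Word  is " ++ show wordBytes ++ " byte(s)"
  putStrLn $ "Word8 is " ++ show byteBytes ++ " byte(s)"
  -- The optimization only needs Word to span a whole number of Word8s:
  print (wordBytes `mod` byteBytes == 0)       -- prints True
```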