
13 May 2008, 9:12 p.m.
Aaron Denney wrote:
On 2008-05-12, Andrew Coppin wrote:
(Stupid little-endian nonsense... mutter mutter...)
I used to be a big-endian advocate, on the principle that it doesn't really matter and that big-endian is the standard network byte order. Now I'm convinced that little-endian is the way to go: bit number n should have value 2^n, byte number n should have value 256^n, and so forth, so an item's index always matches its positional weight.
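To make that concrete, here is a minimal Haskell sketch of the indexing argument (the names fromLE and fromBE are mine, purely for illustration): with little-endian order, reassembling a word is a straight positional sum over the bytes exactly as they arrive; with big-endian order you have to reverse them first.

import Data.Bits (shiftL)
import Data.Word (Word8, Word32)

-- Little-endian: byte number n carries weight 256^n, so the value
-- is a positional sum over the byte list exactly as given.
fromLE :: [Word8] -> Word32
fromLE bs = sum [ fromIntegral b `shiftL` (8 * n) | (n, b) <- zip [0 ..] bs ]

-- Big-endian: the same bytes must be reversed before the weights line up.
fromBE :: [Word8] -> Word32
fromBE = fromLE . reverse

main :: IO ()
main = do
  print (fromLE [0x78, 0x56, 0x34, 0x12])  -- 305419896 = 0x12345678
  print (fromBE [0x12, 0x34, 0x56, 0x78])  -- 305419896 = 0x12345678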
Yes, in human-to-human communication there is value in putting the most significant bit first. That's not really true for computer-to-computer communication.
It just annoys me that the number 0x12345678 has to be transmuted into 0x78563412 just because Intel says so. Why make everything so complicated? [Oh GOD I hope I didn't just start a Holy War...]
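For what it's worth, the "transmutation" being complained about is just a four-byte swap. A hedged Haskell sketch (swap32 is my own name for it; modern versions of base also ship byteSwap32 in Data.Word):

import Data.Bits (shiftL, shiftR, (.&.), (.|.))
import Data.Word (Word32)
import Numeric (showHex)

-- Reverse the four bytes of a 32-bit word: 0x12345678 <-> 0x78563412.
swap32 :: Word32 -> Word32
swap32 w =  (w `shiftR` 24)
        .|. ((w `shiftR` 8) .&. 0x0000FF00)
        .|. ((w `shiftL` 8) .&. 0x00FF0000)
        .|. (w `shiftL` 24)

main :: IO ()
main = putStrLn (showHex (swap32 0x12345678) "")  -- prints "78563412"

Either end of the connection can do the swap; the annoyance is only that it has to happen at all.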