
Brian Hurt
But I was wondering if it is possible that lazy evaluation could lead to space compression, especially under heavily persistent usage patterns?
Note that the benefit isn't *big*: we're talking about 40 words of memory when the main data structure takes up 5K-plus words, so the difference is less than 1%. But there is a (small) upside in memory usage at least occasionally, right?
Actually, a lazy evaluation strategy can sometimes change the entire complexity class of space usage, not just the constant factors. For instance, lazy streaming algorithms (where the data is produced and consumed in lock-step) may use a small constant amount of memory, independent of the size of the data, whereas an eager strategy would use memory linearly proportional to the dataset size.
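To make the contrast concrete, here is a minimal Haskell sketch (not from the original post; the names `streamed` and `retained` are purely illustrative). The first function consumes the list as it is produced, so it can run in roughly constant space; the second names the list and forces its whole spine before folding, so every cons cell stays live and heap usage grows linearly with n.

import Data.List (foldl')

-- Streaming: elements of [1..n] are produced and consumed in
-- lock-step, so only a constant number of cells are live at once.
streamed :: Int -> Int
streamed n = foldl' (+) 0 [1 .. n]

-- Retained: naming the list and demanding its length first keeps
-- all n cons cells alive until the fold finishes, so heap usage
-- is linear in n rather than constant.
retained :: Int -> Int
retained n =
  let xs = [1 .. n]
  in length xs `seq` foldl' (+) 0 xs

main :: IO ()
main = do
  print (streamed 10000000)   -- lock-step production/consumption
  print (retained 10000000)   -- whole list resident at once

Running both under a heap profiler (e.g. GHC's +RTS -s) should show the difference in residency, even though the two functions compute the same answer.

Regards,
Malcolm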