
----- Original Message -----
From: "Jan-Willem Maessen - jmaessen@alum.mit.edu"
Sent: Wednesday, December 07, 2005 2:21 PM
> Wearing my "Fortress language designer" hat, we've given serious thought
> to these techniques for very large arrays. Copying such structures is
> terribly expensive, or even impossible (imagine copying a 1PB array).
That's slightly beyond the size of arrays I am currently dealing with. ;) I can see that in-place updating is of utmost importance in such cases.

However, I was actually thinking of something the Clean people call "garbage collection at compile time", which, as far as I can see, involves no runtime overhead at all (Clean Language Report 2.1, section 9.1; ftp://ftp.cs.kun.nl/pub/Clean/Clean20/doc/CleanLangRep.2.1.pdf).

I don't quite see why it should be necessary to specify uniqueness attributes explicitly (as it mostly is in Clean) if the type checker knows the coercion laws better than I do anyway. Hence my question about automatically deriving the uniqueness properties of tokens, to the greatest extent safely feasible at compile time.

(Sorry if this is all trivial and already implemented in GHC. As indicated, I am merely learning Haskell, and I haven't yet spent any appreciable time on understanding compiler intestines.)

Regards,
zooloo
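
P.S.: To make concrete what I mean by "in-place updating", here is a rough sketch of how one has to ask for it explicitly in today's GHC Haskell, via the ST monad and mutable unboxed arrays. This is only an illustration of mine, not anything the compiler derives automatically:

    -- Explicit destructive updates inside the ST monad.
    -- runSTUArray guarantees the mutation cannot be observed from
    -- outside, so 'squares' remains a pure function.
    import Data.Array.ST (runSTUArray, newArray, writeArray)
    import Data.Array.Unboxed (UArray)

    squares :: Int -> UArray Int Int
    squares n = runSTUArray $ do
        arr <- newArray (0, n - 1) 0                        -- allocate once
        mapM_ (\i -> writeArray arr i (i * i)) [0 .. n - 1] -- update in place
        return arr

What I am wondering is whether a uniqueness analysis along Clean's lines could let the compiler perform the same destructive reuse behind an ordinary pure array interface, without the programmer writing any of this monadic plumbing.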