
Clean-like _explicit_ uniqueness typing is not what I'm asking for in Haskell.
So you want implicit, automatically inferred uniqueness typing - something that would be even more fragile and sensitive than Haskell's current space problems arising from laziness? ;-)
Why should inferring uniqueness be all that fragile? A uniqueness checker can be rather robust, as is demonstrated by the Clean one.
Fragile could refer to the fact that a relatively small-looking change to your code could have an enormous impact on its runtime, because you unknowingly changed a value from being used uniquely to being used non-uniquely. In Clean, the annotations allow you to enforce the uniqueness, so this change would be caught by the type-checker. But if the uniqueness is *only* inferred, then the user has to be very careful about ensuring uniqueness if they want the performance gains associated with it -- and they have to do it without the help of the type-checker. Having written a bit of Clean code, I can say that it is very easy to accidentally un-uniquify things.

Jeremy Shaw.
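To make that concrete, here is a small sketch in Haskell rather than Clean. It assumes a purely hypothetical compiler that infers uniqueness of Data.Array values and uses it to perform (//) as an in-place update; the function names (bump, bumpAndKeep) are made up for illustration:

    import Data.Array

    -- 'a' is used exactly once, so a compiler inferring uniqueness could
    -- (hypothetically) perform the (//) update destructively, in place.
    bump :: Array Int Int -> Array Int Int
    bump a = a // [(0, 42)]

    -- A small-looking change: also returning the original array makes 'a'
    -- shared.  Any inferred uniqueness silently disappears, the update falls
    -- back to copying the whole array, and nothing in the types warns you.
    bumpAndKeep :: Array Int Int -> (Array Int Int, Array Int Int)
    bumpAndKeep a = (a, a // [(0, 42)])

    main :: IO ()
    main = print (bump (listArray (0, 2) [1, 2, 3]))

With Clean-style explicit annotations (a * on the argument's type), the second definition would simply be rejected by the type-checker instead of quietly losing the in-place update.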