On Tue Feb 17 2015 at 11:50:20 PM Roman Cheplyaka <roma@ro-che.info> wrote:
> Note that foldr (+) 0 has nothing to do with laziness — it's eager. Its
> problem is that it's not tail-recursive.
The problem is that when you say foldr (+) 0 [1, 2, 3, 4, 5], you build the thunk "1 + (2 + (3 + (4 + (5 + 0))))", which has what I'll call a "whnf evaluation depth" of 4: you need a stack of size 4 to evaluate it to whnf.
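
For concreteness, here is that example as plain GHC Haskell (sumR and main are made-up names for illustration; nothing here uses the proposed annotations yet):

-- Illustrative sketch only. foldr (+) 0 expands to a right-nested chain of
-- additions, and evaluating it to whnf keeps one pending (+) on the stack
-- per element, i.e. a depth proportional to the length of the list.
sumR :: [Int] -> Int
sumR = foldr (+) 0
-- sumR [1,2,3,4,5]  ==>  1 + (2 + (3 + (4 + (5 + 0))))

main :: IO ()
main = print (sumR [1 .. 10000000])
-- On a large enough list this may overflow the stack, or at least hold
-- O(n) pending additions in memory while reducing to whnf.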

I would like a type system that would disallow building thunks with *unbounded* whnf evaluation depth.

I'm thinking we could annotate every type in every type signature with this depth. Let us use a special kind "@" for these annotations, allow polymorphism on them, and use "t@d" to denote "a value of type t requiring whnf evaluation depth d".

(+) :: forall (a:@) (b:@) . Int@a -> Int@b -> Int@(1+max(a,b))

($) :: forall (a:@) (b:@->@) . (forall (c:@) . x@c -> y@(b c)) -> x@a -> y@(b a)

The type signature for (.) quickly gets unwieldy, and I wasn't even able to write down the signature for foldr :) but perhaps you get the idea.
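
As a value-level sanity check of the idea (a made-up model in ordinary Haskell, not the proposed type system; Expr, depth and foldrThunk are illustrative names):

-- A tiny model of "whnf evaluation depth": an expression tree of additions
-- over literals, with the depth rule from the (+) signature above
-- (1 + max of the argument depths). Purely illustrative.
data Expr = Lit Int | Add Expr Expr

depth :: Expr -> Int
depth (Lit _)   = 0
depth (Add x y) = 1 + max (depth x) (depth y)

-- The thunk built by foldr (+) 0 [1..n], as an Expr:
foldrThunk :: Int -> Expr
foldrThunk n = foldr Add (Lit 0) (map Lit [1 .. n])

main :: IO ()
main = print (depth (foldrThunk 5))
-- The depth is linear in the length of the list (the exact constant depends
-- on how you count), i.e. unbounded in the input, which is exactly what the
-- annotations are meant to rule out.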

Some informal properties this would have (a small illustration follows the list):
* whnf depth of calling a function in a tail-call position is max(whnf depth of the function itself, whnf depth of its argument).
* whnf depth of "case x of { ... }" is 1.
* In the context of "case x of { ...(primitive pattern)... }", whnf depth of x is 0.
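
To make those rules a bit more concrete, here is how I would read them on two tiny, made-up definitions (ordinary Haskell; the depth assignments are just the informal rules applied by hand, nothing is actually checked):

isZero :: Int -> Bool
isZero n = case n of   -- rule 2: the case expression itself has whnf depth 1
  0 -> True            -- rule 3: n is scrutinized against a primitive pattern,
  _ -> False           --         so n is used at whnf depth 0 here

apply :: (a -> b) -> a -> b
apply f x = f x        -- rule 1: a call in tail position gets depth
                       --         max(whnf depth of f, whnf depth of x)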
 
Does this make any sense?


foldl (+) 0 would be an example of a laziness issue.
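
For example (a rough sketch in plain Haskell; sumL and sumL' are made-up names):

import Data.List (foldl')

sumL, sumL' :: [Int] -> Int
sumL  = foldl  (+) 0   -- tail-recursive, but lazily builds the thunk
                       -- ((((0 + 1) + 2) + 3) + 4) + 5 in the accumulator
sumL' = foldl' (+) 0   -- forces the accumulator at each step: constant space

main :: IO ()
main = print (sumL' [1 .. 10000000])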

If you want to specify which functions should or shouldn't be used in a
lazy context, take a look at polarity (see Harper's Practical
Foundations for Programming Languages). So you could say, for instance,
that it doesn't make sense to use + in a lazy context; that'd eliminate
half the cases of laziness-induced memory leaks in practice.
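
As a very rough approximation of that restriction in today's Haskell (this is not the polarity discipline from PFPL, just bang patterns forcing every (+) result; leaky and ok are made-up names):

{-# LANGUAGE BangPatterns #-}

leaky :: [Int] -> Int
leaky = foldl (+) 0          -- (+) used in a lazy context: the accumulator is
                             -- never forced, so unevaluated sums pile up

ok :: [Int] -> Int
ok = go 0
  where
    go !acc []       = acc               -- the bang keeps the accumulator in
    go !acc (x : xs) = go (acc + x) xs   -- whnf, so no (+) thunk accumulates

main :: IO ()
main = print (ok [1 .. 10000000])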

If instead you want to impose some upper bound on the memory consumption
without caring about specific cases, then see this ICFP'12 paper:
http://www.dcc.fc.up.pt/~pbv/AALazyExtended.pdf

On 18/02/15 07:04, Eugene Kirpichov wrote:
> Hello haskell-cafe,
>
> Let me repost here a question I posted to cstheory stackexchange - in
> hopes that there are more type theory experts here.
>
> http://cstheory.stackexchange.com/questions/29518/type-systems-preventing-laziness-related-memory-leaks
>
> Perhaps the main source of performance problems in Haskell is when a
> program inadvertently builds up a thunk of unbounded depth - this causes
> both a memory leak and a potential stack overflow when evaluating. The
> classic example is defining sum = foldr (+) 0 in Haskell.
>
> Are there any type systems which statically enforce lack of such thunks
> in a program using a lazy language?
>
> Seems like this should be on the same order of difficulty as proving
> other static program properties using type system extensions, e.g. some
> flavors of thread safety or memory safety.