
Dear Cafe,

since a little discussion with my tutor I have been wondering how the following problem can be solved, and whether it is decidable at all: given the definition of a recursive function f in, say, Haskell, determine whether f can be implemented in O(1) memory. At first I thought the answer would simply be "check whether f is tail-recursive". However, consider the following definition:
-- for the sake of uniformity
if' c t e = if c then t else e
f :: (Num a, Ord a) => a -> a -> a
f x y = if' (x <= 0) y $ if' (x >= 2) (f (x-2) (y+1)) (f (x-1) y)
(I don't really care what this function computes, as long as it terminates, which is obvious.) Although GHC will probably not perform this optimization, f can be realized in O(1) memory:

// trying to duplicate f's structure as closely as possible
double f( double x, double y )
{
START:
    if ( x <= 0 )
        return y;
    else {
        if ( x >= 2 ) {
            x = x - 2; y = y + 1;
            goto START;
        } else {
            x = x - 1; y = y;
            goto START;
        }
    }
}
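For comparison, here is a minimal sketch of what a directly tail-recursive Haskell version might look like (fIter is just my name for it; the bang patterns are only there so the accumulators stay evaluated and no thunk chain builds up):

{-# LANGUAGE BangPatterns #-}

-- hypothetical tail-recursive counterpart of f, mirroring the C loop above
fIter :: (Num a, Ord a) => a -> a -> a
fIter !x !y
  | x <= 0    = y
  | x >= 2    = fIter (x - 2) (y + 1)
  | otherwise = fIter (x - 1) y

With strict arguments, each recursive call should be able to reuse the same space for x and y, just like the goto loop.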
It is crucial that the second if' uses only one of its last two arguments, not both. Now suppose we replace the second if' by, say,

g :: (Num a) => Bool -> a -> a -> a
g c t e = if' c (t + e) (t - e)
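Spelled out, the modified definition I have in mind would read roughly as follows (fG is just my name for this hypothetical variant; it relies on the if' and g defined above):

-- hypothetical variant of f with g in place of the second if'
fG :: (Num a, Ord a) => a -> a -> a
fG x y = if' (x <= 0) y $ g (x >= 2) (fG (x - 2) (y + 1)) (fG (x - 1) y)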
In that case we have to compute *both* (f (x-2) (y+1)) and (f (x-1) y), and x and y have to be kept in memory during the call to (f (x-2) (y+1)); therefore f can no longer be implemented in constant memory (read: I haven't found a way that does not radically alter f's structure).

So, does someone know how to solve this, or can anyone prove that it cannot be solved?

Best regards,
Steffen