I don't think the problem is in trainNetwork itself, but rather in epoch: $! only evaluates the result of epoch to weak head normal form, so unevaluated thunks can still accumulate inside the Network's fields.  You might try making your Network datatype and its sub-datatypes strict in their fields, in the manner of data Network = Network !Int !Int, until you narrow down which piece is causing the problem.
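
To make that concrete, here is a minimal, self-contained sketch of the idea.  The field types, the Samples synonym and the stub epoch are only placeholders for whatever your real definitions look like; the point is the strict (!) fields and force from Control.DeepSeq (the deepseq package):

{-# LANGUAGE DeriveGeneric, DeriveAnyClass #-}

import Control.DeepSeq (NFData, force)
import GHC.Generics (Generic)

-- Placeholder type; substitute whatever your real Samples is.
type Samples = [([Double], [Double])]

-- Strict (!) fields: evaluating a Network to WHNF also forces each
-- layer, instead of leaving a chain of unevaluated epochs behind.
data Network = Network !Layer !Layer
  deriving (Generic, NFData)

data Layer = Layer ![Double] ![Double]    -- weights, biases
  deriving (Generic, NFData)

-- Stub standing in for the real training step.
epoch :: Double -> Network -> Samples -> Network
epoch _ n _ = n

-- force fully evaluates the new Network before the next iteration, so
-- nothing lazy survives even in fields a bang alone would not reach.
trainNetwork :: Double -> Samples -> Int -> Network -> Network
trainNetwork _ _ 0 n = n
trainNetwork eta samples c n =
    trainNetwork eta samples (c - 1) $! force (epoch eta n samples)

Note that a bang on a list field only evaluates the list to its first constructor, so either use force as above or switch the fields to something element-strict like unboxed vectors.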

On Mon, Jan 26, 2015 at 7:19 AM, Hans Georg Schaathun <georg+haskell@schaathun.net> wrote:
Hi,

can someone give some hints on how to get around a stack space
overflow?

My problem is with the training function for a neural network:

trainNetwork :: Double -> Samples -> Int -> Network -> Network
trainNetwork _ _ 0 n = n
trainNetwork eta samples c n = trainNetwork eta samples (c-1) $!
                             epoch eta n samples

epoch :: Double -> Network -> Samples -> Network

So trainNetwork runs epoch c times, each time taking a Network
as input and producing a modified Network as output.  Clearly,
the space usage could be made constant in c, but I get a stack
overflow if and only if c is too large.

As you can see, I have tried to make the epoch evaluation strict
($!).  I have also tried bang patterns on the input parameter n,
rewriting with foldr/foldl/foldl', and switching the inner and
outer calls (epoch vs. trainNetwork), all to no avail.

I reckon this loop-like pattern should be fairly common ...
does it have a common solution too?

TIA
--
:-- Hans Georg