Dear List,
I'm working with a friend of mine on a Neural Net library in Haskell.
There are three files: neuron.hs, layer.hs and net.hs.
neuron.hs defines the Neuron data type and many utility functions, all of which have been tested and work well.
layer.hs defines layer-level functions (computing the output of a whole layer of neurons, etc.). Tested and working.
net.hs defines net-level functions (computing the output of a whole neural net) and the famous -- but annoying -- back-propagation algorithm.
You can find them here: http://mestan.fr/haskell/nn/html/
The problem is that when I evaluate final_net or test_output (anything after the train call in net.hs), it seems to loop forever, as if the error never gets under 0.1.
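To make clear what I expect: the training loop is supposed to keep applying back-propagation passes until the error on the training set drops below 0.1, roughly like the sketch below (trainUntil, step and err are just illustrative names here, not the actual ones from net.hs):

    -- Illustrative sketch only; the real types and names live in net.hs.
    -- trainUntil keeps applying one back-propagation pass (step) until
    -- the network's error on the training set falls below the threshold.
    trainUntil :: Double -> (net -> net) -> (net -> Double) -> net -> net
    trainUntil threshold step err net
      | err net < threshold = net
      | otherwise           = trainUntil threshold step err (step net)

Of course, if the step never actually decreases the error, a loop like this will never terminate, and that is exactly what it looks like here.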
So I was wondering whether any Neural Net and Haskell wizards out there could check the back-propagation implementation in net.hs, which seems to be wrong.
Thanks a lot!
--
Alp Mestan