[Haskell-cafe] Backpropagation implementation for a neural net library

2009-06-15 Thread Alp Mestan
Dear List, I'm working with a friend of mine on a neural net library in Haskell. There are three files: neuron.hs, layer.hs and net.hs. neuron.hs defines the Neuron data type and many utility functions, all of which have been tested and work well. layer.hs defines layer-level functions (computing
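The actual neuron.hs is not shown in this preview, but a minimal sketch of what such a Neuron type and its feed-forward function might look like is below. All names (Neuron, weights, threshold, neuronOutput) and the sigmoid activation are assumptions for illustration, not the library's actual API.

```haskell
-- Hypothetical Neuron type; field names are assumptions, not the
-- library's actual definitions.
data Neuron = Neuron
  { weights   :: [Double]  -- one weight per input
  , threshold :: Double    -- bias term, subtracted from the weighted sum
  }

-- Standard logistic activation.
sigmoid :: Double -> Double
sigmoid x = 1 / (1 + exp (negate x))

-- Feed-forward output of one neuron: squashed weighted sum of inputs.
neuronOutput :: Neuron -> [Double] -> Double
neuronOutput n xs = sigmoid (sum (zipWith (*) (weights n) xs) - threshold n)

main :: IO ()
main = print (neuronOutput (Neuron [0.5, 0.5] 0) [1, 1])
```

A layer, in this scheme, would simply map neuronOutput over a list of neurons sharing the same input vector.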

Re: [Haskell-cafe] Backpropagation implementation for a neural net library

2009-06-15 Thread Trin Trin
Hi Alp,
- even with correctly programmed back-propagation, it is usually hard to make the net converge.
- usually you initialize neuron weights with somewhat random values when working with back-propagation.
- do some debug prints of the net error while training to see how it is going.
- xor
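The two practical suggestions above (random initial weights, printing the error while training) could be sketched like this; the weight range (-0.5, 0.5) and the `step` function standing in for one back-propagation pass are assumptions, not details from the thread:

```haskell
import Control.Monad (forM_, replicateM)
import System.Random (randomRIO)

-- Small random initial weights, the usual starting point for
-- back-propagation. The range (-0.5, 0.5) is an assumed choice.
randomWeights :: Int -> IO [Double]
randomWeights n = replicateM n (randomRIO (-0.5, 0.5))

-- Debug trace: print the net error at each epoch so you can see
-- early whether training is converging. `step` stands in for one
-- back-propagation pass reducing the error; it is hypothetical here.
traceTraining :: (Double -> Double) -> Double -> Int -> IO ()
traceTraining step err0 epochs =
  forM_ (take epochs (zip [1 :: Int ..] (iterate step err0))) $ \(e, err) ->
    putStrLn ("epoch " ++ show e ++ ": error = " ++ show err)

main :: IO ()
main = do
  ws <- randomWeights 3
  print ws
  traceTraining (* 0.9) 1.0 5
```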

Re: [Haskell-cafe] Backpropagation implementation for a neural net library

2009-06-15 Thread Alp Mestan
On Mon, Jun 15, 2009 at 5:00 PM, Trin Trin trin...@gmail.com wrote:
> Hi Alp,
> - even with correctly programmed back-propagation, it is usually hard to make the net converge.
Yeah, I know, that's why we're training it until the quadratic error goes under 0.1.
> - usually you initialize neuron
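The stopping criterion mentioned above (quadratic error under 0.1) could look like the following sketch; the function names and the list-of-lists representation of outputs and targets are assumptions:

```haskell
-- Quadratic (sum-of-squares) error over a training set, with outputs
-- and targets given as one list of Doubles per training example.
quadError :: [[Double]] -> [[Double]] -> Double
quadError outputs targets =
  sum [ (o - t) ^ (2 :: Int)
      | (os, ts) <- zip outputs targets
      , (o, t)   <- zip os ts ] / 2

-- Stopping criterion from the message above: error below 0.1.
converged :: [[Double]] -> [[Double]] -> Bool
converged outputs targets = quadError outputs targets < 0.1

main :: IO ()
main = print (quadError [[1]] [[0]])
```

A fixed threshold like 0.1 works for small problems such as xor, though one would normally also cap the number of epochs in case the net never converges.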