> 
> From: Mitchell Timin <[EMAIL PROTECTED]>
> Date: Tue, 2006/08/22 12:10:14 AM CEST
> To: Evolution of Artificial Neural Networks 
> <[email protected]>
> Subject: Re: [annevolve] High neuron-count network
> 
> [EMAIL PROTECTED] wrote:
> >>> Hi!
> >>> I'm struggling to get a network with many neurons to learn a simple xor.
> >>> So far ~30 neurons is OK, but if I double that I get a network that is
> >>> more like a random number generator.
> >>>
> >>> The network characteristic is a fully connected network with a 
> >>> nonlinear output function.
> >>>
> >>> Do you have any ideas how to get a stable network that can solve xor 
> >>> and also has a large number of neurons?
> >>>   
> >>>       
> >> XOR only requires 2 neurons if there is feedback, 3 otherwise.  If you 
> >> have lots of extra neurons the network is capable of much more complex 
> >> behaviour.  My suggestion is to increase the complexity of your fitness 
> >> function so that the network evolves toward the behaviour you want. 
> >>
> >> But why do you want 30 neurons when only 3 are required for the functions?
> >>
> >> Are you using our XOR software from the releases, or something else?
> >>     
> >
> > I'm after solving a more complex and diffuse problem. But before I get 
> > there I must have some tests to prove that a large ANN can solve a simple 
> > problem like xor.
> >
> > I have two problems with annevolve-xor:
> > - annevolve's xor can only handle a fixed neuron count of 2.
> > - I'm missing a simple interface that just prints the xor table
> >   using the best ANN in the population, and does so every nth epoch.
> >
> > If you agree these are problems, I could poke at the xor code.
> >
> > Anyway, the original question is still more interesting, and so far I'm 
> > thinking of doing a test run for each ANN with fixed or random input during 
> > the population initialization phase, and replacing those ANNs that aren't 
> > stable.
> >   
> Since a small ANN can solve XOR, a large one certainly can, because it 
> might have mostly dormant neurons, and is therefore equivalent to the 
> small ANN.  You can get a dormant neuron if all of its weights and its 
> bias are very close to zero.   Or you can have an ANN that is equivalent 
> to multiple copies of the small XOR ANN, in parallel.
> 
> Don't use the ANNEvolve XOR.
> 
> Decide what type of ANN you need.  What kind of signals are the inputs?  
> What kind of signals should the outputs be?
> Will feed forward work, or is an internal state required?
> 
> m
> 

Hi
thanks for the thoughts.
I'll test the close-to-zero-weights approach.
That also means the method would have to be used for every problem, though.
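A quick way to sanity-check the dormant-neuron argument before wiring it into the evolver: hand-build the minimal step-unit XOR net, pad the hidden layer with units whose weights and biases are near zero, and confirm the output is unchanged. This is my own illustration (not ANNEvolve code); the weights and the `eps` magnitude are arbitrary choices:

```python
# Hand-wired XOR net padded with "dormant" hidden neurons whose incoming
# and outgoing weights are ~0, so they cannot disturb the result.

def step(x):
    return 1 if x > 0 else 0

def xor_net(x1, x2, n_dormant=0, eps=1e-6):
    # Minimal hidden layer: an OR detector and an AND detector.
    h = [step(x1 + x2 - 0.5),   # OR
         step(x1 + x2 - 1.5)]   # AND
    # Dormant neurons: near-zero weights and bias.
    h += [step(eps * x1 + eps * x2 + eps) for _ in range(n_dormant)]
    # Output: OR minus (strongly weighted) AND = XOR; dormant units
    # contribute only eps-sized terms, far below the 0.5 threshold margin.
    total = 1.0 * h[0] - 2.0 * h[1] - 0.5
    total += sum(eps * hi for hi in h[2:])
    return step(total)

for a in (0, 1):
    for b in (0, 1):
        assert xor_net(a, b, n_dormant=60) == (a ^ b)
print("padded net still computes XOR")
```

The 60-unit padded net is behaviourally identical to the 2-hidden-unit one, which is the equivalence Mitchell describes.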

I want to apply this network to recognizing sound waves.
The input is of a stream type (real-time or recorded, but fed in blocks).

Output is a single neuron that says "pattern recognized" or "pattern not 
recognized".
I'll leave everything up to the network: a fully connected ANN that is only 
changed by random weight changes.
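For reference, the "only random weight changes" scheme can be sketched as plain stochastic hill climbing on a tiny 2-2-1 sigmoid net solving xor. This is my own sketch, not annevolve code, and the mutation size (0.3) and iteration count are arbitrary assumptions:

```python
# Stochastic hill climbing: mutate all weights with Gaussian noise and keep
# the candidate only if its squared error on the XOR table improves.
import math
import random

random.seed(1)
CASES = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def sig(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, x1, x2):
    # 2 inputs -> 2 hidden sigmoid units -> 1 sigmoid output (9 weights).
    h1 = sig(w[0] * x1 + w[1] * x2 + w[2])
    h2 = sig(w[3] * x1 + w[4] * x2 + w[5])
    return sig(w[6] * h1 + w[7] * h2 + w[8])

def error(w):
    return sum((forward(w, *inp) - t) ** 2 for inp, t in CASES)

w = [random.uniform(-1, 1) for _ in range(9)]
best = e0 = error(w)
for _ in range(20000):
    cand = [wi + random.gauss(0.0, 0.3) for wi in w]  # random weight change
    e = error(cand)
    if e < best:  # keep the change only if the net got better
        w, best = cand, e

print("squared error went from %.3f to %.3f" % (e0, best))
```

The same accept-if-better loop should carry over to the larger fully connected net; only `forward` and the fitness function would change.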


