Dear Douglas,

Could you tell me, then, how I could present the network with one input-output 
pair at a time and train incrementally?

I found out that I should call train(cont=1).

But this is as far as I get right now. If I train my net on this pair (= 1 
backprop step) and then offer the next one, the network resets. This is 
because it assumes that my single pair is the whole data set, and it resets 
itself after every n epochs (here n = 1) because the learning criterion was 
not reached.
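
Concretely, this is roughly what I am trying (just a sketch; I am assuming 
that cont=1 is meant to make train() continue from the current weights 
rather than restart):

from pyrobot.brain.conx import *

network = SRN()
# ... layers added as in the SRNModuleExperiments example ...
# input_vectors, output_vectors: my experimental data, loaded elsewhere
for input_vector, output_vector in zip(input_vectors, output_vectors):
    network.setInputs([input_vector])
    network.setOutputs([output_vector])
    network.train(cont=1)  # one more backprop pass on this single pair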

I know what catastrophic forgetting is, but it will not play a role in what I 
want to do.

Regards,
Dieter

Douglas S. Blank wrote:
Jose,

Conx doesn't have a C API---most everything is in Python (except for the
matrix multiplication).

There is lots of support for incremental learning, as that is mostly what
the developers use conx for! We train robots, incrementally, on-line.

I suggest you try some of the easier fixes first. But if you want to read
more about the research topic of catastrophic forgetting, you can read the
paper about the governor here:

http://mightymouse.brynmawr.edu/~dblank/papers/sab04.pdf

and see the code here:

pyrobot/brain/governor.py

-Doug

On Fri, October 26, 2007 6:46 am, Dieter Vanderelst said:
Hi,

Yes, you point out exactly what I want.

It is unfortunate that the conx module has no incremental learning.

Anyway, for now I will have to come up with a work-around.

Regards,
Dieter

Jose Antonio Martin H wrote:
Dear Dieter, I have experienced exactly the same problem with conx; that
is the reason why I do not use conx.

It seems that the C API of conx does not have an implementation of
incremental learning for neural networks. This seems strange, but I
can't find the correct functions for doing this.

If anybody knows the way to do that, please inform all of us, both on
this Pyro list and by sending a note to the Conx users list, wherever
that is.

Thanks.
jose


----- Original Message ----- From: "Dieter Vanderelst"
<[EMAIL PROTECTED]>
To: "Douglas S. Blank" <[EMAIL PROTECTED]>;
<pyro-users@pyrorobotics.org>
Sent: Friday, October 26, 2007 12:26 PM
Subject: Re: [Pyro-users] Using the SRN


Dear Douglas,

Thank you for your answer.
I have programmed a net based on your pointers, but I still have some
trouble.

This is what I do:

I use the code at http://pyrorobotics.org/?page=SRNModuleExperiments
to make an Elman net.

Then I want to train this net by presenting a *single* input and output
pattern repeatedly:

for input_vector, output_vector in zip(input_vectors, output_vectors):
    network.setInputs([input_vector])
    network.setOutputs([output_vector])
    network.train()  # train the network some *more* on each pass


Is this possible? It seems like the net is resetting itself after each
call to train(), since it considers each pass through this loop an
epoch. Can this resetting be switched off?


Regards,
Dieter



Douglas S. Blank wrote:
Dieter,

You can use sequences as long as you want, even from a file.

See, for example, the section on off-line learning here:
http://pyrorobotics.org/?page=Building_20Neural_20Networks_20using_20Conx

or
http://pyrorobotics.org/?page=Robot_20Learning_20using_20Neural_20Networks


You can use loadDataFromFile, or loadInputsFromFile /
loadTargetsFromFile.
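
For example, something along these lines (just a sketch; the file names
are placeholders, and the layers are added as on the SRN example page):

from pyrobot.brain.conx import *

srn = SRN()
# ... add layers as in the SRN module example ...
srn.loadInputsFromFile("inputs.dat")    # one input pattern per line
srn.loadTargetsFromFile("targets.dat")  # matching target pattern per line
srn.train()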

If you want to look at hidden layer activations, perhaps the easiest
method would be to use the SRN.propagate(input=[0,1,0,0,1]) form, and
then look at the hidden layer. For example:

from pyrobot.brain.conx import *

srn = SRN()
# ... add layers, train ...
srn.propagate(input=[0,1,0,0,1])
print srn["hidden"].activation

Another way would be to extend the SRN class and override one of the
methods, like postBackprop:

from pyrobot.brain.conx import *

class MySRN(SRN):
    def postBackprop(self, **args):
        print self["hidden"].activation
        SRN.postBackprop(self, **args)

and use the MySRN class exactly the way that you would the SRN class.
That would allow you to examine the hidden layer during processing.
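
For instance (a sketch, with the layers and data set up as before):

mysrn = MySRN()
# ... add layers and load the training data as before ...
mysrn.train()  # hidden activations are printed after each backprop step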

You can set batch to 0 and you shouldn't have any problem, either way.

Hope that helps,

-Doug

Dieter Vanderelst wrote:
Hi,

I need some advice on the use of SRNs (simple recurrent networks).

I know what the network does, but I need some help with the Pyro
implementation.
This is what I want to do with the net:
-I want to train an SRN using a single (very long) sequence of
patterns. The examples I could find on SRNs all define a number of
patterns and build a sequence of these on the fly. However, I will
read a single long sequence of patterns from a file (experimental
data).

-Second, I want to analyze the activation of the hidden nodes in
response to each different input pattern. To do this, I want to present
the net with a long random sequence of input patterns and register the
activations.

-I don't want the network to be trained using batch updating. Given
my problem, batch updating makes no sense.

So, could somebody assist me in finding the best settings for these
requirements?

Thanks,
Dieter Vanderelst
