Ben, I hope you are going to keep a human in the loop.
Human-in-the-loop scenario:
1. The alpha Novamente suggests a change to its own software.
2. A human implements the change on the beta Novamente, running on a separate machine, and tests it.
3. If it seems to be an improvement, it is incorporated into the alpha Novamente.
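The supervised cycle above can be sketched as a single review-test-merge step. This is a minimal illustration, not Novamente's actual architecture: the instances are stood in for by plain config dicts, and `propose`, `human_approves`, and `benchmark` are hypothetical callbacks.

```python
def human_in_the_loop_step(alpha, propose, human_approves, benchmark):
    """One supervised self-modification cycle (illustrative sketch).

    alpha          -- dict standing in for the alpha instance's configuration
    propose        -- alpha's suggestion: returns a patch dict
    human_approves -- human review gate: patch -> bool
    benchmark      -- scoring function run against a candidate instance
    """
    patch = propose(alpha)                  # 1. alpha suggests a change
    if not human_approves(patch):           # human vets every suggestion
        return alpha
    beta = {**alpha, **patch}               # 2. apply it to a separate beta copy
    if benchmark(beta) > benchmark(alpha):  #    and test it there
        return beta                         # 3. improvement: merge into alpha
    return alpha                            # otherwise discard the change
```

The key property is that nothing reaches the running alpha without passing both the human gate and the benchmark on the beta machine.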
Human-not-in-the-loop scenario:
The Novamente looks at its code. The Novamente makes changes to its code and reboots itself.
The Novamente looks at its code. The Novamente makes changes to its code and reboots itself.
The Novamente looks at its code. The Novamente makes changes to its code and reboots itself.
The humans wonder what the hell is going on.
Mike Deering.