--- Mark Waser <[EMAIL PROTECTED]> wrote:

> > You don't have a goal of self preservation.  You have goals like eating,
> > breathing, avoiding pain, etc. that increase the odds of passing on your
> > genes.
> 
> Wrong.  I most certainly *DO* have a goal of self-preservation.  Even if it 
> is quick and utterly painless, I do *NOT* want to die.

That is a learned goal, like acquiring money.  It is not a top-level goal. 
When you were a child and did not yet know you would die someday, did you fear
death itself, or did you fear the hundreds of things that might kill you?

Learned goals can be reprogrammed, unlike top-level goals.  Would you fear
your quick and painless destruction in a teleportation booth if an exact copy
of you, with all your memories, was constructed at the other end?

> Why do you write such blatantly incorrect things?

Because there is a problem with your design and you still don't see it.  Where
do top-level goals come from?  A group of friendly agents working together is
equivalent to one vastly more intelligent agent.  It will have some set of
goals, but which?  If it has no competition, then where is the selective
pressure to maintain goals that promote self-preservation?


-- Matt Mahoney, [EMAIL PROTECTED]

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/