Nature doesn't even have survival as its 'goal', what matters is only
survival in the past, not in the future, yet you start to describe
strategies for future survival.

"Goal" was in quotes for a reason. In the future, the same tautological forces will apply: evolution will favor those things that are adapted to survive/thrive.

Nature is
stupid, so design choices left to it are biased towards keeping much
of the historical baggage and resorting to unsystematic hacks, and as
a result its products are not simply optimal survivors.

Yes, everything is co-evolving quickly enough that evolution cannot keep pace and produce optimal solutions. But are you stupid enough to try to fight nature and the laws of probability and physics? We can improve on nature -- but you're never going to succeed by heading in a totally opposite direction.

When we are talking about choice of conditions for humans to live in
(rules of society, morality), we are trying to understand what *we*
would like to choose.

What we like (including what we like to choose) was formed by evolution. Some of what we like has been overtaken by events and is no longer pro-survival, but *everything* that we like has served a pro-survival purpose in the past (survival meaning survival of offspring and the species -- so altruism *IS* an evolutionarily-created "like" as well).

Better
understanding of *human* nature can help us to estimate how we will
appreciate various conditions.

Not if we can program our own appreciations. And what do we want our AGI to appreciate?

humans are very complicated things,
with a large burden of reinforcers that push us in different
directions based on idiosyncratic criteria.

Very true. So don't you want a simpler, clearer, non-contradictory set of reinforcers for your AGI (one that will lead to both it and you being happy)?

These reinforcers used to
line up to support survival in the past, but so what?

So . . . I'd like to create reinforcers that support my survival and freedom, and those of the descendants of the human race. Don't you?



----- Original Message ----- From: "Vladimir Nesov" <[EMAIL PROTECTED]>
To: <agi@v2.listbox.com>
Sent: Wednesday, January 30, 2008 2:14 PM
Subject: Re: [agi] OpenMind, MindPixel founders both commit suicide


On Jan 29, 2008 10:28 PM, Mark Waser <[EMAIL PROTECTED]> wrote:

Ethics only becomes snarled when one is unwilling to decide/declare what the
goal of life is.

Extrapolated Volition comes down to a homunculus depending upon the
definition of wiser or saner.

Evolution has "decided" what the goal of life is . . . . but most are
unwilling to accept it (in part because most do not see it as anything other
than "nature, red in tooth and claw").

The "goal" in life is simply continuation and continuity.  Evolution goes
for continuation of species -- which has an immediate subgoal of
continuation of individuals (and sex and protection of offspring).
Continuation of individuals is best served by the construction of and
continuation of society.

If we're smart, we should decide that the goal of ethics is the
continuation of society with an immediate subgoal of the will of
individuals (for a large variety of reasons -- but the most obvious
and easily justified is to prevent the defection of said individuals).

If an AGI is considered a willed individual and a member of society
and has the same ethics, life will be much easier and there will be a
lot less chance of the "Eliezer-scenario". There is no enslavement of
Jupiter-brains and no elimination/suppression of "lesser" individuals
in favor of "greater" individuals -- just a realization that society
must promote individuals and individuals must promote society.

Oh, and contrary to popular belief -- ethics has absolutely nothing to
do with pleasure or pain and *any* ethics based on such are doomed to
failure. Pleasure is "evolution's reward" to us when we do something
that promotes "evolution's goals". Pain is "evolution's punishment"
when we do something (or have something done) that is contrary to
survival, etc. And while both can be subverted so that they don't
properly indicate guidance -- in reality, that is all that they are
--> guideposts towards other goals. Pleasure is a BAD goal because it
can interfere with other goals. Avoidance of pain (or infliction of
pain) is only a good goal in that it furthers other goals.

Mark,

Nature doesn't even have survival as its 'goal', what matters is only
survival in the past, not in the future, yet you start to describe
strategies for future survival. Yes, survival in the future is one
likely accidental property of structures that survived in the past,
but so are other properties of specific living organisms. Nature is
stupid, so design choices left to it are biased towards keeping much
of the historical baggage and resorting to unsystematic hacks, and as
a result its products are not simply optimal survivors.

When we are talking about choice of conditions for humans to live in
(rules of society, morality), we are trying to understand what *we*
would like to choose. We are doing it for ourselves. Better
understanding of *human* nature can help us to estimate how we will
appreciate various conditions. And humans are very complicated things,
with a large burden of reinforcers that push us in different
directions based on idiosyncratic criteria. These reinforcers used to
line up to support survival in the past, but so what?

--
Vladimir Nesov                            mailto:[EMAIL PROTECTED]
