Mike Tintner wrote:
> RL: However, I have previously written a good deal about the design of
> different types of motivation system, and my understanding of the likely
> situation is that by the time we had gotten the AGI working, its
> motivations would have been arranged in such a way that it would *want*
> to be extremely cooperative.
> You do keep saying this. An autonomous mobile agent that did not have
> fundamentally conflicting emotions about each and every activity and
> part of the world would not succeed and survive. An AGI that trusted
> and cooperated with every human would not succeed and survive. Conflict
> is essential in a world fraught with risks, where time and effort can be
> wasted, essential needs can be neglected, and life and limb are under
> more or less continuous threat. Conflict is as fundamental and essential
> to living creatures and any emotional system as gravity is to the
> physical world. (But I can't recall any mention of it in your writings
> about emotions.)
I think the way to resolve your questions is to analyze each one for
hidden assumptions.
First: you mention "emotions" many times, but I talk about
*motivations*, not emotions. The thing we call "emotions" is closely
related, but it is by no means the same, and it confuses the issues a
lot to talk about one when we should be talking about the other.
For example, I am writing this in a cafe, and after finishing the above
paragraph I reached around and picked up my bagel and took a bite. Why
did I do that? My motivation system has a set of things that are its
current goals, and it did a smooth switch from having the [write down my
thoughts] motivation in control to having the [take a bite of food]
motivation in control.
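The goal-switching described above can be sketched in a few lines of code. This is a hypothetical illustration only, not Loosemore's actual design: it assumes a motivation system is a set of candidate goals, each with an activation level that rises as its underlying need goes unmet, and that whichever goal is currently most activated is "in control".

```python
# Hypothetical sketch of a motivation system as a set of goals with
# activation levels; control follows the most-activated goal. The names
# and structure here are illustrative assumptions, not a real design.
from dataclasses import dataclass


@dataclass
class Motivation:
    name: str
    activation: float  # rises as the underlying need goes unmet

    def satisfy(self, amount: float) -> None:
        # Acting on a motivation reduces its activation.
        self.activation = max(0.0, self.activation - amount)


class MotivationSystem:
    def __init__(self, motivations):
        self.motivations = motivations

    def in_control(self) -> Motivation:
        # The "smooth switch" in the text: control simply passes to
        # whichever motivation is most activated at the moment, with no
        # global conflict or deliberation required.
        return max(self.motivations, key=lambda m: m.activation)


system = MotivationSystem([
    Motivation("write down my thoughts", activation=0.8),
    Motivation("take a bite of food", activation=0.6),
])

assert system.in_control().name == "write down my thoughts"

# Hunger builds while writing; eventually the food motivation takes over.
system.motivations[1].activation = 0.9
assert system.in_control().name == "take a bite of food"
```

On this toy model, switching between writing and eating is not a conflict at all, just a change in which activation happens to be highest, which is the distinction the anecdote is drawing.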
[takes sip of tea]
But I feel no emotions at the moment. [bite]. Although, a short time
ago I was trying to drive up a steep hill here in town, to get to an
orchestra rehearsal, and had to abandon the attempt when it turned out
that the road had a layer of ice underneath the snow, so that people
were getting stuck on the hill, with wheels spinning. The person behind me
hit the horn when they saw me taking a long time to get turned around
safely, and hearing the horn made me feel a short burst of anger toward
the idiot: that was an emotion, but it was not a motivation, except
insomuch as I may have felt inclined to do something like turn and give
them a signal of some kind.
Motivations and emotions are linked, but it is more complex than you
paint it, and although I often talk about a "motivational/emotional
system", it is the motivational part that is causally important.
Now, your statement about how an AGI would not "succeed and survive"
unless it felt conflict ... you are talking about creatures that use the
standard biological design for a motivational system, which are forced
to compete against other creatures using nothing but tooth and claw and
wits.
It is important to see that all of the conditions that force biological
creatures to have to (a) use only the standard motivational system that
nature designed, and (b) compete with other creatures using the same
motivation system in an evolutionary context, DO NOT APPLY to AGI systems.
This is such an obvious point that I find it difficult to begin
explaining it. There are no selection pressures, no breeding, no
limited lifespan, no conflict for food when there is the ability to
engineer as much as necessary. There would quite likely be only one, or
a limited number of AGIs on the whole planet, with no uncontrolled
growth in their numbers.....
On and on the list goes. Every one of the factors that would cause a
situation in which an AGI had to compete in order to "succeed and
survive" is missing.
The fundamental mistake (which many, many people make) is to simply
assume that when an AGI is built, it will be dropped into the current
design of the world without substantially changing it: this is what I
have called the "Everything just the same, but with robots" scenario.
It makes no sense. The AGIs would change the world, immediately, so as
to make all that "compete to succeed" nonsense a thing of the past.
In the rest of your text, below, I will just highlight the places where
you do the same thing:
> No one wants to be extremely cooperative with anybody.
Of course: people are built with motivational systems specifically
designed to NOT be especially cooperative. So what?
> Everyone wants
> and needs a balance of give-and-take. (And right away, an agent's
> interests and emotions of giving must necessarily conflict with their
> emotions of taking.)
An "agent's"? What agent? Designed with what kind of motivational
system? And did you assume a human-similar one?
> Anything approaching a perfect balance of interests
"Balance of interests"? Sounds like an evolutionary pressure kind of
idea.... what evolution would that be?
> between extremely complex creatures/psychoeconomies with extremely
> complex interests, is quite impossible
Not true even with humans, under the right circumstances, but I won't
argue that. It certainly does not apply to AGI systems.
> - hence the simply massive
> literature dealing with the massive reality of relationship problems.
> And all living creatures have them.
Sure they do: same old story, though - they have the same design of
motivational system.
> Obviously, living creatures can have highly cooperative and smooth
> relationships - but they tend to be in the small minority. Ditto
> relationships between humans and pets.
The [humans are to pets] as [AGIs are to humans] analogy is such a silly
one, regardless of how often it is repeated: I notice you sneak this
idea in here.
> And there is no reason to think
> any different odds would apply to artificial and living creatures.
> (Equally, extremely uncooperative, aggressive relationships also tend to
> be in the minority, and similar odds should apply about that.)
> P.S. Perhaps the balance of cooperative/uncooperative relationships on
> this forum might give representative odds?! :)
Totally irrelevant. We are all human.
Richard Loosemore
-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=71148209-d3b35b