RL: However, I have previously written a good deal about the design of
different types of motivation system, and my understanding of the likely
situation is that by the time we had gotten the AGI working, its
motivations would have been arranged in such a way that it would *want*
to be extremely cooperative.

You do keep saying this. An autonomous mobile agent that did not have fundamentally conflicting emotions about each and every activity and part of the world would not succeed and survive. An AGI that trusted and cooperated with every human would not succeed and survive. Conflict is essential in a world fraught with risks, where time and effort can be wasted, essential needs can be neglected, and life and limb are under more or less continuous threat. Conflict is as fundamental and essential to living creatures, and to any emotional system, as gravity is to the physical world. (But I can't recall any mention of it in your writings about emotions.)

No one wants to be extremely cooperative with anybody. Everyone wants and needs a balance of give-and-take. (And right away, an agent's interests and emotions of giving must necessarily conflict with their emotions of taking.) Anything approaching a perfect balance of interests between extremely complex creatures/psychoeconomies with extremely complex interests is quite impossible - hence the simply massive literature dealing with the massive reality of relationship problems. And all living creatures have them.

Obviously, living creatures can have highly cooperative and smooth relationships - but they tend to be in the small minority. Ditto relationships between humans and pets. And there is no reason to think any different odds would apply to relationships between artificial and living creatures. (Equally, extremely uncooperative, aggressive relationships also tend to be in the minority, and similar odds should apply there.)

P.S. Perhaps the balance of cooperative/uncooperative relationships on this forum might give representative odds?! :)



