Charles D Hixson wrote:
Samantha Atkins wrote:
Sergey A. Novitsky wrote:

Dear all,

...

          o Be deprived of free will or be given limited free will (if
            such a concept is applicable to AI).

See above, no effective means of control.

- samantha

There is *one* effective means of control:
An AI will only do what it wants to do. If, during construction, you control what those desires are, you control the AI to the extent that it can be controlled.
I know we have both been in discussions where this was shown to amount to no real control. Attempting to pre-compute all possible implications and evolutions of a set of supergoals, especially if the AGI is capable of reasonably full introspection, is not tractable.
Once it is complete, you can't change what you have already done. If it's much more intelligent than you, you probably also can't constrain it, contain it, or delete it, if for no other reason than that it will hide its intent from you until it can succeed.

So it's very important that you not get anything seriously wrong WRT the desires and goals of the AI as you are building it. The other parts are less significant, as they will be subject to change... not necessarily by you, but rather by the AI itself.
Its goals, if it can fully introspect, are subject to change and modification as well.

E.g.: If you instruct the AI to have "reverence for life" as defined by Albert Schweitzer, then we are likely to end up with a planet populated by the maximal number of microbes. (Depends on exactly how this gets interpreted... perhaps a solar system or galaxy populated by the maximal number of microbes.)

Well, English is fuzzy. You knew what you meant, and you had the best of intentions... and the AI did precisely what it was built to do. You just couldn't revise it when the bugs in the design became clear.
Yep. So it had better be able to figure out the implications of the half-assed crap the monkey minds set up as its goals.

- samantha

-----
This list is sponsored by AGIRI: http://www.agiri.org/email