On Wed, Nov 12, 2008 at 2:41 AM, John G. Rose <[EMAIL PROTECTED]> wrote:
> is it really necessary for an AGI to be conscious?

It depends on how you define it. If you think it's about feelings/qualia,
then no - you don't need that [potentially dangerous] crap, and we
don't know how to implement it anyway.
If you view it as a high-level built-in response mechanism (which is
supported by feelings in our brain but can/should be done differently
in an AGI), then yes - you practically (though not necessarily
theoretically) need something like that for performance. If you are
concerned about self-awareness/consciousness, then note that an AGI
can demonstrate general problem solving without knowing anything about
itself (or about many other particular concepts). The AGI just needs
to be able to learn new concepts (including the self-concept), though
I think some built-in support makes sense in this particular case.
BTW, for the purposes of my AGI R&D, I defined self-awareness as the
use of an internal representation (IR) of self, where the IR is linked
to real features of the system. Nothing terribly complicated or
mysterious about that.
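
To make that definition a bit more concrete, here is a minimal Python
sketch (my own illustration, not part of any existing system - the
SelfModel class and the two example features are just assumptions) of
an IR of self whose entries are linked to real features of the running
system rather than being a static description:

import sys

class SelfModel:
    """Internal representation (IR) of the system itself.

    Each IR entry maps a concept about "self" to a probe that reads the
    corresponding real feature of the running system, so the IR stays
    linked to reality instead of being a fixed self-description.
    """

    def __init__(self):
        self.learned_concepts = ["self"]  # toy knowledge base
        # IR entries: feature name -> callable returning its current value
        self._probes = {
            "known_concept_count": lambda: len(self.learned_concepts),
            "approx_memory_bytes": lambda: sys.getsizeof(self.learned_concepts),
        }

    def learn(self, concept):
        """Learning a new concept changes the system; the IR reflects it."""
        self.learned_concepts.append(concept)

    def introspect(self):
        """Answer questions about self by reading the linked features."""
        return {name: probe() for name, probe in self._probes.items()}

if __name__ == "__main__":
    agent = SelfModel()
    print(agent.introspect())      # e.g. {'known_concept_count': 1, ...}
    agent.learn("rocks are hard")
    print(agent.introspect())      # the count now reflects the new concept

The point of the sketch is only that "self" is just another learnable
concept, with the IR kept honest by being wired to actual system state.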

> Doesn't that complicate things?

It does.

> Shouldn't the machines/computers be slaves to man?

They should, and it shouldn't be viewed negatively. An AGI is nothing
more than a smart tool. Changing that would be a big mistake, IMO.

> Or will they be equal/superior?

Rocks are superior to us in being hard. Cars are "superior" to us when
it comes to moving fast. AGIs will be superior to us when it comes to
problem solving.
So what? Equal/superior in whatever - who cares, as long as we can
progress and safely enjoy life - which is what our tools (including
AGI) are being designed to help us with.

Regards,
Jiri Jelinek

