John G. Rose wrote:
From: Richard Loosemore [mailto:[EMAIL PROTECTED]]

John LaMuth wrote:
Reality check ***

Consciousness is an emergent spectrum of subjectivity spanning 600 million years of evolution involving mega-trillions of competing organisms, probably selecting for obscure quantum effects/efficiencies....

Our puny engineering/coding efforts could never approach this - not even in a million years.

An outwardly pragmatic language simulation, however, is very do-able.

John LaMuth
It is not.

And we can.


I thought what he said was a good description, more or less. Out of 600 million years, perhaps only a fraction represents improvement, but it's still there.

How do you know, beyond a reasonable doubt, that any other being is
conscious?

The problem is, you have to nail down exactly what you *mean* by the word "conscious" before you start asking questions or making statements. Once you start reading about and thinking about all the attempts that have been made to get specific about it, some interesting new answers to simple questions like this begin to emerge.

What I am fighting here is a tendency for some people to use wave-of-the-hand definitions that only capture a fraction of a percent of the real meaning of the term. And sometimes not even that.


At some point you have to trust that others of your own species are conscious, and you bring them into your recursive "loop" of consciousness components.

A primary component of consciousness is a self-definition. Conscious experience is unique to its possessor. The possessor knows more than a belief that she herself is conscious, but others who appear conscious may be just that: appearing to be conscious. Still, at some point there is enough feedback between individuals and/or within a group to share conscious experience.

Still though, is it really necessary for an AGI to be conscious, except to deliver warm fuzzies to its creators? Doesn't that complicate things? Shouldn't the machines/computers be slaves to man? Or will they be equal/superior? It's a dog-eat-dog world out there.

One of the main conclusions of the paper I am writing now is that you will (almost certainly) have no choice in the matter, because a sufficiently powerful type of AGI will be conscious whether you like it or not.

The question of "slavery" is completely orthogonal.



I just want things to be taken care of, with no issues. Consciousness brings issues. Intelligence and consciousness are separate.


Back to my first paragraph above: until you have thought carefully about what you mean by consciousness, and have figured out where it comes from, you can't really make a definitive statement like that, surely?

And besides, wanting to have things taken care of is a separate issue. That is not a problem, either way.


Richard Loosemore


-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/