Hi Matt,

On 9/18/06, Matt Mahoney <[EMAIL PROTECTED]> wrote:
Suppose in case 1, the AGI is smarter than humans as humans are smarter than
monkeys.  How would you convince a monkey that you are smarter than it?  How could
an AGI convince you, other than to demonstrate that you cannot control it?

This would depend on whether the extremely smart AGI wanted us to know that
it is smarter than we are. If it had that desire, it could, e.g., formulate a
proof of the Poincaré Conjecture in such a way that it was as
accessible to an average person as the rules of chess.

Suppose in case 2, the AGI is more advanced than us as we are more advanced than
bacteria.  The bacteria on a microscope slide are unaware of you observing them, or that
you have full control over their environment.  How could you convince the bacteria
otherwise?

For a bacterium this would be next to impossible, as it is not sentient
and has no concept of its surroundings other than a few basic chemical
indicators.

How do you know that the universe is not a simulation by a successful AGI?

I don't ;-) But more importantly: how do you know that AGI isn't you?

--
Stefan Pernar
App. 1-6-I, Piao Home
No. 19 Jiang Tai Xi Lu
100016 Beijing
China
Mobile: +86 1391 009 1931
Skype: Stefan.Pernar

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/[EMAIL PROTECTED]
