On Jan 28, 2008, at 12:03 PM, Richard Loosemore wrote:
> Your comments below are unfounded, and all the worse for being so poisonously phrased. If you read the conversation from the beginning you will discover why: Matt initially suggested that an AGI might be asked to develop a virus of maximum potential, for purposes of testing a security system, and that it might respond by inserting an entire AGI system into the virus, since this would give the virus its maximum potential. The thrust of my reply was that this entire idea of Matt's made no sense, since the AGI could not be a "general" intelligence if it could not see the full implications of the request.

> Please feel free to accuse me of gross breaches of rhetorical etiquette, but if you do, please make sure first that I really have committed the crimes. ;-)

I notice everyone else has (probably wisely) ignored
my response anyway.

I thought I'd done well at removing the most "poisonously
phrased" parts of my email before sending, but I agree I
should have waited a few hours and revisited it before
sending, even so.  In any case, changes in meaning due to
sloppy copying of others' arguments are just SOP for most
internet arguments these days.  :(

To bring this slightly back to AGI:

> The thrust of my reply was that this entire idea of Matt's made no sense, since the AGI could not be a "general" intelligence if it could not see the full implications of the request.

I'm sure you know that most humans fail to see the full
implications of *most* things.  Is it your opinion, then,
that a human is not a general intelligence?

--
Randall Randall <[EMAIL PROTECTED]>
"If I can do it in Alabama, then I'm fairly certain you
 can get away with it anywhere." -- Dresden Codak


