On 10/24/06, Russell Wallace <[EMAIL PROTECTED]> wrote:
> On 10/24/06, Ben Goertzel <[EMAIL PROTECTED]> wrote:
>> I know Hugo de Garis pretty well personally, and I can tell you that
>> he is certainly not "loony" on a personal level, as a human being.
>> He's a bit eccentric, but he actually has a very solid understanding
>> of the everyday world as well as of many branches of science....  His
>> "reality discrimination" faculty is excellent, which distinguishes him
>> from the loonies of the world...  He is however a bit of a showman --
>> it can be striking how his persona changes when he shifts from private
>> conversation to speaking on-camera or in front of a crowd...

> This is one occasion on which I must agree with Eliezer that you're
> just a bit too charitable :P

Well, I don't want to have a public debate about a friend's psyche ;-) ...

Anyway, "loony" is not a very precisely defined term, in general.  It
can refer either to social behaviors or to underlying cognitions, for
example -- and there are many different cognitive patterns that can
lead to apparently "loony" social behaviors....  To be honest, I
thought Hugo was a bit loony based on some of his public
presentations, before I got to know him well in person and understood
his patterns of thinking....

This is not to say, however, that I fully agree with either his
specific futurist predictions or his judgment of the viability of
various paths to AGI.

>> -- Kurzweil: ultratech becomes part of all of our lives, so much so
>> that we take it for granted, and the transition from the human to the
>> posthuman era is seamless

> That scenario I think is plausible, though I don't share Kurzweil's
> optimism regarding either its inevitability or imminence.

Do you think that de Garis's scenario of a massive violent conflict
between pro- and anti-Singularity forces is not plausible?

Ben
