RE: [singularity] critiques of Eliezer's views on AI

2007-06-27 Thread Papiewski, John
<[EMAIL PROTECTED]> wrote: > On 6/25/07, Papiewski, John <[EMAIL PROTECTED]> wrote: > > The only way to do it is to gradually replace your brain cells with an artificial substitute. > > We could instead anesthetize the crap out of

RE: [singularity] critiques of Eliezer's views on AI (was: Re: Personal attacks)

2007-06-25 Thread Papiewski, John
You're not misunderstanding, and it is horrible. The only way to do it is to gradually replace your brain cells with an artificial substitute. You'd be barely aware that anything was happening, and there wouldn't be two copies of you to be confused over. For example, imagine a medica

RE: [singularity] Bootstrapping AI

2007-06-04 Thread Papiewski, John
I disagree. If even a half-baked, partial, buggy, slow simulation of a human mind were available, the captains of industry would jump on it in a second. Do you remember when no business had an automated answering service? That transition took only a few years. No, the problem is, the theore

[singularity] Bootstrapping AI

2007-06-04 Thread Papiewski, John
For all the work and research done on AI over the past 50+ years, why do we not have software, or even a description of software, that simulates childlike intelligence at even a tiny percentage of real-time speed? We don't even have naive, slow AI yet, and here people are talking about

[singularity] AI and politics

2007-06-04 Thread Papiewski, John
I'm going to take a dim/skeptical view of the true potency of advanced AI here. If a hypothetical advanced AI comes up with, say, a design for a working, practical, economical, environmentally friendly power source, will it really get anywhere? Or if it says one day that the "War on Drugs" in