Re: [singularity] AI concerns

2007-06-30 Thread Samantha Atkins
Alan Grimes wrote: Available computing power doesn't yet match that of the human brain, but I see your point. What makes you so sure of that? It has been computed countless times, here and elsewhere, as I am sure you are aware, so why do you ask?

Re: [singularity] AI concerns

2007-06-30 Thread Samantha Atkins
Charles D Hixson wrote: Stathis Papaioannou wrote: Available computing power doesn't yet match that of the human brain, but I see your point; software (in general) isn't getting better nearly as quickly as hardware is getting better. Well, not at the personally accessible level. I understand

Re: [singularity] AI concerns

2007-06-30 Thread Samantha Atkins
Sergey A. Novitsky wrote: Dear all, Perhaps the questions below were already touched on numerous times in the past. Could someone kindly point to discussion threads and/or articles where these concerns were addressed or discussed? Kind regards, Serge

Re: [singularity] AI concerns

2007-06-30 Thread Tom McCabe
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote: > On 01/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote: > > > Why do you assume that "win at any cost" is the default around which you need to work? > > Because it corresponds to the behavior of the vast, vast majority of

Re: [singularity] AI concerns

2007-06-30 Thread Stathis Papaioannou
On 01/07/07, Alan Grimes <[EMAIL PROTECTED]> wrote: > Available computing power doesn't yet match that of the human brain, but I see your point. > What makes you so sure of that? What's the latest estimate of the processing capacity of the human brain as compared to that of available computer
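As a rough back-of-envelope illustration of the comparison being asked about (a sketch, not an authoritative answer): the figures below are widely cited estimates taken here as assumptions -- Moravec's estimate of roughly 10^14 operations per second for brain equivalence, Kurzweil's of roughly 10^16, and Blue Gene/L's mid-2007 Linpack result of roughly 280 teraflops.

    # Back-of-envelope comparison of estimated brain capacity vs. 2007 hardware.
    # All figures are rough published estimates, used only for illustration.

    BRAIN_OPS_MORAVEC = 1e14      # Moravec's estimate: ~10^14 ops/sec
    BRAIN_OPS_KURZWEIL = 1e16     # Kurzweil's estimate: ~10^16 ops/sec
    BLUE_GENE_L_FLOPS = 2.8e14    # Blue Gene/L, ~280 teraflops Linpack (mid-2007)
    DESKTOP_PC_FLOPS = 1e10       # a typical 2007 desktop CPU, ~10 gigaflops

    for label, brain in [("Moravec", BRAIN_OPS_MORAVEC), ("Kurzweil", BRAIN_OPS_KURZWEIL)]:
        print(f"{label}: brain / Blue Gene/L = {brain / BLUE_GENE_L_FLOPS:.2f}x")
        print(f"{label}: brain / desktop PC  = {brain / DESKTOP_PC_FLOPS:.0e}x")

Depending on which estimate you accept, 2007's top supercomputer either already exceeds the brain (Moravec) or falls short by more than an order of magnitude (Kurzweil), which is roughly the disagreement in this thread.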

Re: [singularity] AI concerns

2007-06-30 Thread Stathis Papaioannou
On 01/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote: > Why do you assume that "win at any cost" is the default around which you need to work? Because it corresponds to the behavior of the vast, vast majority of possible AGI systems. Is there a single AGI design now in existence which wouldn't

Re: [singularity] AI concerns

2007-06-30 Thread Tom McCabe
What does Vista have to do with hardware development? Vista merely exploits hardware; it doesn't build it. If you want to measure hardware progress, you can just use some benchmarking program; you don't have to use OS hardware requirements as a proxy. - Tom --- Charles D Hixson <[EMAIL PROTECTED
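A minimal sketch of the approach Tom mentions: measuring raw hardware progress directly with a benchmark rather than inferring it from OS requirements. The matrix size and the timing method below are arbitrary illustrative choices, not a reference to any particular benchmarking tool.

    # Minimal benchmark sketch: estimate sustained floating-point throughput
    # by timing a dense matrix multiplication.
    import time
    import numpy as np

    n = 2000
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    start = time.time()
    np.dot(a, b)                   # ~2*n^3 floating-point operations
    elapsed = time.time() - start

    gflops = (2 * n**3) / elapsed / 1e9
    print(f"approx. {gflops:.1f} GFLOPS sustained")

Running the same measurement on machines from different years gives a hardware-progress curve that is independent of what any operating system happens to require.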

Re: [singularity] AI concerns

2007-06-30 Thread Tom McCabe
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote: > On 01/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote: > > > But Deep Blue wouldn't try to poison Kasparov in order to win the game. This isn't because it isn't intelligent enough to figure out that disabling your opp

Re: [singularity] AI concerns

2007-06-30 Thread Alan Grimes
> Available computing power doesn't yet match that of the human brain, > but I see your point. What makes you so sure of that? -- Opera: Sing it loud! :o( )>-<

Re: [singularity] AI concerns

2007-06-30 Thread Charles D Hixson
Stathis Papaioannou wrote: On 01/07/07, Alan Grimes <[EMAIL PROTECTED]> wrote: For the last several years, the limiting factor has absolutely not been hardware. How much hardware do you claim you need to develop a hard AI? Available computing power doesn't yet match that of the human brain, but

Re: [singularity] AI concerns

2007-06-30 Thread Stathis Papaioannou
On 01/07/07, Alan Grimes <[EMAIL PROTECTED]> wrote: For the last several years, the limiting factor has absolutely not been hardware. How much hardware do you claim you need to develop a hard AI? Available computing power doesn't yet match that of the human brain, but I see your point; software

Re: [singularity] AI concerns

2007-06-30 Thread Stathis Papaioannou
On 01/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote: > But Deep Blue wouldn't try to poison Kasparov in order to win the game. This isn't because it isn't intelligent enough to figure out that disabling your opponent would be helpful, it's because the problem it is applying its intelli

Re: [singularity] AI concerns

2007-06-30 Thread Alan Grimes
Stathis Papaioannou wrote: >> If AI is going to be super-intelligent, it may be treated by governments as some sort of super-weapon. >> As has already happened with nuclear weapons, there may be treaties constraining AI development. > Nuclear weapons need a lot of capital and resources to

Re: [singularity] AI concerns

2007-06-30 Thread Tom McCabe
More like trying to stop nuclear annihilation if, before the discovery of the fission chain reaction, everything from your car to your toaster had parts built out of solid U-235. - Tom --- Stathis Papaioannou <[EMAIL PROTECTED]> wrote: > On 01/07/07, Sergey A. Novitsky <[EMAIL PROTECTED]> wrot

Re: [singularity] AI concerns

2007-06-30 Thread Tom McCabe
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote: > On 01/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote: > > An excellent analogy to a superintelligent AGI is a really good chess-playing computer program. The computer program doesn't realize you're there, it doesn't know you're

Re: [singularity] AI concerns

2007-06-30 Thread Stathis Papaioannou
On 01/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote: An excellent analogy to a superintelligent AGI is a really good chess-playing computer program. The computer program doesn't realize you're there, it doesn't know you're human, it doesn't even know what the heck a human is, and it would gladly p

Re: [singularity] AI concerns

2007-06-30 Thread Stathis Papaioannou
On 01/07/07, Sergey A. Novitsky <[EMAIL PROTECTED]> wrote: If AI is going to be super-intelligent, it may be treated by governments as some sort of super-weapon. As has already happened with nuclear weapons, there may be treaties constraining AI development. Nuclear weapons need a lot of capita

Re: [singularity] AI concerns

2007-06-30 Thread Tom McCabe
--- "Sergey A. Novitsky" <[EMAIL PROTECTED]> wrote: > Dear all, > > Perhaps, the questions below were already touched > numerous times in the > past. > > Could someone kindly point to discussion threads > and/or articles where these > concerns were addressed or discussed? > > > > Kind regar

[singularity] AI concerns

2007-06-30 Thread Sergey A. Novitsky
Dear all, Perhaps the questions below were already touched on numerous times in the past. Could someone kindly point to discussion threads and/or articles where these concerns were addressed or discussed? Kind regards, Serge