> http://www.skeptic.com/the_magazine/featured_articles/v12n02_AI_gone_awry.html

I think this paper overestimates the computational complexity of the brain.  Even if there are 10^15 synapses (no accurate count seems to have been published anywhere), it does not take thousands or millions of bits to represent one synapse.  In the Hopfield model, a synapse stores about one bit.  Also, neural networks are largely self-regulating, in the sense of an op-amp: a high-gain amplifier that uses negative feedback to obtain a predictable, linear gain regardless of any nonlinearity or manufacturing variation in the amplifier components.  Neural networks likewise use feedback to implement the same function despite noise in the input, failure of a few neurons or synapses, variation over time, and so on.  Arguing that we need to model the messy details of every synapse is like arguing that for one computer to simulate another, it must be modeled at the transistor level.
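To make the one-bit claim concrete, here is a toy Hopfield-network demo (a minimal sketch using numpy; the network size, pattern count, and noise level are illustrative choices of mine, not figures from the article).  It stores a few patterns with Hebbian learning, then recovers one of them from a corrupted cue via feedback:

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 100, 5                      # 100 neurons, 5 stored patterns (well under ~0.138*N)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian learning: each weight is a small sum of +/-1 correlations,
# so a synapse carries on the order of one bit of information.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

# Cue: the first pattern with 10 of its 100 bits flipped.
cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1

# Recall by repeated thresholded feedback -- the network's dynamics
# clean up the noisy input, illustrating the error tolerance above.
s = cue
for _ in range(10):
    s = np.sign(W @ s)
    s[s == 0] = 1                  # break ties consistently

overlap = np.mean(s == patterns[0])
print(f"overlap with stored pattern after recall: {overlap:.2f}")
```

The standard capacity result for this model is roughly 0.138N patterns of N bits stored in N^2 synapses, i.e. on the order of a tenth of a bit per synapse -- nowhere near thousands or millions.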

The actual memory capacity of the brain is probably much smaller than the number of synapses, due to redundancy (for error tolerance) and fixed-weight neurons that serve only to transmit information or compute nonprogrammable functions (timing delays, winner-take-all networks, fixed pattern recognition in the retina, etc.).  Landauer estimated the capacity of human long-term memory at about 10^9 bits.  Also, if you consider how much text you can read in a lifetime, the figure is of a similar order (using Shannon's estimate of about 1 bit per character for the entropy of English).  Of course, not everything you know is learned verbally, but verbal learning is surely a significant fraction.
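The lifetime-reading estimate is easy to check with arithmetic (a back-of-envelope sketch; the reading speed, word length, hours per day, and years are my own assumed round numbers, not figures from Landauer):

```python
# Assumptions: 300 words/min, 5 chars/word, 2 hours of reading per day,
# 60 years, and 1 bit per character (Shannon's estimate for English).
chars_per_sec = 300 * 5 / 60           # 25 characters per second
seconds_reading = 2 * 3600 * 365 * 60  # lifetime reading time in seconds
bits = chars_per_sec * seconds_reading * 1.0
print(f"~{bits:.1e} bits read in a lifetime")
```

This comes out to a few times 10^9 bits, the same order of magnitude as Landauer's estimate of long-term memory capacity.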

A second problem is that the article attempts to compare the brain to software.  It estimates at least one line of code per neuron, multiplies by 10^11 neurons, and concludes that we can't simulate the brain because we can't write a working program that large.  Again, this is the wrong measure.  The "program" can't be larger than the DNA which describes it.  That is at most 6 x 10^9 bits (about 3 x 10^9 base pairs at 2 bits each), and probably much smaller once you discount noise (random mutations that have no effect), redundancy (multiple copies of genes), noncoding DNA (about 98% of the genome), and the fact that most of the coding DNA describes things other than the brain.  The actual complexity of the "program" (not including learned information) is probably closer to 10^6 bits.
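The arithmetic behind these bounds, spelled out (the factors are the round numbers cited above, not new measurements; the final step down to 10^6 bits is a rougher discount for gene redundancy and non-brain genes):

```python
# Upper bounds on the size of the brain's "program", order of magnitude only.
base_pairs = 3e9                 # approximate size of the human genome
genome_bits = base_pairs * 2     # 2 bits per base pair -> 6e9 bits
coding = genome_bits * 0.02      # ~98% is noncoding -> ~1.2e8 bits remain
print(f"genome: {genome_bits:.0e} bits, coding portion: {coding:.0e} bits")
```

Discounting redundancy and the fraction of coding DNA unrelated to the brain cuts this further, which is how one arrives at an estimate closer to 10^6 bits.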

I agree with the paper on the failure of computational AI (symbolic reasoning, expert systems, etc).  There has to be a machine learning component and a rich source of data to learn from.

-- Matt Mahoney, [EMAIL PROTECTED]


----- Original Message ----
From: "Bergeson, Loren" <[EMAIL PROTECTED]>
To: [email protected]
Sent: Friday, October 6, 2006 3:39:10 PM
Subject: RE: [singularity] Counter-argument

This article takes a shot at making a counter-argument, at least if you assume that general AI is a necessary part of the Singularity:
 
 
 

From: Joshua Fox [mailto:[EMAIL PROTECTED]
Sent: Wednesday, October 04, 2006 3:16 PM
To: [email protected]
Subject: [singularity] Counter-argument

Could I offer Singularity-list readers this intellectual challenge: Give an argument supporting the thesis "Any sort of Singularity is very unlikely to occur in this century."
 
Even if you don't actually believe the point, consider it a debate-club-style challenge. If there is already something on the web somewhere, could you please point me to it?

I've been eager for this piece ever since I learned of the Singularity concept.  I know of the "objections" chapter in Kurzweil's The Singularity Is Near, the relevant parts of Vinge's seminal essay, as well as the ideas of Lanier, Huebner, and a few others, but in all the millions of words out there I can't remember seeing a well-reasoned article with the above claim as its major thesis.  (Note, I'm looking for "why the Singularity won't happen" rather than "why the Singularity is a bad idea" or "why technology is not accelerating".)


Joshua
This list is sponsored by AGIRI: http://www.agiri.org/email To unsubscribe or change your options, please go to: http://v2.listbox.com/member/[EMAIL PROTECTED]





