Edward,

Does your estimate consider only the amount of information required for
*representation*, or does it also include the additional processing elements
required in a neural setting to implement learning? I'm not sure 10^9 is far
off, because much more may be required for domain-independent
association/correlation catching between (subsymbolic) concepts implemented
by groups of synapses(*). A gap of 10^6 is probably about right for this
purpose; I can't see how it would be possible with, say, a gap of only 10^2.
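
To make the orders of magnitude explicit, here is a back-of-envelope sketch;
both figures are the rough estimates from the thread below, not measurements:

```python
import math

# Ratio between the commonly cited synapse count for the human brain and a
# cognitive-model estimate of human memory. Both are rough order-of-magnitude
# estimates quoted in this thread, not measured quantities.
cognitive_model_bits = 10**9   # cognitive-model estimate of human memory
synapse_count = 10**15         # commonly cited synapse count

gap = synapse_count // cognitive_model_bits
print(f"gap: 10^{int(math.log10(gap))}")  # six orders of magnitude
```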

New concepts/correlations/associations can be established between events
(spikes) that are not initially aligned in any way, including events
separated by different delays in time (through axonal delays and spiking
sequences), so to catch regularities when and where they happen to appear,
a large enough number of synapse groups must already be 'on watch'.

-----
(*) By groups of synapses I mean sets of synapses that can excite a common
neuron; a single neuron can host multiple groups of synapses responsible
for multiple subsymbolic concepts. This is not neurologically grounded, just
a wild theoretical estimate.
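
For what it's worth, the footnote's picture can be turned into a toy
calculation. Every number here (neuron count, synapses per neuron, group
size) is an illustrative assumption, not neurological data:

```python
import math

# Toy version of the footnote's picture: disjoint sets of synapses on a
# neuron act as candidate detectors for subsymbolic concepts. All numbers
# below are illustrative assumptions, not neurological data.
neurons = 10**11              # assumed neuron count
synapses_per_neuron = 10**4   # assumed synapses per neuron
group_size = 100              # assumed synapses per candidate group

groups_per_neuron = synapses_per_neuron // group_size  # 100 disjoint groups
candidate_groups = neurons * groups_per_neuron         # ~10^13 'on watch'
stored_atoms = 10**6          # atoms a 10^9-bit model could index (see quote)

surplus = candidate_groups // stored_atoms
print(f"candidate groups per stored atom: 10^{int(math.log10(surplus))}")
```

On these assumed numbers, candidate groups on watch outnumber stored atoms by
about 10^7, which is at least consistent with a gap on the order of 10^6.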


On 10/19/07, Edward W. Porter <[EMAIL PROTECTED]> wrote:
>
>  Matt Mahoney's Thu 10/18/2007 9:15 PM post states
>
> MAHONEY>> There is possibly a 6 order of magnitude gap between the size of
> a cognitive model of human memory (10^9 bits) and the number of synapses in
> the brain (10^15), and precious little research to resolve this
> discrepancy.  In fact, these numbers are so poorly known that we aren't even
> sure there is a gap.
>
> EWP>> This gap, which Matt was so correct to highlight, is an important
> one, and points to one of the many crippling legacies of the small-hardware
> mindset.
>
> EWP>> I have always been a big believer in memory based reasoning, and for
> the last 37 years I have always assumed a human level representation of
> world knowledge would require something like 10^12 to 10^14 bytes, which is
> 10^13 to 10^15 bits (i.e., "within several orders of magnitude of the
> human brain," a phrase I have used so many times before on this list). My
> recollection is that after reading Minsky's reading list in 1970 and my
> taking of K-line theory to heart, the number I guessed at that time for
> world knowledge was either 10^15 bits or bytes, I forget which.  But, of
> course, my notions then were so primitive compared to what they are today.
>
> EWP>> Should we allow ourselves to think in terms of such big numbers?
> Yes.  Let's take 10^13 bytes, for example.
>
> EWP>> 10^13 bytes, with two thirds of it in non-volatile memory, and 10
> million simple RAM op processors, capable of performing about 20 trillion random
> RAM accesses/sec, and a network with a cross-sectional bandwidth of roughly
> 45 TBytes/sec (if you ran it hot), should be manufacturable at a marginal
> cost in 7 years of about $40,000, and could be profitably sold with
> amortization of development costs for several hundred thousand dollars if
> there were a market for several thousand of them -- which there almost
> certainly would be because of their extreme power.
>
> EWP>> Why so much more than the 10^9 bits mentioned above?
>
> EWP>> Because 10^9 bits only stores roughly 1 million atoms (nodes or
> links) with proper indexing and various state values.  Anybody who thinks
> that is enough to represent human-level world knowledge in all its visual,
> audio, linguistic, tactile, kinesthetic, emotional, behavioral, and social
> complexity hasn't thought about it in sufficient depth.
>
> EWP>> For example, my foggy recollection is that Serre's representation of
> the hierarchical memory associated with the portion of the visual cortex
> from V1
> up to the lower level of the pre-frontal cortex (from the paper I have cited
> so many times on this list) has several million pattern nodes (and, as Josh
> has pointed out, this is just for the mainly feedforward aspect of visual
> modeling).  This includes nothing for the vast majority of V1 and above, and
> nothing for audio, language, visual motion, association cortex, prefrontal
> cortex, etc.
>
> EWP>> Matt, I am not in any way criticizing you for mentioning 10^9 bits,
> because I have read similar numbers myself, and your post pointed with very
> appropriate questioning to the gap between that and what the brain would
> appear to have the capability to represent.  This very low number is just
> another manifestation of the small hardware mindset that has dominated the
> conventional wisdom in AI since its beginning.  If the only models one
> could make had to fit in the very small memories of most past machines, it
> is only natural that one's mind would be biased toward grossly simplified
> representation.
>
> EWP>> So forget the notion that 10^9 bits can represent human-level world
> knowledge. Correct me if I am wrong, but I think the memory required to
> store the representation in most current best-selling video games is 10 to
> 40 times larger.
>
> Ed Porter
>
> P.S. Please give me feedback on whether this technique of distinguishing
> original from responsive text is better than my use of all-caps, which
> received criticism.
>



-- 
Vladimir Nesov                            mailto:[EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=55285786-99631c
