--- On Sun, 11/2/08, John G. Rose <[EMAIL PROTECTED]> wrote:

> Still though I don't agree on your initial
> numbers estimate for AGI. A bit
> high perhaps? Your numbers may be able
> to be trimmed down based on refined assumptions.

True, we can't explain why the human brain needs 10^15 synapses to store 10^9 
bits of long-term memory (Landauer's estimate). Typical neural networks store 
0.15 to 0.25 bits per synapse.
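The size of that gap is easy to make concrete. A quick back-of-the-envelope 
calculation, using the figures above (0.2 bits/synapse is just the midpoint of 
the typical range):

```python
# Storage efficiency of the brain vs. typical neural networks,
# using the estimates quoted above.
synapses = 1e15    # synapses in the human brain
ltm_bits = 1e9     # Landauer's estimate of long-term memory capacity
typical = 0.2      # bits/synapse in typical neural networks (midpoint)

brain_bits_per_synapse = ltm_bits / synapses
print(brain_bits_per_synapse)           # 10^-6 bits per synapse
print(typical / brain_bits_per_synapse) # brain ~2*10^5 times less efficient
```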

I estimate that a language model with 10^9 bits of complexity could be 
implemented using 10^9 to 10^10 synapses. Time complexity is harder to 
estimate: a naive implementation would need around 10^18 to 10^19 operations 
to train on 1 GB of text, but this could be sped up significantly if only a 
small fraction of the neurons are active at any time.
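A sketch of where those numbers come from, assuming 1 GB of text is roughly 
10^9 training symbols and a naive implementation touches every synapse once 
per symbol (my reading of "naive"; the 1% active fraction below is an 
illustrative assumption, not a measured figure):

```python
# Naive training cost: every synapse updated once per input symbol.
symbols = 1e9                 # ~1 GB of text
for synapses in (1e9, 1e10):  # synapse counts estimated above
    naive_ops = symbols * synapses
    print(f"{synapses:.0e} synapses -> {naive_ops:.0e} ops")  # 10^18 to 10^19

# If only a small fraction of neurons fire at a time, only that fraction
# of the synapses need be touched per symbol (illustrative 1%):
active_fraction = 0.01        # hypothetical sparsity, for illustration only
sparse_ops = symbols * 1e10 * active_fraction
print(f"sparse: {sparse_ops:.0e} ops")  # ~10^17
```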

Just looking at the speed/memory/accuracy tradeoffs of various models at 
http://cs.fit.edu/~mmahoney/compression/text.html (the 2 graphs below the main 
table), it seems that memory is more of a limitation than CPU speed. A "real 
time" language model, one that learns language on a human timescale, would be 
allowed 10-20 years of training time.
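For scale: taking the naive 10^18-operation figure from above and spreading it 
over 10-20 years of training (both assumptions, as noted), the sustained rate 
required works out to a few billion operations per second:

```python
# Sustained ops/sec needed to finish 10^18 training operations
# within a human language-learning timescale (assumed figures from above).
seconds_per_year = 3600 * 24 * 365
total_ops = 1e18  # naive estimate for training on 1 GB of text
for years in (10, 20):
    rate = total_ops / (years * seconds_per_year)
    print(f"{years} years -> {rate:.1e} ops/sec")  # on the order of 10^9
```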

-- Matt Mahoney, [EMAIL PROTECTED]



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/