--- Richard Loosemore <[EMAIL PROTECTED]> wrote:
> Matt Mahoney wrote:
> > --- Richard Loosemore <[EMAIL PROTECTED]> wrote:
> >> Matt Mahoney wrote:
> >>> I was referring to Landauer's estimate of long term memory learning
> >>> rate of about 2 bits per second.
> >>> http://www.merkle.com/humanMemory.html
> >>> This does not include procedural memory, things like visual perception
> >>> and knowing how to walk. So 10^-6 bits is low. But how do we measure
> >>> such things?
> >> I think my general point is that "bits per second" or "bits per synapse"
> >> is a valid measure if you care about something like an electrical signal
> >> line, but is just simply an incoherent way to talk about the memory
> >> capacity of the human brain.
> >>
> >> Saying "0.000001 bits per synapse" is no better than opening and closing
> >> one's mouth without saying anything.
> >
> > "Bits" is a perfectly sensible measure of information. Memory can be
> > measured using human recall tests, just as Shannon used human prediction
> > tests to estimate the information capacity of natural language text. The
> > question is important to anyone who needs to allocate a hardware budget
> > for an AI design.
>
> If I take possession of a brand new computer with 1 terabyte hard-drive
> memory capacity, and then I happen to use it to store nothing but the
> (say) 1 gigabyte software it came with, your conclusion would be that
> each memory cell in the computer has a capacity of 0.001 bits.
>
> This is a meaningless number because *how* the storage is actually being
> used is not a sensible measure of its capacity.
>
> So, knowing that humans actually recall X amount of bits in the Shannon
> sense does not tell you how many "bits per synapse" are stored in the
> brain, it just tells you .... that humans recall X amount of bits in the
> Shannon sense, that is all.
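Before getting to Landauer's method, it helps to make the disputed numbers
explicit. Both figures come from simple division. A minimal sketch of the
arithmetic (the ~10^15 synapse count is the commonly cited ballpark, an
assumption rather than anything measured in this thread):

    # Back-of-the-envelope arithmetic behind the disputed figures.
    # Assumed inputs (order-of-magnitude values, not measurements):
    #   ~10^9 bits of recallable long-term memory (Landauer's estimate)
    #   ~10^15 synapses in the human brain (commonly cited ballpark)

    landauer_bits = 1e9
    synapses = 1e15
    print(landauer_bits / synapses)   # 1e-06 -- the "0.000001 bits per synapse"

    # The hard-drive analogy: dividing contents by raw capacity.
    stored = 1e9 * 8        # 1 GB of software, in bits
    capacity = 1e12 * 8     # 1 TB drive, in bits
    print(stored / capacity)          # 0.001 -- "bits per cell" by the same division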
Landauer used tests like having people look at thousands of photos, then
testing them with a second series (some seen before, some novel) and asking
whether each had been seen before. He ran similar tests with words, numbers,
music clips, etc., and in every case the learning rate came out around 2 bits
per second.

His tests were similar to those done by Standing, who had subjects look at up
to 10,000 photos (one every 5 seconds) and take a recognition test 2 days
later, scoring about 80% accuracy (vs. 71% for lists of random words). This is
the result you would expect if each picture or word were encoded with about 14
or 15 bits.

It would be interesting to run similar tests for procedural memory. (How many
bits of code do you need to ride a bicycle?) But I doubt procedural memory
would explain all of the 10^6 discrepancy.

In any case, 10^9 bits is what Turing estimated in 1950, and it is about how
much language a person is exposed to in a couple of decades. I think it's a
useful number to keep in mind for building AI, but in my experience with
language modeling for compression, you will probably need a lot more memory
than that for reasonable performance. Apparently the brain does too.
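As a back-of-the-envelope check (my own arithmetic, not Landauer's actual
derivation, which also corrects for guessing and forgetting), the 14-15 bits
per item, the 2 bits per second rate, and Turing's 10^9 bits are at least
mutually consistent:

    import math

    # Distinguishing one item among 10,000 studied pictures takes about
    # log2(10000) bits, in the neighborhood of the 14-15 bits cited above.
    print(math.log2(10000))         # ~13.3 bits

    # Implied learning rate: ~14 bits per item, one item per 5 seconds,
    # treating the 80% accuracy as the fraction retained (a simplification).
    print(14 * 0.80 / 5)            # ~2.2 bits/second, close to Landauer's 2

    # Turing's 10^9 bits: ~2 bits/second over two decades of waking life.
    seconds = 16 * 3600 * 365 * 20  # 16 waking hours/day for 20 years
    print(2 * seconds)              # ~8.4e8, i.e. on the order of 10^9 bits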
References

Landauer, T. K. (1986), "How much do people remember? Some estimates of the
quantity of learned information in long term memory", Cognitive Science, 10,
pp. 477-493.

Standing, L. (1973), "Learning 10,000 Pictures", Quarterly Journal of
Experimental Psychology, 25, pp. 207-222.

Shannon, C. E. (1951), "Prediction and Entropy of Printed English", Bell
System Technical Journal, 30, pp. 50-64.

Turing, A. M. (1950), "Computing Machinery and Intelligence", Mind, 59,
pp. 433-460.

--
Matt Mahoney, [EMAIL PROTECTED]