More precisely, the human brain writes language and images into short-term
conscious memory at about 10 bits per second. NY Times article:

https://click.aaas.sciencepubs.org/?qs=6d3629a4113cca416839e1820a074b9b73ae7a6d2d63d050c2856a2e196113ea7f5a906e9664f1923fa9488de52ed1ade67e246e4015892f

Should redirect to
https://www.nytimes.com/2024/12/26/science/speed-of-thought.html
without the paywall.

Abstract: https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Landauer in the 1980s estimated human long-term memory capacity at 10^9
bits. At 10 bits per second, we write about 10^10 bits into conscious
memory over 30 years (10^9 seconds), so this implies we forget 90% of what
we learn.
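The arithmetic behind that estimate fits in a few lines (the rates and
capacity are the figures from the text; the 10^9-second figure for 30
years is rounded):

```python
# Back-of-envelope check of the forgetting estimate.
BITS_PER_SECOND = 10       # conscious write rate
SECONDS_30_YEARS = 10**9   # 30 years is ~9.5 x 10^8 s, rounded up
CAPACITY_BITS = 10**9      # Landauer's long-term memory estimate

bits_written = BITS_PER_SECOND * SECONDS_30_YEARS   # 10^10 bits
fraction_retained = CAPACITY_BITS / bits_written    # 0.1
print(f"written: {bits_written:.0e} bits, retained: {fraction_retained:.0%}")
# -> written: 1e+10 bits, retained: 10%
```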

The retina produces about 10^9 bps per eye, which is compressed to 10^7 bps
at the optic nerve (10^6 fibers at 10 bps each). Relative to the 10 bps of
conscious throughput, this implies a compression ratio of 10^-8 on raw
retinal images and 10^-6 on the optic nerve signal reaching the brain.
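As a quick sanity check on those ratios (all rates taken from the figures
above):

```python
# Compression ratios implied by the data rates above.
RETINA_BPS = 10**9        # raw output per eye
OPTIC_NERVE_BPS = 10**7   # 10^6 fibers at 10 bps each
CONSCIOUS_BPS = 10        # conscious throughput

ratio_from_retina = CONSCIOUS_BPS / RETINA_BPS       # 10^-8
ratio_from_nerve = CONSCIOUS_BPS / OPTIC_NERVE_BPS   # 10^-6
print(f"{ratio_from_retina:.0e}, {ratio_from_nerve:.0e}")  # -> 1e-08, 1e-06
```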

The human brain has 6 x 10^14 synapses. Why does it need 600,000 synapses
(6 x 10^14 / 10^9) to store 1 bit? A Hopfield associative memory (a
symmetric neural network with no hidden layer) stores about 0.3 bits per
parameter (0.15 bits per connection). I have two theories to explain this
discrepancy, but not much confidence in either one. The first is that the
vast majority of human memory is implicit or unconscious, and we don't
know how to measure it. This would be knowledge like recognizing faces or
riding a bicycle. The second is that synapses require a lot of redundancy
to make very small weight changes when individual synapses are binary,
either connected or not. In my own language models I use learning rates
around 0.001 on weights stored with 20 bits of precision. When reading the
weights back, I can discard the 10 low bits before losing prediction
accuracy. To do this with binary weights and probabilistic updates, I
think I would need about 10^6 times as many synapses.
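The redundancy theory can be illustrated with a toy simulation. The
numbers here are illustrative assumptions, not measurements: a pool of
binary synapses represents one effective weight as the fraction that are
on, and a small gradient step is applied by flipping each synapse with a
matching small probability. Per-step noise shrinks only as 1/sqrt(N), so
high effective precision demands a large pool:

```python
import random

random.seed(0)  # deterministic for the printed example

def stochastic_update(synapses, delta):
    """Nudge the pool's mean by roughly delta using random bit flips."""
    p = abs(delta)
    target = 1 if delta > 0 else 0
    return [target if (s != target and random.random() < p) else s
            for s in synapses]

N = 100_000        # redundancy: binary synapses backing one effective weight
pool = [0] * N
for _ in range(100):                 # 100 gradient steps of +0.001 each
    pool = stochastic_update(pool, 0.001)

# Lands near 0.1, slightly less because already-on synapses saturate.
effective_weight = sum(pool) / N
print(f"effective weight after 100 steps: {effective_weight:.3f}")
```

With N = 100,000 the sampling noise per step is under 0.1% of the pool;
shrinking the pool makes each step correspondingly noisier, which is the
sense in which binary, probabilistically updated synapses need vastly more
units to mimic one high-precision weight.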

In theory it should be possible to train a human-level LLM without vision
on 1 GB of text (about 10^9 bits after compression) and store it using 3B
parameters, which at 0.3 bits per parameter hold about 10^9 bits. Current
LLMs are larger because they are far beyond human in breadth,
understanding hundreds of languages.
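Using the ~0.3 bits per parameter figure from the Hopfield discussion
above, the 3B-parameter estimate is just this division:

```python
# Sizing arithmetic for a text-only human-level LLM (figures from above).
TEXT_BITS = 10**9            # ~1 GB of text after compression
BITS_PER_PARAM = 0.3         # Hopfield-style storage density

params_needed = TEXT_BITS / BITS_PER_PARAM
print(f"{params_needed:.1e} parameters")   # -> 3.3e+09 parameters
```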

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T599834c0d9f7e4d4-Mc86bed21d29eefc66b31a99c
Delivery options: https://agi.topicbox.com/groups/agi/subscription
