Walter Bright:

>is that biology achieves reliability by using redundancy, not by requiring 
>individual components to be perfect. The redundancy goes down to the DNA 
>level, even. Another way is it uses quantity, rather than quality. Many 
>organisms produce millions of offspring in the hope that one or two survive.<

Quantity is a form of redundancy.

In biology reliability is achieved in many ways.
The genetic code is degenerate, so many single-letter mutations don't change
the encoded protein at all. This makes a good share of mutations neutral.
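To make that concrete, here is a small Python sketch (using the standard
codon table) that counts how many single-letter substitutions leave the
encoded amino acid unchanged; roughly a quarter of them turn out to be silent:

from itertools import product

bases = "TCAG"
# Standard genetic code: codons listed in TCAG order (TTT, TTC, ..., GGG),
# "*" marks a stop codon.
aa = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
code = {"".join(c): aa[i] for i, c in enumerate(product(bases, repeat=3))}

silent = total = 0
for codon, amino in code.items():
    for pos in range(3):
        for b in bases:
            if b == codon[pos]:
                continue  # not a mutation
            mutant = codon[:pos] + b + codon[pos + 1:]
            total += 1
            silent += code[mutant] == amino

print(f"{silent} of {total} single-letter changes leave the product unchanged "
      f"(~{100.0 * silent / total:.0f}%)")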

Bones and tendons use redundancy too to be resilient, but they aren't flat
organizations: they are hierarchies of structures inside larger structures, at
all levels, from the molecular level up. This gives them a different pattern
of failures, like earthquakes (many small ones, few large ones, following a
power-law distribution).
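Just to show what a power-law size distribution looks like, a tiny Python
sketch (the exponent 1.5 is arbitrary, picked only for the illustration):

import math
import random
from collections import Counter

random.seed(0)

# "Many small failures, few large ones": sample event sizes from a Pareto
# (power-law) distribution and count how many events fall in each decade.
sizes = (random.paretovariate(1.5) for _ in range(100_000))
per_decade = Counter(int(math.log10(s)) for s in sizes)
for d in sorted(per_decade):
    print(f"size {10 ** d:>6} .. {10 ** (d + 1):>7}: {per_decade[d]:>6} events")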

A protein is a chain of small parts, and its function is partially determined
by its shape. That shape is mostly self-assembled, but sometimes other
proteins (chaperones) help a protein fold correctly, especially when the
temperature gets too high.

Most biological systems are able to self-repair. That usually means cells
dying and duplicating, and sometimes rebuilding harder structures like bone.
It happens at the sub-cellular level too: cells have many systems to repair
and clean themselves, constantly destroying and rebuilding their own parts at
all levels. You can see it among neurons as well: your memory is encoded
(among other things) in the connections between neurons, but neurons die. New
connections can form even among very old neurons, replacing the missing wiring
and keeping the distributed memory functional even 100 years after the events,
in very old people.

Genetic information is encoded in multiple copies, and in bacteria it is
sometimes distributed across the population. Reliability is also needed when
the genetic information is copied or read; it comes from a trade-off between
the energy spent on the copy, how reliable you want the read/copy to be, and
how fast you want it (ribosomes and DNA polymerase sit roughly at the
theoretical optimum of this three-variable trade-off: you can't do better even
in theory).
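As a toy illustration of the multiple-copies idea (this is just the redundancy
principle, not how the cell actually proofreads DNA), a Python sketch that
recovers a sequence by per-position majority vote across several noisy copies:

import random

random.seed(42)

ORIGINAL = "ATGGCCATTGTAATGGGCCGC"   # made-up sequence, only for the example
ERROR_RATE = 0.05                    # per-letter chance of a read error
N_COPIES = 5

def noisy_copy(s, p):
    """Copy a sequence, rewriting each letter with probability p."""
    return "".join(random.choice("ACGT") if random.random() < p else c
                   for c in s)

def majority_consensus(copies):
    """Per-position majority vote across all the copies."""
    return "".join(max("ACGT", key=col.count) for col in zip(*copies))

copies = [noisy_copy(ORIGINAL, ERROR_RATE) for _ in range(N_COPIES)]
print("copies identical to the original:", sum(c == ORIGINAL for c in copies))
print("consensus equals the original:   ",
      majority_consensus(copies) == ORIGINAL)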

Control systems, like those in the brain, seek reliability in several
different ways. One of them is encoding vectors across a population of
neurons. The final direction your finger points to is computed as a vector
average over that population. Parkinson's disease can kill 90% of the cells in
certain brain areas, yet the patient can still move a hand to grab a glass of
water (a little shakily, because the average is computed over far fewer
vectors).
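A hedged sketch of that population-vector idea in Python (the numbers are
invented and the cosine tuning is a big simplification): every simulated
neuron has a preferred direction, the decoded direction is the rate-weighted
average of those directions, and it survives the loss of 90% of the
population, just with more noise:

import numpy as np

rng = np.random.default_rng(0)

n_neurons = 1000
preferred = rng.uniform(0.0, 2.0 * np.pi, n_neurons)   # preferred angles
true_direction = np.deg2rad(30.0)                      # the target: 30 degrees

# Firing rate ~ cosine of the angle between target and preferred direction,
# plus noise.
rates = np.cos(true_direction - preferred) + rng.normal(0.0, 0.3, n_neurons)

def decode(rates, preferred):
    """Rate-weighted vector average of the preferred directions."""
    x = np.sum(rates * np.cos(preferred))
    y = np.sum(rates * np.sin(preferred))
    return np.rad2deg(np.arctan2(y, x))

alive = rng.random(n_neurons) > 0.9      # kill ~90% of the neurons
print(f"all neurons:        {decode(rates, preferred):6.1f} degrees")
print(f"10% of the neurons: {decode(rates[alive], preferred[alive]):6.1f} degrees")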

There is enough material here for more than one popular-science article :-)


>how would you write a program that would be expected to survive having a 
>random bit in it flipped at random intervals?<

That's a nice question. The program and all its data are stored somewhere,
usually in RAM, caches, and registers. How can you use a program if the bits
in your RAM can flip at random with a certain (low) probability? There are
error-correcting RAM modules, based on redundant codes like Reed-Solomon. ECC
memory is common enough today. Similar error-correction schemes can be added
to the inner parts of the CPU too (and probably someone has done it, for
example in CPUs that must work in space on satellites, where solar radiation
is not shielded by the Earth's atmosphere).
I am sure related schemes can be used to test whether a CPU instruction has
done its job or whether something went wrong during its execution. You can fix
such things in hardware too.
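As a minimal sketch of the error-correction idea (much simpler than
Reed-Solomon), here is a toy Hamming(7,4) encoder/decoder in Python that
corrects any single flipped bit in a 7-bit word:

def hamming74_encode(d):
    """4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct at most one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]      # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]      # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]      # checks positions 4, 5, 6, 7
    error_pos = s1 + 2 * s2 + 4 * s3    # 0 = no error, else 1-based position
    if error_pos:
        c = c[:]
        c[error_pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[5] ^= 1                            # flip one bit "at random"
recovered = hamming74_decode(word)
print("recovered:", recovered, "- correct:", recovered == data)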

But there are other solutions besides fixing all the errors. Today chips keep
getting smaller, and the power per transistor keeps going down. Eventually
noise and errors will start to grow. Recently some people have realized that
on the screen of a mobile phone you can tolerate a few wrongly decompressed
pixels in a video, if that allows the chip to use only 1/10 of the usual
energy. Sometimes you accept a few wrong pixels here and there if they let you
keep watching videos on your phone for twice as long. In the future CPUs will
probably become less reliable, so the software (mostly the operating system, I
think) will need to invent ways to fix those errors. This will keep programs
globally reliable even on fast, low-powered CPUs. Molecular-scale adders will
need software to fix their errors. Eventually this is going to look more and
more like cellular biochemistry, with all its active redundancy :-)
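A hedged sketch of what "software fixing the hardware's errors" could look
like (the unreliable adder below is invented, and a real system would do this
in the OS or runtime, not around every single addition): run the flaky
operation three times and take the majority result, at the price of doing
three times the work:

import random
from collections import Counter

random.seed(1)

def unreliable_add(a, b, flip_prob=0.01):
    """Stand-in for a low-power adder: sometimes flips one bit of the result."""
    result = a + b
    if random.random() < flip_prob:
        result ^= 1 << random.randrange(16)
    return result

def voted_add(a, b):
    """Software triple modular redundancy: run three times, take the majority."""
    results = [unreliable_add(a, b) for _ in range(3)]
    value, votes = Counter(results).most_common(1)[0]
    if votes >= 2:
        return value
    return voted_add(a, b)              # all three disagreed: retry

errors_raw = sum(unreliable_add(i, i) != 2 * i for i in range(100_000))
errors_voted = sum(voted_add(i, i) != 2 * i for i in range(100_000))
print("raw adder errors:  ", errors_raw)
print("voted adder errors:", errors_voted)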

There's no end to the number of things you can say on this topic.

Bye,
bearophile
