--- Eugen Leitl <[EMAIL PROTECTED]> wrote:
> > Google already have enough computing power to do a crude simulation of a
> > [human brain].
> 
> Um, no. It takes 64 kNodes of Blue Gene/L to do about 8 1/10th-speed
> crudely-simulated mice, or about one realtime cartoon mouse (assuming
> the code would scale, which it wouldn't).

The Blue Gene/L simulation is at a lower level than is needed to do useful AI.
You don't need millisecond resolution.  In most neural models, the important
signal is the firing rate, not the individual pulses.  I realize there are
exceptions, such as the transmission of phase information for binaural sound
localization up to about 1500 Hz.  But for most purposes, neurons have an
information rate of about 10 bits per second.  This was measured in tactile
sensation in the finger.  (Sorry, I don't have the references.)  In any case,
0.1 seconds is about the smallest perceptible time unit in humans.

The human brain has about 10^11 neurons with 10^4 synapses each, or about 10^15
synapses in total.  Each synapse represents about 1 bit of memory (Hopfield
model), so you need about 10^15 bits of memory.  If each synapse is updated
roughly 10 times per second (matching the 0.1 second resolution above), that is
about 10^16 operations per second.
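
As a back-of-the-envelope check, here is that arithmetic in a few lines of
Python (the 10 Hz update rate is just the assumption stated above, not a
measured figure):

  # Rough order-of-magnitude estimate of brain-scale requirements,
  # using the figures quoted above (Hopfield-style 1 bit per synapse).
  neurons = 1e11             # ~10^11 neurons in the human brain
  synapses_per_neuron = 1e4  # ~10^4 synapses per neuron
  updates_per_second = 10    # ~10 Hz, matching the 0.1 s perceptual time step

  synapses = neurons * synapses_per_neuron        # ~10^15 synapses
  memory_bits = synapses * 1                      # ~10^15 bits at 1 bit/synapse
  ops_per_second = synapses * updates_per_second  # ~10^16 synapse updates/s

  print("memory: %.0e bits, compute: %.0e ops/s" % (memory_bits, ops_per_second))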

Google has about 10^5 PCs with 2-4 GB of memory each, connected by high-speed
Ethernet.  That is 2-4 x 10^14 bytes, or a few times 10^15 bits, so they have
enough memory.  They have a mix of different processors, but a modern processor
can execute 10^10 to 10^11 16-bit multiply-add instructions per second using
SIMD (SSE2) instructions (more if you add a GPU).  Across 10^5 machines that is
10^15 to 10^16 operations per second, so they have enough computing power, or
at least close enough to do useful experiments.
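
The same kind of sketch for the cluster side (the machine count and per-machine
figures are just the estimates above, not measured numbers):

  # Back-of-envelope capacity of the assumed cluster.
  machines = 1e5                  # ~10^5 PCs (estimate above)
  mem_bits_per_machine = 4e9 * 8  # 4 GB per PC in bits (2 GB would halve this)
  ops_per_machine = 1e10          # 16-bit multiply-adds/s with SSE2 (up to ~1e11)

  total_memory_bits = machines * mem_bits_per_machine  # ~3e15 bits vs ~10^15 needed
  total_ops = machines * ops_per_machine               # 1e15-1e16 ops/s vs ~10^16 needed

  print("memory: %.0e bits, compute: %.0e ops/s" % (total_memory_bits, total_ops))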



-- Matt Mahoney, [EMAIL PROTECTED]

