On 10/25/2011 11:41 PM, Eugen Leitl wrote:
On Tue, Oct 25, 2011 at 10:17:24AM -0700, BGB wrote:
I was not arguing about the limits of computing, but rather about IBM's
specific design.
it doesn't really emulate real neurons realistically; rather it is a
Real neurons have many features, many of them unknown, and do not
map well into solid state as is. However, you can probably find a
simplified model which is powerful and generic enough and maps
well to solid state by co-evolving substrate and representation.

well, yes, one likely doesn't need to fake all of it.
a question though is how much one needs before it does much of anything "interesting".

if it maxes out at essentially doing simple filtering or pattern matching, this is not so compelling.


from what I can gather, a simplistic "accumulate and fire" model, with
the neurons hard-wired into a grid.
In principle you can use a hybrid model by using a crossbar for
local connectivity which is analog, and a packet-switched signalling
mesh for long-range interactions, similarly to how real neurons do
it. The mesh can emulate total connectivity fine, and you can probably
even finagle something which scales better than a crossbar locally.

from what I read, IBM was using a digital crossbar.
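
for reference, my rough understanding of that kind of "accumulate and fire" model, as a crude C sketch (the grid size, threshold, and leak values here are made up, and this is not claiming to be IBM's actual design):

#include <stdint.h>
#include <string.h>

#define N 256                 /* neurons in one grid/core (arbitrary) */

static int16_t weight[N][N];  /* crossbar: weight[i][j] = synapse from j to i */
static int32_t accum[N];      /* per-neuron accumulator ("membrane" value) */
static uint8_t spike[N];      /* 1 if the neuron fired this tick */

#define THRESH 1024           /* firing threshold (arbitrary) */
#define LEAK      1           /* per-tick leak (arbitrary) */

void net_tick(void)
{
    uint8_t fired[N];
    int i, j;

    memcpy(fired, spike, sizeof(fired));
    memset(spike, 0, sizeof(spike));

    for(i=0; i<N; i++)
    {
        /* accumulate input from everything that fired last tick */
        for(j=0; j<N; j++)
            if(fired[j])accum[i]+=weight[i][j];

        /* leak a little, then fire and reset if over the threshold */
        if(accum[i]>0)accum[i]-=LEAK;
        if(accum[i]>=THRESH) { spike[i]=1; accum[i]=0; }
    }
}

basically just weighted sums and a threshold, which is partly why I suspect it maxes out at simple filtering and pattern matching.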


I suspect something more "generic" would be needed.
I don't see how generic will do much good long-term, other than for bootstrap
(above co-evolution) reasons.

more generic is more likely to be able to do something interesting.


another question is what can be done in the near term and on present
hardware (future hardware may or may not exist, but any new hardware may
take years to make it into typical end-user systems).
Boxes with large numbers of ARM SoCs with embedded memory and a signalling
mesh have been sighted; arguably this is the way to go for large-scale.
GPGPU approaches are also quite good, if you map your neurons to a 3d
array and stream through memory sequentially. Exchanging interface state
with adjacent nodes (which can even be over GBit Ethernet) is cheap enough.

ARM or similar could work (as could a large number of 386-like cores).

I had considered something like GPGPU, but depending on the type of neural-net, there could be issues with mapping it efficiently to existing GPUs.

also, the strong areas of GPUs are not necessarily the same as the requirements for implementing neural nets. again, it could work, but it is just less certain that it is "ideal".
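
to give an idea of the "map to a 3d array and stream through memory" part, here is plain C standing in for what a GPU kernel would do (the array sizes and the toy update rule are made up, just to show the access pattern):

#include <stddef.h>

/* neurons laid out as a flat 3d array, double-buffered; each neuron only
   reads a small fixed neighborhood, so a full update is one sequential
   sweep over memory, which is the part GPUs are good at. */
#define NX 64
#define NY 64
#define NZ 64
#define IDX(x,y,z) ((size_t)(z)*NY*NX + (size_t)(y)*NX + (x))

static float state[2][NX*NY*NZ];

void net_step(int cur)
{
    int nxt = !cur;
    int x, y, z;
    float s;

    for(z=1; z<NZ-1; z++)
      for(y=1; y<NY-1; y++)
        for(x=1; x<NX-1; x++)
        {
            /* toy rule: average the 6 face neighbors and clamp;
               a real net would apply per-link weights and a nonlinearity */
            s = state[cur][IDX(x-1,y,z)] + state[cur][IDX(x+1,y,z)] +
                state[cur][IDX(x,y-1,z)] + state[cur][IDX(x,y+1,z)] +
                state[cur][IDX(x,y,z-1)] + state[cur][IDX(x,y,z+1)];
            s *= (1.0f/6);
            state[nxt][IDX(x,y,z)] = (s>1.0f)?1.0f:s;
        }
}

the regular/local part maps fine; my worry is more the irregular long-range connectivity, which doesn't stream nearly as nicely.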


the second part of the question is:
assuming one can transition to a purely biology-like model, is this a
good tradeoff?...
if one gets rid of a few of the limitations of computers but gains some
of the limitations of biology, this may not be an ideal solution.
Biology never had to deal with high-performance numerics;
I'm sure if it had, it wouldn't do too shabbily. You can always go hybrid,
e.g. if you want to do proofs or cryptography.

biology also doesn't have software distribution, the ability to make backups, ...

ideally, a silicon neural-net strategy would also readily allow the installation of new software and creation of backups.


the most likely strategy here IMO is to leverage what existing OS's can already do, essentially treating any neural-net processors as another piece of hardware as far as the OS is concerned.

this would probably mean using neural-net processors alongside traditional CPU cores (rather than in place of them).
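
as a hypothetical example of what that could look like (the "/dev/nnp0" device and the read/write protocol here are things I am inventing for illustration, not any real driver):

#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>

/* run one step on a hypothetical neural-net processor exposed as a
   character device: write the input vector, read back the output vector. */
int nnp_run_step(const int16_t *inputs, int16_t *outputs, int n)
{
    int fd;

    fd = open("/dev/nnp0", O_RDWR);
    if(fd < 0) return -1;

    if(write(fd, inputs, n*sizeof(int16_t)) < 0) { close(fd); return -1; }
    if(read(fd, outputs, n*sizeof(int16_t)) < 0) { close(fd); return -1; }

    close(fd);
    return 0;
}

the OS then keeps doing what it already does (scheduling, filesystems, backups, software installs), and the neural-net hardware is just another thing a normal process can open and talk to.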


better would be to try for a strategy where the merits of both can be
gained, and as many limitations as possible can be avoided.

most likely, this would be via a hybrid model.
Absolutely. Hybrid at many scales, down to analog computation for neurons.

yeah.

analog is an idea I had not personally considered.

I guess a mystery here is how effectively semiconductor logic (in integrated circuits) can work with analog signals.

the main alternative is, of course, 8- or 16-bit digital signals.
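
for example, a 16-bit signal could just be fixed-point; say Q8.8, which is an arbitrary choice on my part (a minimal sketch):

#include <stdint.h>

typedef int16_t q8_8;        /* Q8.8 fixed point: 1.0 is represented as 256 */

/* weighted sum of 16-bit inputs, saturated to the 16-bit range */
q8_8 neuron_sum(const q8_8 *in, const q8_8 *w, int n)
{
    int64_t acc = 0;
    int i;

    /* multiply-accumulate in a wider register, then shift back to Q8.8 */
    for(i=0; i<n; i++)
        acc += (int32_t)in[i]*w[i];
    acc >>= 8;

    /* saturate rather than wrap on overflow */
    if(acc >  32767) acc =  32767;
    if(acc < -32768) acc = -32768;
    return (q8_8)acc;
}

this is cheap to do in bulk on ordinary integer hardware, at the cost of limited range/precision (which may or may not matter much for neural nets).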


I had more imagined it as hybrids of neural-nets and traditional software.

granted, there is always a certain risk that it could mean, in some distant-future setting, people sitting around with Windows running in their heads (say, because they have incorporated processors and silicon neural-nets into their otherwise-wetware brains, or because parts, or much, or all, of their brain functionality has been migrated into silicon).


or such...

