--- "John G. Rose" <[EMAIL PROTECTED]> wrote:

> > From: Matt Mahoney [mailto:[EMAIL PROTECTED]
> > 
> > By "equivalent computation" I mean one whose behavior is
> > indistinguishable
> > from the brain, not an approximation.  I don't believe that an exact
> > simulation requires copying the implementation down to the neuron level,
> > much
> > less the molecular level.
> > 
> 
> So how would you approach constructing such a model? I suppose a superset
> intelligence structure could analyze the properties and behaviors of a
> brain and simulate it within itself. If it absorbed enough data, it could
> reconstruct the brain and eventually come up with something close.

Well, nobody has solved the AI problem, much less the uploading problem. 
Consider the problem in stages:

1. The Turing test.

2. The "personalized" Turing test.  The machine pretends to be you and the
judges are people who know you well.

3. The "planned, personalized" Turing test.  You are allowed to communicate
with judges in advance, for example, to agree on a password.

4. The "embodied, planned, personalized" Turing test.  Communication is not
restricted to text.  The machine is planted in the skull of your clone.  Your
friends and relatives have to decide who has the carbon-based brain.

Level 4 should not require simulating every neuron and synapse.  Without the
constraints of slow, noisy neurons, we could use other algorithms.  For
example, low-level visual processing such as edge and line detection would
not need to be implemented as a 2-D array of identical filters; it could be
implemented serially by scanning the retinal image with a window filter.
Fine motor control would not need to be implemented by combining thousands
of pulsing motor neurons to get a smooth average signal; the signal could be
computed numerically.  The brain has about 10^15 synapses, so a
straightforward simulation at the neural level would require on the order of
10^15 bits of memory.  But cognitive tests suggest humans have only about
10^9 bits of long term memory, which suggests that a far more compressed
representation is possible.
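
To make the window-filter point concrete, here is a rough sketch in Python
(assuming NumPy; the function name, Sobel kernel, and test image are only
illustrative, not a claim about how the brain or any particular system would
do it).  A single serial loop scanning the image computes the same responses
that a 2-D array of identical filters would compute in parallel:

import numpy as np

def serial_edge_detect(image, kernel):
    """Scan `image` serially with a small window filter and return the
    filter response at every valid window position."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):        # one window position at a time,
        for j in range(out.shape[1]):    # no parallel hardware required
            window = image[i:i + kh, j:j + kw]
            out[i, j] = np.sum(window * kernel)
    return out

# Illustrative test: a vertical step edge in an 8x8 image.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)  # standard Sobel x kernel
print(serial_edge_detect(img, sobel_x))  # strongest response at the edge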

In any case, level 1 should be sufficient to argue convincingly either that
consciousness can exist in machines, or that it doesn't exist in humans.


-- Matt Mahoney, [EMAIL PROTECTED]
