meekerdb wrote:

On 4/1/2015 10:42 PM, Bruce Kellett wrote:
Russell Standish wrote:
On Thu, Apr 02, 2015 at 02:48:47AM +0200, Platonist Guitar Cowboy wrote:
I still don't see what the MGA "pumps intuitively and incorrectly", as you seem
to assume that the MGA is a "bad" intuition pump rather than a "good" one that
facilitates seeing something tricky. You've not shown that consciousness
supervenes on broken gates; you don't treat movies as conscious entities; and
you haven't pointed towards a recording that is obviously or demonstrably
conscious.


It is one thing to argue intuitively that playing Casablanca does not
instantiate Humphrey Bogart's consciousness. That I would happily agree
with. It only involves a few hundred kilobytes per second. It is another thing to
argue that a precise recording of the firings of every neuron in
someone's brain similarly doesn't instantiate consciousness (at around
10^11 neurons per typical human brain, this would be something of the
order of 10^16 bytes per second). This is the sort of recording being
used in Maudlin's thought experiment/MGA. And obviously, according to
COMP, a huge lookup table encoding the machine's output for every
possible input for a machine implementing a conscious moment (which is
just another type of recording, albeit a very complex one that would
exceed the Seth Lloyd bound for the universe) must be conscious. Note
that this latter type of device was used in Searle's Chinese Room
argument, and I think it needs to be answered the same way Dennett
answers the Chinese Room argument.
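
(For a rough sense of where a figure of that order could come from, assuming
purely for illustration that each neuron's state is sampled at about 1 kHz
with something like 100 bytes per sample for membrane potential and synaptic
variables: 10^11 neurons x 10^3 samples/sec x 10^2 bytes/sample = 10^16
bytes/sec. Different assumptions about temporal resolution and per-neuron
state would shift the estimate by a few orders of magnitude, but it would
still dwarf the data rate of a film.)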

At some point on the complexity scale, recordings go from being not
conscious to conscious. Where do you draw the line? I'm afraid
intuition does not help much in this matter, which is why I say it is
a weakness of the MGA.

I agree about Humphrey Bogart and films of that sort. But no one seriously argues that consciousness supervenes on the external visage -- or do they?

People's intuitions break down when faced with the complexity of 10^11 neurons and 10^16 bytes per second. I don't know what the data rates at the LHC are, but they reach into the trillions of bytes. And they have all sorts of sophisticated fast electronic triggers to try and keep the data rate down to manageable levels.

For brains, I don't think there is any preset level of complexity at which consciousness kicks in. The average human adult with full functionality is really quite complicated. But a person can remain conscious with extensive brain damage, and depending on the type of damage, they can retain reasonable functionality. How much damage from Alzheimer's before you lose consciousness? The minimal number of functional neurons might be quite low. That is why Brent is concerned that the Mars Rover might have an anxiety attack!

But data *rate* can't be the right measure, since the same sequence of states measured against some other clock would be just as conscious. It must be the complexity of brain processes as compared to some other processes, which is why I think the environment/context is an essential part of consciousness.

Yes, the mention of data rates was irrelevant. Simulating brain processes by calculating with pencil and paper should not compromise consciousness if comp is correct, provided the simulation captures sufficient detail of the structure and processes.

But is relative complexity the measure? Relative to what? And I think one could be conscious in a fairly limited environment/context.

Bruce
