On 22 Sep 2014, at 03:22, meekerdb wrote:

On 9/21/2014 5:07 PM, Telmo Menezes wrote:


On Mon, Sep 22, 2014 at 1:34 AM, LizR <lizj...@gmail.com> wrote:
Good point, Brent, and one on which I am also equivocal, which is why I have been keen to tease out whether people are talking about consciousness or the contents of consciousness, and to try to work out whether there is, in fact, any difference. If there isn't, consciousness becomes something like élan vital, a supposed magic extra that isn't in fact necessary in explanatory terms - all that exists are "bundles of sensations" (or however Hume phrased it).

But in materialism we still have a magic extra: matter itself. In the MUH, math is the magic extra. I don't know of any theory that gets rid of all "magic" assumptions.

True. But matter explains lots of other stuff. Consciousness as a pure potentiality, distinct from any content, doesn't explain anything.

Right. Those things have to be explained in a simpler theory, as comp forces us to use arithmetic, or anything Turing-equivalent to arithmetic.

But I thought that no one (except Craig and some others) suggests that consciousness is a potentiality, although it might be related to one (through <>p, indeed), and no one is suggesting that this explains everything.
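(To fix the notation, for those unfamiliar with it - a sketch of the standard Gödel-Löb provability reading, which seems to be the one intended here:

  \Box p      :=  Bew(\ulcorner p \urcorner)      "p is provable" (by the machine, e.g. PA)
  \Diamond p  :=  \neg \Box \neg p                "p is consistent" with what the machine proves

So <>p, "it is consistent that p", is the arithmetical counterpart of "p is possible", which is how potentiality enters the picture.)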






In reply to John's comment: we don't know for sure that certain types of brain activity cause consciousness; that's a (very reasonable) hypothesis, based on the fact that the two appear to be always correlated.

We don't even know if they are strongly correlated, because we don't know what else is conscious.

And we don't know that other people are conscious. But as JKC pointed out, we do know that things that affect our brain affect our consciousness. Quite aside from anesthesia and concussions that make it go away (modulo your theory that we merely forget), it is affected by whiskey and pot and salvia and LSD, and the effects are even amenable to some explanation at the molecular level.

But that explanation is only partially correct, and imprecise, as it uses the 1-1 identity, and not the many-to-1 identity needed in Everett and/or in computationalism.






Is an insect swarm conscious? Is your computer? Are galaxies? The problem is that we might be confusing empathy with consciousness. It is clear that the more similar an organism is to us, the more empathy we feel (human > monkey > cat > insect > bacteria, ...).

That's true on Bruno's definition of consciousness. But that's not the consciousness that we are told is indubitable and which we all intuitively know we have.

? I don't see why you say that. I have no definition of consciousness, except that I explain a bit of it when explaining why comp solves the hard part of the problem, by explaining why machines get the hard question and realize that it cannot be explained at their level (they need propositions in their own G*).
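(Again a sketch, for reference, of the standard G / G* distinction due to Solovay:

  G  : the modal logic GL, i.e. K plus Löb's axiom \Box(\Box p \to p) \to \Box p;
       it axiomatizes exactly what the machine can *prove* about its own provability.
  G* : the theorems of G plus the reflection axiom \Box p \to p, closed under
       modus ponens only; it axiomatizes what is *true* of the machine's provability.

The gap G* \setminus G contains, for example, \neg \Box \bot - the machine's consistency: true of the machine, but not provable by it.)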




We attribute consciousness to other things when we perceive their behavior to be intelligent and goal-directed, because that's how we recognize it in people: "How many fingers do you see?" "What day is it?" "Do you know where you are?".

OK.

Bruno




Brent



http://iridia.ulb.ac.be/~marchal/



