On Thu, Nov 25, 2010 at 7:40 PM, Jason Resch <jasonre...@gmail.com> wrote:
> On Thu, Nov 25, 2010 at 3:38 PM, Rex Allen <rexallen31...@gmail.com> wrote:
>>
>> But I also deny that mechanism can account for consciousness (except
>> by fiat declaration that it does).
>>
>
> Rex,
> I am interested in your reasoning against mechanism. Assume there were a
> mechanical brain composed of mechanical neurons, that contained the same
> information as a human brain, and processed it in the same way.
I started out as a functionalist/computationalist/mechanist but abandoned it - mainly because I don't think that "representation" will do all that you're asking it to do.

For example, take mechanical or biological brains. It seems entirely reasonable to me that the contents of my conscious experience can be represented by quarks and electrons arranged in particular ways, and that by changing the structure of this arrangement over time in the right way one could also represent how the contents of my experience change over time. However, there is nothing in my conception of quarks or electrons (in particle or wave form), nor in my conception of arrangements and representation, that would lead me to predict beforehand that such arrangements would give rise to anything like experiences of pain or anger or what it's like to see red.

The same goes for more abstract substrates, like bits of information. What matters is not the bits, nor even the arrangements of bits per se, but rather what is represented by the bits. "Information" is just a catch-all term for "what is being represented".

But, as you say, the same information can be represented in *many* different ways, and by many different bit-patterns. And, of course, any set of bits can be interpreted as representing any information. You just need the right "one-time pad" to XOR with the bits, and voila! (I'll sketch this with a few lines of code below.) The magic is all in the interpretation. None of it is in the bits. And interpretation requires an interpreter.

So, given that the bits are merely representations, it seems silly to me to say that just because you have the bits, you *also* have the thing they represent. Just because you have the bits that represent my conscious experience doesn't mean that you have my conscious experience. Just because you manipulate the bits in such a way as to represent "me seeing a pink elephant" doesn't mean that you've actually caused me, or any version of me, to experience seeing a pink elephant. All you've really done is had the experience of tweaking some bits and then had the experience of thinking to yourself: "hee hee hee, I just caused Rex to see a pink elephant..."

Even if you have used some physical system (like a computer) that can be interpreted as executing an algorithm that manipulates bits that can be interpreted as representing me reacting to seeing a pink elephant ("Boy does he look surprised!"), this interpretation all happens within your conscious experience and has nothing to do with my conscious experience.

Thinking that the "bit representation" captures my conscious experience is like thinking that a photograph captures my soul. Though, obviously, this is as true of biological brains as of computers. But so be it. This is the line of thought that brought me to the idea that conscious experience is fundamental and uncaused.

> The behavior between these two brains is in all respects identical, since
> the mechanical neurons react identically to their biological counterparts.
> However for some unknown reason the computer has no inner life or conscious
> experience.

I agree that if you assume that representation "invokes" conscious experience, then the brain and the computer would both have to be equally conscious. But I don't make that assumption.

So the problem becomes that once you open the door to the "multiple realizability" of representations, we can never know anything about our substrate.
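(As an aside, here is a minimal sketch of the one-time-pad point above, in Python. The helper names xor_bytes and make_pad, and the use of os.urandom to stand in for "any bits at all", are just my own illustrative choices; the only substantive claim is the standard XOR identity: for any bit string and any desired message of the same length, pad = bits XOR message makes the bits "decode" to that message.)

import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def make_pad(bits: bytes, desired_message: bytes) -> bytes:
    """Build the pad under which `bits` decodes to `desired_message`.

    XOR is its own inverse, so: bits XOR pad == message  <=>  pad == bits XOR message.
    """
    assert len(bits) == len(desired_message)
    return xor_bytes(bits, desired_message)

if __name__ == "__main__":
    message = b"pink elephant"
    # Any bits at all, chosen with no reference to the message:
    bits = os.urandom(len(message))
    # The "interpretation" lives entirely in the pad, not in the bits:
    pad = make_pad(bits, message)
    print(xor_bytes(bits, pad))  # prints b'pink elephant'

Run it and the output is b'pink elephant' no matter which bits os.urandom happened to produce - which is the sense in which the magic is all in the interpretation and none of it is in the bits.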
You *think* that your brain is the cause of your conscious experience... but, as you say, a computer representation of you would think the same thing, and would be wrong. Given that there are an infinite number of ways that your information could be represented, how likely is it that your experience really is caused by a biological brain? Or even by a representation of a biological brain? Why not some alternate algorithm that results in the same *conscious* experiences, but with entirely different *unconscious* elements? How could you notice the difference?

> Information can take many physical forms.

Information requires interpretation. The magic isn't in the bits. The magic is in the interpreter.

Rex