On 26 January 2014 01:35, Craig Weinberg <whatsons...@gmail.com> wrote:

>> But that doesn't answer the question: do you think (or understand, or
>> whatever you think the appropriate term is) that the Chinese Room
>> COULD POSSIBLY be conscious or do you think that it COULD NOT POSSIBLY
>> be conscious?
>
>
> NO ROOM CAN BE CONSCIOUS. NO BODY CAN BE CONSCIOUS. NO FORM CAN BE
> CONSCIOUS.*
>
> *Except within the fictional narrative of a conscious experience. Puppets
> can seem conscious. Doors, door-knobs, and Chinese rooms can SEEM to be
> conscious.

Do you think Barack Obama is conscious? If you do, then in whatever sense
you understand that, can the Chinese Room also be conscious? Or do you
think that is impossible?

>> Or do you claim that the question is meaningless, a
>> category error (which ironically is a term beloved of positivists)? If
>> the latter, how is it that the question can be meaningfully asked
>> about humans but not the Chinese Room?
>
>
> Because humans are not human bodies. We don't have to doubt that humans are
> conscious, as to do so would be to admit that we humans are the ones
> choosing to do the doubting and therefore are a priori certainly conscious.
> Bodies do not deserve the benefit of the doubt, since they remain when we
> are personally unconscious or dead. That does not mean, however, that our
> body is not itself composed, at lower and lower levels, of microphenomenal
> experiences which only seem to us at the macro level to be forms and
> functions... they are forms and functions relative to our
> perceptual-relativistic distance from their level of description. Since
> there is no distance between our experience and ourselves, we experience
> ourselves in every way that it can be experienced without being outside of
> itself, and are therefore not limited to mathematical descriptions. The
> sole purpose of mathematical descriptions is to generalize measurements - to
> make phenomena distant and quantified.

Wouldn't the Chinese Room also say the same things, i.e. "We Chinese Rooms
don't have to doubt that we are conscious, as to do so would be to admit
that we are the ones choosing to do the doubting and therefore are a priori
certainly conscious."

>> > I like my examples better than the Chinese Room, because they are
>> > simpler:
>> >
>> > 1. I can type a password based on the keystrokes instead of the letters
>> > on
>> > the keys. This way no part of the "system" needs to know the letters,
>> > indeed, they could be removed altogether, thereby showing that data
>> > processing does not require all of the qualia that can be associated
>> > with
>> > it, and therefore it follows that data processing does not necessarily
>> > produce any or all qualia.
>> >
>> > 2. The functional aspects of playing cards are unrelated to the suits,
>> > their
>> > colors, the pictures of the royal cards, and the participation of the
>> > players. No digital simulation of playing card games requires any
>> > aesthetic
>> > qualities to simulate any card game.
>> >
>> > 3. The difference between a game like chess and a sport like basketball
>> > is
>> > that in chess, the game has only to do with the difficulty for the human
>> > intellect to compute all of the possibilities and prioritize them
>> > logically.
>> > Sports have strategy as well, but they differ fundamentally in that the
>> > real
>> > challenge of the game is the physical execution of the moves. A machine
>> > has
>> > no feeling so it can never participate meaningfully in a sport. It
>> > doesn't
>> > get tired or feel pain, it need not attempt to accomplish something that
>> > it
>> > cannot accomplish, etc. If chess were a sport, completing each move
>> > would be
>> > subject to the possibility of failure and surprise, and the end can
>> > never
>> > result in checkmate, since there is always the chance of weaker pieces
>> > getting lucky and overpowering the strong. There is no Cinderella Story
>> > in
>> > real chess, the winning strategy always wins because there can be no
>> > difference between theory and reality in an information-theoretic
>> > universe.
>>
>> How can you start a sentence "a machine has no feeling so..." and
>> purport to discuss the question of whether a machine can have feeling?
>>
>> > So no, I do not "believe" this, I understand it. I do not think that the
>> > Chinese Room is valid because wholes must be identical to their parts.
>> > The
>> > Chinese Room is valid because it can (if you let it) illustrate that the
>> > difference between understanding and processing is a difference in kind
>> > rather than a difference in degree. Technically, it is a difference in
>> > kind
>> > going one way (from the quantitative to the qualitative) and a
>> > difference in
>> > degree going the other way. You can reduce a sport to a game (as in
>> > computer
>> > basketball) but you can't turn a video game into a sport unless you
>> > bring in
>> > hardware that is physical/aesthetic rather than programmatic. Which
>> > leads me
>> > to:
>>
>> The Chinese Room argument is valid if it follows that if the parts of
>> the system have no understanding then the system can have no
>> understanding.
>
>
> You aren't listening to me - which may not be your fault. Your psychological
> specialization may not permit you to see any other possibility than the
> mereological argument that you keep turning to. Of course the whole can have
> properties that the parts do not have, that is not what I am denying at all.
> I am saying that there is no explanation of the Chinese Room which requires
> that it understands anything except one in which understanding itself is
> smuggled in from the real world and attached to it arbitrarily on blind
> faith.

Then you don't consider the Chinese Room argument valid. You agree with the
conclusion and premises but you don't agree that the conclusion follows
from the premises in the way Searle claims.
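For what it's worth, the "parts" side of the argument is easy to make concrete: computationally, Searle's rulebook is just a lookup from input symbols to output symbols. The minimal sketch below (with invented table entries, not a real conversational model) shows that no individual part of such a program denotes meaning:

```python
# A caricature of Searle's rulebook: pure symbol manipulation.
# The table entries are invented placeholders for illustration only.
RULEBOOK = {
    "你好吗?": "我很好,谢谢。",
    "你会说中文吗?": "会,当然。",
}

def room(symbols: str) -> str:
    """Emit whatever output symbols the rulebook dictates.

    Nothing here represents meaning: the function only matches the
    shape of the input string against stored shapes.
    """
    # Unrecognised input gets a stock reply ("please say that again").
    return RULEBOOK.get(symbols, "请再说一遍。")
```

Whether a system running a vastly larger table could nevertheless understand is exactly the point in dispute; the sketch only shows that none of its parts do.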

>> It is pointed out (correctly) by Searle that the person
>> in the room does not understand Chinese, from which he CONCLUDES that
>> the room does not understand Chinese,
>
>
> Rooms don't understand anything. Rooms are walls with a roof. Walls and
> roofs are planed matter. Matter is bonded molecules. Molecules are sensory
> experiences frozen in some externalized perceptual gap.

The claim is that the consciousness of the room stands in relation to the
physical room as the consciousness of a person stands in relation to the
physical person.

>> and uses this conclusion to
>> support the idea that the difference between understanding and
>> processing is a difference in kind, so no matter how clever the
>> computer or how convincing its behaviour it will never have
>> understanding.
>
>
> The conclusion is just the same if you use the room as a whole instead of
> the person. You could have the book be a simulation of John Wayne talking
> instead. No matter how great the collection of John Wayne quotes, and how
> great a job the book does at imitating what John Wayne would say, the
> room/computer/simulation cannot ever become John Wayne.

It could not become John Wayne physically, and it could not become John
Wayne mentally if the actual matter in John Wayne is required to reproduce
John Wayne's mind, but you have not proved that the latter is the case.

>> I don't think your example with the typing is as good as the Chinese
>> Room, because by changing the keys around a bit it would be obvious
>> that there is no real understanding, while with the Chinese Room would
>> be able to pass any test that a Chinese speaker could pass.
>
>
> Tests are irrelevant, since the pass/fail standard can only be subjective.
> There can never be a Turing test or a Voight-Kampff test which is objective,
> but there will always be tests which designers of AI can use to identify the
> signature of their design.

That's what Searle claims, which is why he makes the Room pass a Turing
test in Chinese and then purports to prove (invalidly, according to what
you've said) that despite passing the test it isn't conscious.

--
Stathis Papaioannou

