On 28 Jan 2014, at 13:36, Craig Weinberg wrote:



On Tuesday, January 28, 2014 5:23:02 AM UTC-5, Bruno Marchal wrote:

On 27 Jan 2014, at 22:22, Craig Weinberg wrote:



On Monday, January 27, 2014 5:57:55 AM UTC-5, Bruno Marchal wrote:

On 27 Jan 2014, at 06:07, Craig Weinberg wrote:



On Saturday, January 25, 2014 11:36:11 PM UTC-5, stathisp wrote:


On 26 January 2014 01:35, Craig Weinberg <whatsons...@gmail.com> wrote:

>> But that doesn't answer the question: do you think (or understand, or
>> whatever you think the appropriate term is) that the Chinese Room
>> COULD POSSIBLY be conscious or do you think that it COULD NOT POSSIBLY
>> be conscious?
>
>
> NO ROOM CAN BE CONSCIOUS. NO BODY CAN BE CONSCIOUS. NO FORM CAN BE
> CONSCIOUS.*
>
> *Except within the fictional narrative of a conscious experience. Puppets
> can seem conscious. Doors, door-knobs, and Chinese rooms can SEEM to be
> conscious.

Do you think Barack Obama is conscious? If you do, then in whatever sense you understand that, can the Chinese Room also be conscious? Or do you think that is impossible?

Yes, I think that Barack Obama is conscious, because he is different from a building or machine. Buildings and machines cannot be conscious, just as pictures of people drinking pictures of water do not experience relief from thirst.

To compare a brain with a machine can make sense.
To compare a brain with a picture cannot.

It depends on what the picture is doing. If you have a collection of detailed pictures of brains, and you organize them so that they are shown in different sequences according to some computation, isn't that a simulation of a brain?

It is not. It is a description of a computation, not a computation. The computation is in the logical relation, which includes the counterfactuals.
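
For concreteness, here is a minimal Python sketch (my own toy names, purely illustrative) of that distinction: the rule settles counterfactual inputs it never actually receives, while a recorded trace of one run does not.

    # A computation: a rule that also determines what WOULD happen
    # on inputs it never actually receives (the counterfactuals).
    def successor(n):
        return n + 1

    # A description of one run: a frozen sequence of states,
    # like a reel of pictures of the machine at work.
    trace = [(0, 1), (1, 2), (2, 3)]

    print(successor(7))  # 8 -- the rule settles the counterfactual case
    # The trace has no entry for 7; nothing in the list itself
    # determines what the answer "would have been".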

But the counterfactuals are theoretical rather than realistic.

I will no longer comment on any statements using words like "real", "realistic", "concrete", etc.



The computation is like an Escher drawing: it can do things that would be impossible for a real brain, and it cannot do or be real in the ways that a brain necessarily must. A picture is just the next step in abstraction toward the sub-theoretical, but it is actually one step more concrete in aesthetic realism. A real picture of a triangle is closer to consciousness than a computation for the Mandelbrot Set, which is only a theory until it is presented graphically to a visual participant.
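
To be concrete about what I mean by "a computation for the Mandelbrot Set", here is a minimal Python sketch (illustrative only): pure iterated arithmetic, with no picture anywhere in it until someone chooses to render the results.

    # Escape-time test: is the complex point c in the Mandelbrot set?
    # Nothing here is a picture; it is only iterated arithmetic.
    def in_mandelbrot(c, max_iter=100):
        z = 0
        for _ in range(max_iter):
            z = z * z + c
            if abs(z) > 2:
                return False   # escaped: c is outside the set
        return True            # did not escape: treat c as inside

    print(in_mandelbrot(complex(-1, 0)))  # True
    print(in_mandelbrot(complex(1, 1)))   # False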

Now, we do describe computations by some description, and so this confusion is frequent. But it is the same type of confusion as that between ciphers and numbers. Ciphers and sequences of ciphers are not numbers. It is the confusion between "345" and 345.
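
A trivial Python sketch (just to fix the distinction; any language would do): the string "345" is a sequence of ciphers, the integer 345 is the number they denote, and the two behave quite differently.

    print("345" == 345)        # False: a numeral is not a number
    print(int("345") == 345)   # True: interpreting the ciphers yields the number
    print("345" + "1")         # '3451' -- concatenation of ciphers
    print(345 + 1)             # 346   -- arithmetic on the number
    print(int("159", 16) == 345)  # True: a different numeral, the same number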

Both "345" and 345 are still pictures.

?

I wonder whether you really get the notion of number.


They can only be made meaningful when they are associated with a sensory experience in which some aesthetic content or expectation can be labelled with a string or value.

What can I say? That follows from your theory. But your theory does not even try to explain the sensory experience. You assume the very thing that is the difficulty, which I think computer science explains partially, and in a testable way. The existence of your theory is not by itself a refutation of a different theory.







In either case, consciousness makes no more sense as part of a brain or a machine than a picture.

Right. We agree on that. But a brain can locally manifest a person.

I don't think it can.

So suppose someone loses his body in some accident, but the rescuers save the brain and succeed in connecting it to an artificial heart, and eventually to an artificial body (but still with his natural brain). The guy behaves normally. He keeps his job. But you tell me that he has become a zombie?



A tip cannot locally manifest an iceberg. A cookie cutter cannot manifest a cookie.

A picture cannot. You can't implement it in a computer, in the sense of implementing a program, which then can manifest a person.

Right, because nothing can manifest a person except the complete history of experiences of Homo sapiens.

You confirm that you are lowering the level, and in fact pushing it to infinity.

It looks like saying "I am an infinite being".
I have no local Gödel number that you can put on some hard disk.
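
(To be clear, a "Gödel number you can put on a hard disk" is nothing exotic. Here is a toy Python encoding, one of many possible and not Gödel's own, which turns any finite text, hence any finite program, into a single natural number and back.)

    # Toy Goedel numbering: any finite string <-> a single natural number.
    def godel_number(text):
        return int.from_bytes(text.encode("utf-8"), "big")

    def from_godel_number(n):
        return n.to_bytes((n.bit_length() + 7) // 8, "big").decode("utf-8")

    program = "def successor(n): return n + 1"
    g = godel_number(program)
    print(g)                                # one (large) natural number
    print(from_godel_number(g) == program)  # True: the text is recovered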

It is your right, but, well, I am not interested in that type of theory.

It excludes too many possibilities, and is based on some illusion of superiority. The 1p of the machine also believes, even knows, its relation with *infinity*, but the correct machine does not brag about this, and still less derives any feeling of superiority from it.

(Especially as comp explains in which sense the machine is right when saying that about herself (her 1-self).)





Machines are like 4D pictures. One picture or form leads to another and another, and if there were some interpreter, it could infer a logic to those transitions, but there is nothing in the machine which would itself lead from unconsciousness to awareness.

No, but the machine can still enact it.

What the machine enacts is an impersonal performance of personhood, not a person.


Well, at least that's a nice way to define what a zombie does: an impersonal performance of personhood.

A (3-1) person without a (1-1) person.

Hmm...

Bruno




Craig


Bruno

http://iridia.ulb.ac.be/~marchal/






