On 04 Sep 2015, at 20:26, meekerdb wrote:

On 9/4/2015 7:35 AM, Bruno Marchal wrote:

On 03 Sep 2015, at 20:26, meekerdb wrote:

On 9/3/2015 8:35 AM, Bruno Marchal wrote:

On 02 Sep 2015, at 22:48, meekerdb wrote:

On 9/2/2015 8:25 AM, Bruno Marchal wrote:
So now you agree with me that there are different kinds and degrees of consciousness; that it is not just a binary attribute of an axiom + inference system.

?

Either you are conscious, or you are not.

But is a roundworm either conscious or not? An amoeba?

I don't know, but I think they are. Even bacteria, and perhaps even some viruses, but on a different time scale than ours.



If they can be conscious but not self-conscious, then there are two kinds of "being conscious".

Yes, at least two kinds, but each arithmetical hypostasis having either "<>t" or "& p" describes a type of consciousness, I would say. And they all differentiate on the infinitely many versions of "[]A", be it the "[]" predicate of PA, of ZF, of an amoeba, or of you and me ...
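
(To fix the notation, the variants I have in mind are the usual ones:

   p
   []p
   []p & p
   []p & <>t
   []p & <>t & p

with "[]" the provability predicate of the machine considered, "<>" its dual ~[]~, and "t" the constant true. The second, fourth and fifth variants split along the G/G* distinction between what the machine can justify and what is true about the machine, giving eight hypostases in all.)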

So if there are different kinds of consciousness, then a being with more kinds is more conscious. It seems that your dictum, "You're either conscious or not," is being diluted away to a mere slogan.


There are basically two levels, without any criterion of decidability, but with simple operational definitions:

1) something is conscious if it is torturable, and it is arguably ethically wrong to do so.

So when Capt. Segura tells Wormold that he's "not of the torturable class" he means he's not conscious. :-)

You might need to give some references here, I'm afraid.


It's from "Our Man In Havana" by Grahame Green. Only poor Cubans are in the torturable class, not Englishmen.

So the Englishmen are not conscious, which might explain some things...

Well, of course by "torturable" I meant "judged as being able to feel pain", not "judged as not being able to feel pain", as some people believed to be the case for animals.

How is this an operational definition? What is the operation to determine whether a being is torturable?

You perform the torture publicly, and if you are sent to jail, the entity is conscious, at least in the 3-1 view of the people you are living with.

You mean the people who sent me to jail are conscious, i.e. they have empathy, which implies they are conscious. But that doesn't really solve the problem. They might just be pretending empathy.

If they pretend empathy, they are self-conscious, even if you are not. Pretending and lying require self-consciousness.

And it doesn't help with my design of a Mars Rover. Will it be conscious only if I program it to show empathy when another Mars Rover is tortured? Does a jumping spider show empathy when a fly is tortured, or only when another jumping spider is tortured?

I don't know. I'm not sure spiders (even jumping spiders) have a lot of empathy.

In matters of consciousness there are no definite criteria, and the operational or quasi-operational criterion I am suggesting gives an idea of a sufficient condition for attributing consciousness, certainly not for NOT attributing consciousness. Ethically, it is better to attribute consciousness wrongly than to attribute absence of consciousness wrongly.

I think all invertebrates are already at that level, and in arithmetic that might correspond to sigma_1-completeness (Turing universality). Robinson Arithmetic and the universal dovetailer are at that level.
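
(To recall the formulation: the sigma_1 sentences are the machine-verifiable ones, of the shape ExP(x) with P decidable, and sigma_1-completeness means that the machine proves every true one. Formalized by the machine itself, this gives the schema

   p -> []p   (p sigma_1)

For a sound recursively enumerable theory, the halting computations of any machine are then mirrored by the provability of the corresponding sigma_1 sentences, which is why I put Turing universality at that level.)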

2) something is self-conscious if it is Löbian; basically, it is aware of its unnameable name. PA and ZF are "at that level", like all their sound recursively enumerable extensions. At that level, the entity is able to ascribe consciousness to another, and can get the moral understanding of good and evil (with or without a forbidden fruit).
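
(Löbian, to recall, means that the machine proves its own sigma_1-completeness, and with it the Löb formula

   []([]p -> p) -> []p

Taking p = f, the false, gives <>t -> ~[]<>t: if I am consistent, then I cannot prove my consistency. It is that kind of true-but-unjustifiable self-reference that I sum up in the phrase "aware of its unnameable name".)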

What's the operation to determine that it is aware of its unnameable name?

OK: you torture a fellow, and all the people complaining about this can be said to have the ability to ascribe consciousness to others.

In principle you have to repeat this often to avoid the partial-zombie case. The criteria are operational only in the weak sense of making the statement plausible, as we know already that there is no definite criterion for consciousness. We might not be able to convince an alien about this.

Essentially you are saying to just rely on your intuition about what's conscious and what's not. But as Scott Aaronson points out, we seek a theory of consciousness that we can apply to machines and aliens, where our intuition doesn't work.

But this was already given. My current theoretical attribution is simple: Turing universality is enough for consciousness, and Löbianity (awareness of one's own Turing universality) is enough for self-consciousness. But there is no mechanical criterion to recognize Turing universality, nor indeed any non-trivial property of programs (cf. Rice's theorem). There are no practical criteria. In practice, only living with some being and using our intuition can serve to "recognize" oneself in another (which is the same as attributing consciousness to another, somehow). Maybe you are asking for something impossible ...
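
(Rice's theorem, for the record: no non-trivial property of the function computed by a program can be decided from the program's code. Turing universality is such a property, so there is no total program T with T(e) = 1 exactly when the machine of index e is universal; such a T would let us solve the halting problem.)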

Bruno

Brent


http://iridia.ulb.ac.be/~marchal/



