On 04 Feb 2015, at 21:08, meekerdb wrote:

On 2/4/2015 11:40 AM, Bruno Marchal wrote:

On 03 Feb 2015, at 20:40, meekerdb wrote:

On 2/3/2015 11:13 AM, Jason Resch wrote:
I agree with John. If consciousness had no third-person observable effects, it would be an epiphenomenon. And then there is no way to explain why we're even having this discussion about consciousness.

I'm not arguing that it has no observable effects. JKC says it's necessary for intelligence. I'm arguing that it might have been necessary for the evolution of intelligence, starting from, say, fish. But that doesn't entail that it is necessary for any intelligent system.

It is not necessary for any competent system, but intelligence is not competence, it is more like an understanding of our own incompetence, an ability to learn, notably through errors and "dreams".

Why isn't learning just a matter of increasing competence based on experience? I don't see that learning is any different from other competences.

To simplify, you can consider a competence as an ability to follow a program P_i, or to compute the corresponding partial or total function phi_i. Learning can then be described as the inverse problem: finding i (or j, ...) when you are presented with a sample (perhaps infinitely growing) of inputs and outputs of phi_i (= phi_j, ...).

There is a general theory of such learning (for both machines and non-machines). There is no universal learner (Putnam), unless you weaken the criteria for identifying phi_i very strongly. If you allow an unbounded but finite number of errors, and infinitely many changes of mind, including changes of mind after the learning machine has already found a correct program (modulo the finite number of errors), then you have a (very) theoretical notion of universal learner (Case & Smith), but it can be shown to be necessarily impractical (Harrington). So learning is usually considered a different kind of competence, even if it is itself related to the code of an inductive inference machine, and so is a form of competence too. But intelligence is not a competence per se: it is the natural state of all universal machines, before someone installs Windows or Christian Dogma, or ...
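To make the inverse problem concrete, here is a toy Python sketch of "identification by enumeration", the simplest learner in this tradition: conjecture the least index consistent with all data seen so far, change your mind finitely often, then converge. The function class PHI (phi_i(x) = x mod (i+1)) is my own hypothetical illustration, not the general recursion-theoretic setting.

```python
# Toy "identification in the limit": the learner must find an index i
# for an unknown function phi_i, given a growing sample of (input, output)
# pairs. Hypothetical finite class: phi_i(x) = x % (i + 1), for i in 0..6.
PHI = [lambda x, i=i: x % (i + 1) for i in range(7)]

def guess(sample):
    """Identification by enumeration: return the least index whose
    function is consistent with every (input, output) pair seen so far."""
    for i, phi in enumerate(PHI):
        if all(phi(x) == y for x, y in sample):
            return i
    return None  # no hypothesis fits (cannot happen if the target is in PHI)

# The environment presents phi_3 (i.e. x % 4) one data point at a time.
target = 3
sample = []
guesses = []
for x in range(10):
    sample.append((x, PHI[target](x)))
    guesses.append(guess(sample))

print(guesses)  # [0, 1, 2, 3, 3, 3, 3, 3, 3, 3]
```

The learner changes its mind three times (0, 1, 2) before locking onto the correct index 3 and never abandoning it: convergence in the limit, with finitely many mind-changes, exactly the criterion being weakened in the Case & Smith results above.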

If we build computers that discuss and question their own consciousness and qualia I'd consider that proof enough that they are.

But is that the standard of intelligence? JKC argues intelligence=>consciousness. What if they discuss and question their own consciousness, but say stupid things about it?

That's what intelligent systems do: they say stupid things.
Intelligence just adds the question mark '?' behind them. It is harm reduction for everybody. It helps with the next change of mind.



The bigger question is: what machines might be conscious yet unable to talk about, reflect upon, or signal to us that they are in fact conscious? This requires a theory of consciousness.

Exactly. That is my concern. Suppose we build an autonomous Mars Rover to do research. We give it learning ability, so it must reflect on its experience and act intelligently. Have we made a conscious being? Contrary to Bruno, I think there are kinds and degrees of consciousness - just as there are kinds and degrees of intelligence.

It will be conscious at the place where it confuses itself with the (relatively real) environment. OK. It depends also on its abilities, and you can make it self-conscious by adding enough induction axioms. Don't put in too many induction axioms, though, or the Mars Rover will get stuck in self-dialog about its consciousness and how to convince those self-styled [censored] humans!

So without the too-many induction axioms it will be conscious, but not self-conscious.

Yes, but I still have difficulties with this (not to mention my dialog with salvia, which makes things weirder and more counter-intuitive). I feel I still miss something. I might be unconsciously restrained by some prejudices, and probably don't yet push the logic of comp far enough.



Thus you agree that consciousness is not all-or-nothing.

"Being conscious" is all-or-nothing, but the content, intensity and atmosphere can vary greatly.

I identify "being conscious" with "there is consciousness", and this is binary, as there is no degree of unconsciousness (Quentin). But some states of consciousness (like in sleep), remembered in other states of consciousness (like after awakening), can seem (and perhaps be) less intense, blurred, fuzzy, etc.

It is generally very difficult to compare a first-person experience with another one, including our own. The intimacy of that state precludes any direct comparison. Memories are eliminated, colors can be added, ... We live under a constant tyranny of the here-and-now present state, which can embellish or impoverish the re-enacting of a past experience.

Bruno


Brent

--
You received this message because you are subscribed to the Google Groups "Everything List" group. To unsubscribe from this group and stop receiving emails from it, send an email to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.

http://iridia.ulb.ac.be/~marchal/