Re: Apperception or self-awareness

2012-08-15 Thread Bruno Marchal


On 14 Aug 2012, at 18:11, Roger wrote:


Hi Bruno Marchal

For what it's worth, Leibniz differentiated between ordinary perception
(which would include sentience or awareness) and self-awareness, which
he called apperception.


That difference is well approximated, or quasi-explained, by the
difference between Universality and what I call, to be short,
Löbianity. A universal machine might be conscious; a Löbian machine is
self-conscious. Löbian machines have just one more reflexive loop, and
it can be shown that you cannot add a new reflexive loop to make them
different. It is basically the difference between a simple first-order
specification of a universal machine and the same specification plus
some induction axioms. It is the difference between Robinson Arithmetic
(the successor, addition and multiplication axioms and rules) and Peano
Arithmetic (the same as Robinson plus the schema of induction axioms).
The induction axioms make it possible for the self to prove its own
Löbianity, and give the machine a sort of maximal self-referential
ability (well studied in mathematical logic, though not widely known
outside of logicians).
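For readers who want the details, here is a sketch of the two systems
(standard textbook formulations; the exact axiom lists vary slightly
between presentations), together with the Löb formula behind the word
"Löbianity":

  % Robinson Arithmetic Q: successor, addition and multiplication only.
  \begin{align*}
    & S(x) \neq 0 \\
    & S(x) = S(y) \rightarrow x = y \\
    & x \neq 0 \rightarrow \exists y\,(x = S(y)) \\
    & x + 0 = x, \qquad x + S(y) = S(x + y) \\
    & x \cdot 0 = 0, \qquad x \cdot S(y) = (x \cdot y) + x
  \end{align*}

  % Peano Arithmetic PA: Q plus one induction instance for every formula \varphi.
  \[
    \bigl(\varphi(0) \land \forall x\,(\varphi(x) \rightarrow \varphi(S(x)))\bigr)
    \rightarrow \forall x\,\varphi(x)
  \]

  % Roughly, a Loebian machine is one that proves Loeb's formula for its
  % own provability predicate B ("Bp" = "the machine proves p"):
  \[
    B(Bp \rightarrow p) \rightarrow Bp
  \]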


Bruno







Roger , rclo...@verizon.net
8/14/2012
- Receiving the following content -
From: Bruno Marchal
Receiver: everything-list
Time: 2012-08-12, 04:15:11
Subject: Re: Libet's experimental result re-evaluated!

On 11 Aug 2012, at 01:57, Russell Standish wrote:

 On Fri, Aug 10, 2012 at 09:36:22AM -0700, meekerdb wrote:
 But a course of action could be 'selected', i.e. acted upon, without
 consciousness (in fact I often do so). I think what constitutes
 consciousness is making up a narrative about what is 'selected'.

 Absolutely!

 The evolutionary reason for making up this narrative is to enter it
 into memory so it can be explained to others and to yourself when
 you face a similar choice in the future.

 Maybe - I don't remember Dennett ever making that point. More
 importantly, it's hard to see what the necessity of the narrative is
 for forming memories. Quite primitive organisms form memories, yet I'm
 sceptical they have any form of internal narrative.

 That the memory of these
 past decisions took the form of a narrative derives from the fact
 that we are a social species, as explained by Julian Jaynes. This
 explains why the narrative is sometimes false, and when the part of
 the brain creating the narrative doesn't have access to the part
 deciding, as in some split brain experiments, the narrative is just
 confabulated. I find Dennett's modular brain idea very plausible
 and it's consistent with the idea that consciousness is the function
 of a module that produces a narrative for memory. If I were designing
 a robot which I intended to be conscious, that's how I would design
 it: with a module whose function was to produce a narrative of
 choices and their supporting reasons for a memory that would be
 accessed in support of future decisions. This then requires a
 certain coherence and consistency in the robot's decisions - what we
 call 'character' in a person. I don't think that would make the robot
 necessarily conscious according to Bruno's criterion. But if it had
 to function as a social being, it would need a concept of 'self' and
 the ability for self-reflective reasoning. Then it would be
 conscious according to Bruno.

 Brent
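
As an aside, a minimal sketch of the kind of narrative module Brent
describes might look like this (the names and the toy similarity test
are made up purely for illustration, not taken from Dennett or from any
actual robot design):

from dataclasses import dataclass, field

@dataclass
class Episode:
    """One remembered decision: the situation, the choice made, and the reasons."""
    situation: str
    choice: str
    reasons: list[str]

@dataclass
class NarrativeModule:
    """Records choices with their supporting reasons, and replays
    relevant past episodes when a similar situation comes up again."""
    memory: list[Episode] = field(default_factory=list)

    def record(self, situation: str, choice: str, reasons: list[str]) -> None:
        # Enter the narrated decision into memory so it can later be
        # explained to others, or to the robot itself.
        self.memory.append(Episode(situation, choice, reasons))

    def recall(self, situation: str) -> list[Episode]:
        # Toy notion of "a similar choice in the future": episodes whose
        # situation shares at least one word with the current one.
        words = set(situation.lower().split())
        return [e for e in self.memory
                if words & set(e.situation.lower().split())]

Whether such a record-and-recall loop, plus the modelling of other
agents, would amount to consciousness is of course exactly the question
being debated here.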

 IIRC, Dennett talks about feedback connecting isolated modules (as in
 talking to oneself) as being the progenitor of self-awareness (and
 perhaps even consciousness itself). Since this requires language, it
 would imply an evolutionarily late consciousness.

 I do think that self-awareness is a trick that enables efficient
 modelling of other members of the same species. It's the ability to put
 yourself in the other's shoes, and predict what they're about to do.

 I'm in two minds about whether one can be conscious without also being
 self-aware.

I tend to think that consciousness is far more primitive than
self-consciousness. I find it plausible that a worm can experience
pain, but it might not be self-aware or self-conscious.

Bruno


http://iridia.ulb.ac.be/~marchal/



--
You received this message because you are subscribed to the Google Groups Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/everything-list?hl=en.


