On Fri, May 29, 2015 at 10:34 AM, Bruno Marchal <marc...@ulb.ac.be> wrote:

>
> On 28 May 2015, at 20:12, Terren Suydam wrote:
>
> On Thu, May 28, 2015 at 4:20 AM, Bruno Marchal <marc...@ulb.ac.be> wrote:
>
>>
>> On 28 May 2015, at 05:16, Terren Suydam wrote:
>>
>> Language starts to get in the way here, but what you're suggesting is
>> akin to someone who is blind-drunk - they will have no memory of their
>> experience, but I think most would say a blind-drunk is conscious.
>>
>> But I think the driving scenario is different in that my conscious
>> attention is elsewhere... there's competition for the resource of
>> attention. I don't really think I'm conscious of the feeling of the floor
>> pressing my feet until I pay attention to it.
>>
>> My thinking on this is that human consciousness involves a unified/global
>> dynamic, and the unifying thread is the self-model or ego. This allows for
>> top-down control of attention. When parts of the sensorium (and other
>> aspects of the mind) are not involved or included in this global dynamic,
>> there is a significant sense in which they do not participate in that human
>> consciousness. This is not to say that there is no other consciousness -
>> just that it is perhaps of a lower form in a hierarchy of consciousness.
>>
>> I would highlight that human consciousness is somewhat unique in that the
>> ego - a cultural innovation dependent on the development of language - is
>> not present in animals. Without that unifying thread of ego, I suggest that
>> animal consciousness is not unlike our dream consciousness, which is an
>> arena of awareness when the thread of our ego dissolves. A visual I have is
>> that in the waking state, the ego is a bag that encapsulates all the parts
>> that make up our psyche. In dreamtime, the drawstring on the bag loosens
>> and the parts float out and get activated according to whatever seemingly
>> random processes constitute dreams.
>>
>> In lucid dreams, the ego is restored (i.e. we say to ourselves, "*I* *am*
>> dreaming") - and we "regain" consciousness.
>>
>>
>> We regain the ego (perhaps the ego illusion), but as you say yourself
>> above, we are conscious in the non-lucid dream too. Lucidity might be a
>> relative notion, as we can never be sure that we are awake. The false
>> awakening, very frequent in people trained in lucid dreaming, illustrates
>> this phenomenon.
>>
>
> Right. My point is not that we aren't conscious in non-lucid dream states,
> but that there is a qualitative difference in consciousness between those
> two states, and that lucid-dream consciousness is much closer to waking
> consciousness than to non-lucid dream consciousness, almost by definition. It's this
> fact I'm trying to explain by proposing the role of the ego in human
> consciousness.
>
>
> OK. Usually I make that distinction between simple universality
> (conscious, but not necessarily self-conscious) and Löbianity
> (self-conscious). It is the difference between Robinson Arithmetic and
> Peano Arithmetic (= RA + the induction axioms).
>
> It is an open problem for me whether RA is more or less conscious than PA.
> PA has much stronger cognitive abilities, but this can filter more
> consciousness and lead to more delusion, notably that "ego".
>
> I don't insist too much on this, as I am not yet quite sure. It leads to
> the idea that brains filter consciousness, by hallucinating the person.
>
>
I'm not so sure that "filtering" is the best analogy, by itself anyway. No
doubt there is filtering going on, but I think the forms constructed by the
brain may also have a *transforming* or *focusing* effect. It may not be
the case, in other words, that consciousness is merely, destructively,
filtered by our egos; there is a sense too in which the consciousness we
experience is made "sharper" by virtue of being shaped or transformed,
particularly by this adaptation of reifying the self-model.

>> I make this remark because most of the time I use "consciousness" in its
>> rough general sense, in which animals, dreamers, ... are conscious.
>>
>
> Of course... my points are about what aspects of being human
> might privilege our consciousness, in an attempt to understand
> consciousness better.
>
>
> OK. I understand.
>
>
>
>> Then, I am not sure that higher mammals do not already have some ego and
>> self-consciousness, well before language. Language just puts the ego in
>> evidence, and that allows further reflexive loops, which can lead to
>> further illusions and "soul falling" situations.
>>
>
> Right, one could argue that even insects have some kind of self-model.
> There is no doubt a spectrum of sophistication of self-models, but I would
> distinguish all of them from the human ego. I guess I was too quick before
> when I equated the two. The key distinction between a self-model and an ego
> is the ability to refer to oneself as an object - this, and the ability to
> *identify* with that object, reifies the self-model in a way that appears
> to me to be crucial to human consciousness. I don't think this is really
> possible without language.
>
>
> Probably. But that identification is already a sort of "illusion". It is
> very useful in practice, for surviving while alive. But the truth,
> including possible afterlives, is more complex.
>
>
I think the word illusion in this context adds more confusion than clarity.
Yes, there is definitely a sense in which the self is an illusion, in that
it is constructed. In this sense though, everything we experience is an
illusion. While that may be true, casting the ego as an illusion misses the
important idea that the self produces a form of self-reflective
consciousness that is not available otherwise.

It would be like saying that bats' echolocation is an illusion. Not a
perfect analogy, because a bat's facility for echolocation is rooted in its
physiology, not constructed, but the point is that with both the ego and
echolocation, the experiencer's consciousness is provided with a particular
character it would not otherwise have been able to experience.

Terren

>
>
>> Nor am I sure that our ego dissolves in non-lucid dreams, although it
>> seems to disappear in non-REM dreams and other sleep states.
>>
>
> For me, the key insight I had in trying to describe the difference between
> lucid and non-lucid dreams is the ability to say "I am dreaming", which is
> an ego statement. What other explanations could account for the difference
> between lucid and non-lucid dreams?
>
>
> No problem with this. I made this remark only because I know people who
> confuse wakefulness and consciousness. Some, like Malcolm, deny any
> consciousness in the sleeping state (or in any machine).
>
> Note that we can know that we are dreaming, but we can never know "for
> sure" that we are awake, and indeed with comp, the QM weirdness can be
> interpreted as a symptom that we belong to a collective, first-person
> sharable, sort of dream.
>

> Bruno
>
>
>
>
>
>
> Terren
>
>
>>
>> Bruno
>>
>>
>>
>> Terren
>>
>>
>>
>>
>>
>>
>>
>> On Wed, May 27, 2015 at 10:10 PM, Jason Resch <jasonre...@gmail.com>
>> wrote:
>>
>>> Are we any less conscious of it as it happens, or are our brains perhaps
>>> simply not forming as many memories of usual/uneventful tasks?
>>>
>>> Jason
>>>
>>>
>>> On Wed, May 27, 2015 at 9:06 PM, Terren Suydam <terren.suy...@gmail.com>
>>> wrote:
>>>
>>>> In the driving scenario it is clear that computation is involved,
>>>> because all sorts of contingent things can be going on (e.g. dynamics of
>>>> driving among other cars), yet this occurs without crossing the threshold
>>>> of consciousness. Relying on some kind of caching mechanism under such
>>>> circumstances would quickly fail one way or another.
>>>>
>>>> Terren
>>>> On May 27, 2015 7:38 PM, "Pierz" <pier...@gmail.com> wrote:
>>>>
>>>>>
>>>>>
>>>>> On Thursday, May 28, 2015 at 6:06:22 AM UTC+10, Brent wrote:
>>>>>>
>>>>>>  On 5/26/2015 10:31 PM, Pierz wrote:
>>>>>>
>>>>>>> Where I see lookup tables fail is that they seem to operate above
>>>>>>> the probable necessary substitution level (despite having the same
>>>>>>> inputs/outputs at the higher levels).
>>>>>>>
>>>>>> But your memoization example still makes a good point - namely
>>>>>> that some computations can be bypassed in favour of recordings, yet
>>>>>> presumably this doesn't lead to fading qualia. We don't need anything
>>>>>> as silly as a gigantic lookup table of all possible responses. We only
>>>>>> need to acknowledge that we can store the results or recordings of
>>>>>> computations we've already completed, and that this should not result
>>>>>> in any strange degradation of consciousness.
>>>>>>
>>>>>>
>>>>>> Isn't that what allows me to drive home from work without being
>>>>>> conscious of it?
>>>>>>
>>>>>
>>>>> People keep making this point, which is one that I myself made in the
>>>>> past - and I believe you argued with me at the time, saying that it's
>>>>> not clear that the mechanism for automating brain functions is anything
>>>>> like the same as caching the results of a computation. I think that
>>>>> objection is actually fair enough. With automated actions it's not
>>>>> clear that the computations aren't being carried out any more, just
>>>>> that they no longer require conscious attention, because the neuronal
>>>>> pathways for those computations have become sufficiently reinforced
>>>>> that they no longer require concentration. I think this model
>>>>> (automated computation rather than cached computation) fits our
>>>>> experience of this phenomenon. Sometimes I suspect we're really talking
>>>>> out of our proverbial arses with these speculations, as we still have
>>>>> so little idea about how the brain works. It may be a computer in the
>>>>> sense that it is Turing emulable, but then we talk as if it were a
>>>>> squishy laptop or something, and that analogy can be misleading in many
>>>>> ways. For example, our memories are nothing like RAM. They are
>>>>> distributed like a hologram, constructive and fuzzy, whereas computer
>>>>> memory is localised, passive and accurate to the bit. I'm probably
>>>>> guilty of the same over-zealous computationalism with my lookup table
>>>>> analogy above, but I was thinking more of an AI and the in-principle
>>>>> point that cached computation results may be employed at a fine-grained
>>>>> level. I would continue to insist that it is meaningless to say that a
>>>>> "brain" that employs cached results of computations is a zombie to the
>>>>> extent that it does so, because it is meaningless to speak of the
>>>>> "when" of qualia. (You never replied to my argument about poking a
>>>>> recorded Einstein with a stick, which I think makes a compelling case
>>>>> for this.) We have to rigorously divide the subjective and the
>>>>> objective.
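
As an aside, since "caching the results of a computation" keeps coming up,
here is a minimal Python sketch of memoization in the plain software sense
I have in mind. The names and numbers are purely illustrative, and nothing
is being claimed about how a brain would implement any of this:

    from functools import lru_cache

    def expensive_response(stimulus: int) -> int:
        # stand-in for some costly, contingent computation
        return sum(i * i for i in range(stimulus))

    @lru_cache(maxsize=None)
    def cached_response(stimulus: int) -> int:
        # same input/output mapping, but repeated identical inputs
        # return the stored result instead of re-running the computation
        return expensive_response(stimulus)

    cached_response(10_000)   # first call: actually computed
    cached_response(10_000)   # second, identical call: pure lookup

The contrast with a gigantic lookup table of all possible responses is that
only inputs actually encountered ever get stored.
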
>>>>>
>>>>>>
>>>>>> Brent
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>>
>>
>>
>>
>>
>> http://iridia.ulb.ac.be/~marchal/
>>
>>
>>
>>
>>
>
>
>
>
> http://iridia.ulb.ac.be/~marchal/
>
>
>
>
