On Tue, Sep 18, 2012 at 7:31 AM, Craig Weinberg <whatsons...@gmail.com>wrote:

>
>
> On Tuesday, September 18, 2012 1:50:47 AM UTC-4, Jason wrote:
>
>>
>>
>> On Mon, Sep 17, 2012 at 6:10 PM, Craig Weinberg <whats...@gmail.com>wrote:
>>
>>> I think that comp is almost true, except for when applied to
>>> consciousness itself, in which case it is exactly false. I wasn't asserting
>>> it so much as I was illustrating exactly why that is the case. Does anyone
>>> have any common sense analogy or story which makes sense of comp as a
>>> generator of consciousness?
>>>
>>
>> Craig,
>>
>> I'll give this a shot.
>>
>> Imagine there is a life form with only the most simple form of qualia.
>>  It can only experience two states of being: pain and the absence of pain.
>>
>> Further, let's say this creature has 10 semi-independent regions in
>> its brain, each responsible for different functions but also each is
>> connected to every other, to varying degrees.  Each can affect any other
>> region in various ways.
>>
>> When the creature is in a state of pain, each of the 10 regions of the
>> brain is notified of this state.  (This is communicated from the
>> creature's pain receptors to all other parts of its brain).
>>
>> The awareness of this state has different effects on each region, and the
>> regions in turn affect the creature's thoughts and behaviors.  For example,
>> one region begins telling the other regions of the brain to do whatever
>> they can to make it stop.  Another region expresses the associated
>> behaviors and thoughts that pertain to stress and anxiety.  A third region
>> of the brain might increase the readiness or propensity to flee, hide, cry
>> for help, or scream.  The states of the various regions have cascading and
>> circular effects on other regions, and the entire focus of the brain may
>> quickly shift (from what it was thinking before) to the single subject and
>> pursuit of ending the pain.  Taken to the extreme, this effect might become
>> all-encompassing, or even debilitating.
>>
>> In the above example, the perception of pain is described in terms of
>> information and the effect that information has on the internal states of
>> processes in the brain. The presence of the information indicating pain
>> is, through a very complex process, interpreted in numerous ways by
>> different sub-agents in the brain to yield all the effects normally
>> associated with the experience.
>>
>> Jason
>>
>> P.S.
>>
>> Try this little experiment from your own home: close your eyes and slowly
>> begin to pinch the skin on the back of your hand.  Pay particular attention
>> to the feeling as it crosses the threshold from mere feeling into pain.
>>  Concentrate on what it is that is different between that perception (of
>> the light pinch) and the pain (of the strong pinch).  You may find that it
>> is just information, along with an increasing anxiety and desire to make it
>> stop.  Experiments have found that certain people with brain damage or on
>> certain drugs can experience the pain without the discomfort.  There is a
>> separate part of the brain responsible for making pain uncomfortable!
>>
>
> What you have then is 10 regions of the brain (are they self categorized?
> formally partitioned? who knows there is such a thing as brain regions
> besides us?)
>


Here is an example:


Functional MRI scans have indicated that an area of the brain, called
the *anterior
cingulate cortex*, processes pain information to determine how a person is
affected.  Severing the link to this part of the brain has a curious effect
on one's reaction to pain.  A condition known as *pain dissociation* is the
result.  Along with brain surgery such as lobotomy or cingulotomy, the
condition may also occur through the administration of certain drugs such
as morphine.  Those with pain dissociation still perceive pain; they are
aware of its location and intensity but pain is no longer unpleasant or
distressing.  Paul Brand, a surgeon and author on the subject of pain,
recounted the case of a woman who had suffered from severe and chronic
pain for more than a decade: She agreed to a surgery that would separate
the neural pathways between her frontal lobes and the rest of her
brain.  The surgery was a success.  Brand visited the woman a year later,
and inquired about her pain.  She said, “Oh, yes, it's still there.  I just
don't worry about it anymore.”  With a smile she continued, “In fact, it's
still agonizing.  But I don't mind.”


The conclusion: even seemingly simple qualia, like pain, are far from simple.

I think Marvin Minsky understands this well, and provides a good
explanation:

Marvin Minsky considers it to be “a huge mistake -- that attempt to reify
'feeling' as an independent entity, with an essence that's indescribable.
As I see it, feelings are not strange alien things.  It is precisely those
cognitive changes themselves that constitute what 'hurting' is -- and this
also includes all those clumsy attempts to represent and summarize those
changes.  The big mistake comes from looking for some single, simple,
'essence' of hurting, rather than recognizing that this is the word we use
for complex rearrangement of our disposition of resources.”


According to Minsky, human consciousness involves the interplay between as
many as 400 separate sub-organs of the brain.  One can imagine a symphony
of activity resulting from these individual regions, each acting on each
other's signals and in turn reacting to how those other regions are then
affected, in a kind of perpetual and intertwined feedback loop of enormous
complexity.
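If it helps to make that feedback loop concrete, here is a toy sketch in
Python.  To be clear, the region names and coupling weights below are
invented purely for illustration; nothing here is taken from actual
neuroscience.  The point is only to show how a single pain signal, fed to a
few mutually connected regions, cascades until the whole system's activity
is dominated by it:

```python
# Toy model: a few brain "regions" that react to a pain signal and to
# each other's activation levels, iterated as a feedback loop.
# Region names and coupling weights are hypothetical, for illustration only.

regions = {"alarm": 0.0, "anxiety": 0.0, "flee": 0.0, "attention": 0.0}

# coupling[a][b]: how strongly region a's activation drives region b
coupling = {
    "alarm":     {"anxiety": 0.6, "flee": 0.5, "attention": 0.7},
    "anxiety":   {"attention": 0.4, "flee": 0.3},
    "flee":      {},
    "attention": {"alarm": 0.2},
}

def step(state, pain_signal):
    """One tick: each region receives the raw pain input plus the
    weighted activations of the regions coupled to it, capped at 1.0."""
    new = {}
    for name in state:
        incoming = pain_signal
        for src, targets in coupling.items():
            incoming += state[src] * targets.get(name, 0.0)
        new[name] = min(1.0, incoming)
    return new

state = regions
for _ in range(5):
    state = step(state, pain_signal=0.5)

# After a few ticks, the heavily connected regions ("attention", "flee")
# saturate at 1.0: the system's focus has shifted to the pain, as
# described above.
print(state)
```

The circularity matters: "attention" feeds back into "alarm", so the loop amplifies itself rather than settling at the bare signal level.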



There are centers of the brain for sight, touch, language, hearing,
drawing, pain, etc.  They are all in some (or many) ways connected to each
other.  See this for more information:
http://en.wikipedia.org/wiki/Modularity_of_mind


> which have no experience or qualia whatsoever, yet can detect
> "notifications" of a presumably epiphenomenal "state" of  "pain".
>

Pain is anything but epiphenomenal.  The fact that someone is able to talk
about it rules out it being an epiphenomenon.


>
> If the brain is doing all of the work, why does the top level organism
> have some other worthless abstraction layer of "experience" when, as
> blindsight proves, we are perfectly capable of processing information
> without any conscious qualia at all.
>

It's not worthless at all.  Would you still be able to function if all you
knew were the raw firing data of the millions of photosensitive cells in
your retina?  No, it takes many layers of perception, detecting lines,
depth perception, motion, colors, objects, faces, etc. for the sense of
sight to be as useful as it is to us.  After the different layers process
this information and share it with the other brain regions, we lose the
ability to explain how it is we recognize a face, or how red differs from
green.  These determinations were made by a lower-level module whose
internal processing is not visible to other brain regions (such as the
brain region that talks), and so it remains mysterious.
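A rough sketch of that layering, in Python.  The stages and thresholds here
are invented; the only point is that each layer passes a summary upward, so
the "talking" layer never sees how the lower layers did their work:

```python
# Toy pipeline: raw "retina" values -> edge summary -> object label.
# Only the final label reaches the "talking" region; the raw data and
# the intermediate computations stay private to each layer.

def detect_edges(pixels):
    # Private lower-level work: mark where adjacent values jump sharply.
    return [abs(a - b) > 10 for a, b in zip(pixels, pixels[1:])]

def recognize(edges):
    # Private higher-level work: a crude rule mapping edge count to a label.
    return "face" if sum(edges) >= 2 else "blank wall"

def talking_region(label):
    # This region only ever receives the label, so it can report WHAT was
    # seen but has no access to HOW the recognition was done.
    return f"I see a {label}"

pixels = [0, 0, 50, 50, 0, 0]   # raw firing data
report = talking_region(recognize(detect_edges(pixels)))
print(report)                    # -> I see a face
```

Ask `talking_region` why it saw a face and it has nothing to say: the edge detection and the recognition rule are simply not part of its inputs, which is the sense in which recognition "remains mysterious" to the part of us that talks.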


>
> Information is very close to consciousness, but ultimately fails to
> sustain itself. The pixels on your screen have no way to detect each other
> or process the image that you see as a coherent gestalt, and the processor
> behind the graphics generation has no way to detect the visual end result,
> and if it did, it would be completely superfluous. Your graphics card does
> not need to see anything.
>

Of course the pixels don't process themselves.  You need a brain with
complex software and filters to make sense of the flood of photons entering
the eye.  And you need other regions of the brain to make sense of the
visual scene (to integrate it into an even larger context).


>
> To me it makes more sense to see information as nothing but the semiotic
> protocols developed by perceptual participation (experience) to elaborate
> and deepen the qualitative richness of those experiences.
>

I wish I did not have to struggle to translate your sentences so
frequently.  I completely failed on this one.


> Of course, the protocols which are maps of one level of experience are the
> territory of another, which is what makes it confusing to try to reverse
> engineer consciousness from such an incredibly complex example as a Homo
> sapien.
>

Definitely.  Our consciousness is not a simple thing; it involves hundreds
of billions of (literally) moving parts.


>
> Our pinch is a continuum of sensory, emotional, and cognitive interaction
> because we are made of the qualia of hundreds of billions of neurons
>

Okay.


> and billions of lifetimes of different species and substances.
>

I don't think the preceding lifetimes or substances are relevant.  If your
duplicate were created randomly by some quantum fluctuation, its brain would
create the same experience.


> That only means our pain can seem like information to us, not that all
> pain arises from information processing.
>

I think it is worth making the distinction that it is the system (doing
the processing) that has the experience, not the information or the
processing of the information.  The information, from the perspective of
the system, makes a difference to the system, causing it to enter different
states.  The ability to differentiate is at the heart of what it is to
perceive.


> Information does not concretely exist as an independent entity.
>

"X" does not concretely exist as an independent entity.

Is there any term "X", where the above sentence does not hold, in your view?


> There are forms which can be used to inform if they are intentionally
> treated that way, as a map, but nothing is just a map by itself. Every map
> is A territory (not THE territory) being used by another 'territory' as a
> map.
>

Maybe all there is are maps?


> I might use a piece of paper with ink on it (a territory) as a map because
> the ink is printed in a pre-configured protocol which I can learn to read
> easily as part of the intended audience of the map, or which I can learn to
> read even if I wasn't intended as an audience. Logic circuits don't do
> that. They don't care about learning. They store the recordings of our
> intentions, and reproduce them in a trivial and mechanistic way.
>

Just like our DNA stores the recordings of evolution's intentions, and we
follow those instructions in a reproducible mechanistic way (I won't say
trivial because not all machines are simple, and the resulting behaviors of
machines can be anything but trivial).

Jason

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.
