Brent Meeker wrote:
> > That notion may fit comfortably with your presumptive
> > ideas about 'memory' -- computer stored, special-neuron
> > stored, and similar. But the universe IS ITSELF 'memory
> > storage' from the start. Operational rules of performance
> > -- the laws of nature, so to speak ...
>
> Colin,
>
> You have described a way in which our perception may be more than can
> be explained by the sense data. However, how does this explain the
> response to novelty? I can come up with a plan or theory to deal with
> a novel situation if it is simply described to me. I don't have to ...
Colin Geoffrey Hales wrote:
> What I expect to happen is that the field configuration I find emerging in
> the guts of the chips will be different, depending on the object, even
> though the sensory measurement is identical. The different field
> configurations will correspond to the different objects ...
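The experiment Colin describes above is an empirical claim, but its shape can be caricatured in a few lines. A toy sketch follows, assuming a purely hypothetical internal "field" state that also couples to environmental structure the sensor never measures; none of the names, numbers, or functions below come from Colin's actual rig.

    # Toy illustration: two distinct objects yield the same sensor reading,
    # but the internal "field configuration" differs because it also couples
    # to unmeasured degrees of freedom. Entirely hypothetical.
    import numpy as np

    def sensor_reading(obj):
        # Both objects project onto the same measured quantity.
        return round(sum(obj["measured"]), 6)

    def field_configuration(obj, coupling=0.1):
        # Hypothetical internal state: depends on unmeasured structure too.
        return np.tanh(np.array(obj["measured"])
                       + coupling * np.array(obj["unmeasured"]))

    obj_a = {"measured": [0.2, 0.3, 0.5], "unmeasured": [1.0, -1.0, 0.0]}
    obj_b = {"measured": [0.2, 0.3, 0.5], "unmeasured": [0.0, 2.0, -2.0]}

    assert sensor_reading(obj_a) == sensor_reading(obj_b)  # identical measurement
    print(field_configuration(obj_a))  # ... yet different configurations
    print(field_configuration(obj_b))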
Colin,
You have described a way in which our perception may be more than can
be explained by the sense data. However, how does this explain the
response to novelty? I can come up with a plan or theory to deal with
a novel situation if it is simply described to me. I don't have to
actually perceive ...
James N Rose wrote:
> Brent Meeker wrote:
>
>> If consciousness is the creation of an inner narrative
>> to be stored in long-term memory then there are levels
>> of consciousness. The amoeba forms no memories and so
>> is not conscious at all. A dog forms memories and even
>> has some understanding of symbols (gestures, ...
Stathis said:
> If you present an object with "identical sensory measurements" but get
> different results in the chip, then that means what you took as "sensory
> measurements" was incomplete. For example, blind people might be able to
> sense the presence of someone who silently walks into the room due to ...
Colin Hales writes:
> Stathis said:
> > and Colin has said that he does not believe that philosophical zombies
> > can exist. Hence, he has to show not only that the computer model will
> > lack the 1st person experience, but also lack the 3rd person observable
> > behaviour of the ...
Colin,
I think there is a logical contradiction here. You say that the physical models
do, in fact, explain the 3rd person observable behaviour of a physical system.
A brain is a physical system with 3rd person observable behaviour. Therefore,
the models *must* predict *all* of the third person ...
Colin,
If there is nothing wrong with the equations, it is always possible to
predict the behaviour of any piece of matter, right? And living matter
is still matter, which obeys all of the physical laws all of the time,
right? It appeared from your previous posts that you would disagree
with ...
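The claim above, that the equations predict the behaviour of any piece of matter, is the standard picture, and it is mechanical enough to show in a few lines. A minimal sketch, with arbitrary illustrative values for the mass, stiffness, time step, and initial conditions:

    # Minimal sketch of "the equations predict matter": Euler integration
    # of Newton's second law for a mass on a spring, m*a = -k*x.
    # m, k, dt, and the initial conditions are arbitrary illustrative values.
    m, k = 1.0, 4.0          # mass (kg), spring stiffness (N/m)
    x, v = 1.0, 0.0          # initial position (m) and velocity (m/s)
    dt = 0.001               # integration time step (s)

    for _ in range(int(1.0 / dt)):   # predict one second of behaviour
        a = -k / m * x               # the physical law
        v += a * dt
        x += v * dt

    print(f"predicted position after 1 s: {x:.3f} m")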
Brent Meeker wrote:
> If consciousness is the creation of an inner narrative
> to be stored in long-term memory then there are levels
> of consciousness. The amoeba forms no memories and so
> is not conscious at all. A dog forms memories and even
> has some understanding of symbols (gestures, ...
Stathis said:
> and Colin has said that he does not believe that philosophical zombies
> can exist. Hence, he has to show not only that the computer model will
> lack the 1st person experience, but also lack the 3rd person observable
> behaviour of the real thing; and the latter can only be ...
Stathis said:
>
> I'll let Colin answer, but it seems to me he must say that some aspect
> of brain physics deviates from what the equations tell us (and deviates
> in an unpredictable way, otherwise it would just mean that different
> equations are required) to be consistent. If not, the ...
>
> I'm not sure of the details of your experiments, but wouldn't the most
> direct way to prove what you are saying be to isolate just that
> physical process which cannot be modelled? For example, if it is EM
> fields, set up an appropriately brain-like configuration of EM fields,
> introduce ...
James N Rose wrote:
> Just to throw a point of perspective into this
> conversation about mimicking qualia.
>
> I posed a thematic question in my 1992 opus
> "Understanding the Integral Universe".
>
> "What of a single celled animus like an amoeba or paramecium?
> Does it 'feel' itself? Does
Just to throw a point of perspective into this
conversation about mimicking qualia.

I posed a thematic question in my 1992 opus
"Understanding the Integral Universe".

"What of a single celled animus like an amoeba or paramecium?
Does it 'feel' itself? Does it sense the subtle variations
in its ...
Colin Geoffrey Hales wrote:
> Stathis wrote:
> I can understand that, for example, a computer simulation of a storm is
> not a storm, because only a storm is a storm and will get you wet. But
> perhaps counterintuitively, a model of a brain can be closer to the real
> thing than a model of a storm ...
Well this is fascinating! I tend to think that Brent's 'simplistic'
approach of setting up oscillating EM fields of specific frequencies at
specific locations is more likely to be good evidence of EM involvement
in qualia, because the victim, I mean experimental subject, can relate
what is happening ...
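Brent's proposed setup, oscillating EM fields of specific frequencies at specific locations, is concrete enough to sketch. A minimal sketch follows, assuming a hypothetical stimulus built from point sources oscillating at chosen frequencies at chosen positions, with an inverse-square falloff; every frequency, location, and amplitude below is illustrative, not taken from anyone's experiment.

    # Hypothetical "oscillating EM fields of specific frequencies at
    # specific locations": superpose point oscillators and sample the
    # resulting scalar field magnitude at a probe point over time.
    import numpy as np

    sources = [                                # (location (m), frequency (Hz), amplitude)
        (np.array([0.00, 0.02]), 40.0, 1.0),   # a gamma-band oscillator
        (np.array([0.03, 0.00]),  8.0, 0.5),   # an alpha-band oscillator
    ]

    def field_at(x, t):
        """Superposed field magnitude at point x and time t (arbitrary units)."""
        total = 0.0
        for loc, freq, amp in sources:
            r2 = np.sum((x - loc) ** 2) + 1e-6   # soften the 1/r^2 singularity
            total += amp * np.sin(2 * np.pi * freq * t) / r2
        return total

    probe = np.array([0.01, 0.01])       # point where the field is sampled
    ts = np.linspace(0.0, 0.1, 1000)     # 100 ms of samples
    trace = [field_at(probe, t) for t in ts]
    print(min(trace), max(trace))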
Brent Meeker wrote:
> Stathis Papaioannou wrote:
> >
> > Colin Hales writes:
> >
> >>> I understand your conclusion, that a model of a brain
> >>> won't be able to handle novelty like a real brain,
> >>> but I am trying to understand the nuts and
> >>> bolts of how the model is going to fail. For ...
Colin Geoffrey Hales wrote:
> >
> > I understand your conclusion, that a model of a brain
> > won't be able to handle novelty like a real brain,
> > but I am trying to understand the nuts and
> > bolts of how the model is going to fail. For
> > example, you can say that perpetual motion
> > machines ...
Brent Meeker writes:
[Colin]
> >> So I guess my proclamations about models are all contingent on my own
> >> view of things... and I could be wrong. Only time will tell. I have good
> >> physical grounds to doubt that modelling can work and I have a way of
> >> testing it. So at least it can be tested ...