Brent Meeker wrote:
> > That notion may fit comfortably with your presumptive
> > ideas about 'memory' -- computer stored, special-neuron
> > stored, and similar. But the universe IS ITSELF 'memory
> > storage' from the start. Operational rules of performance
> > -- the laws of nature, so to speak ...
>
> Colin,
>
> You have described a way in which our perception may be more than
> can be explained by the sense data. However, how does this explain
> the response to novelty? I can come up with a plan or theory to deal
> with a novel situation if it is simply described to me. I don't have to ...
Colin Geoffrey Hales wrote:
> What I expect to happen is that the field configuration I find emerging in
> the guts of the chips will be different, depending on the object, even
> though the sensory measurement is identical. The different field
> configurations will correspond to the different objects ...
... actually perceive anything. Writers, philosophers, mathematicians
can all be creative without perceiving anything.
Stathis Papaioannou
James N Rose wrote:
>
> Brent Meeker wrote:
>
>> If consciousness is the creation of an inner narrative
>> to be stored in long-term memory then there are levels
>> of consciousness. The amoeba forms no memories and so
>> is not conscious at all. A dog forms memories and even
>> has some understanding of symbols (gestures, ...
Stathis said:
> If you present an object with "identical sensory measurements" but get
> different results in the chip, then that means what you took as "sensory
> measurements" was incomplete. For example, blind people might be able to
> sense the presence of someone who silently walks into the room ...
Colin Hales writes:
>
> Stathis said
> > and Colin has said that he does not believe that philosophical zombies
> > can exist. Hence, he has to show not only that the computer model will
> > lack the 1st person experience, but also lack the 3rd person observable
> > behaviour of the real thing ...
> Stathis said
> > I'll let Colin answer, but it seems to me he must say that some aspect
> > of brain physics ...
Stathis
> > I'm not sure of the details of your experiments, but wouldn't ...
Brent Meeker wrote:
> If consciousness is the creation of an inner narrative
> to be stored in long-term memory then there are levels
> of consciousness. The amoeba forms no memories and so
> is not conscious at all. A dog forms memories and even
> has some understanding of symbols (gestures, ...
Stathis said
> and Colin has said that he does not believe that philosophical zombies
> can exist. Hence, he has to show not only that the computer model will
> lack the 1st person experience, but also lack the 3rd person observable
> behaviour of the real thing; and the latter can only be ...
Stathis said
> I'll let Colin answer, but it seems to me he must say that some aspect
> of brain physics deviates from what the equations tell us (and deviates
> in an unpredictable way, otherwise it would just mean that different
> equations are required) to be consistent. If not, the ...
>
> I'm not sure of the details of your experiments, but wouldn't the most
> direct way to prove what you are saying be to isolate just that physical
> process which cannot be modelled? For example, if it is EM fields, set up
> an appropriately brain-like configuration of EM fields, introduce ...
James N Rose wrote:
> Just to throw a point of perspective into this
> conversation about mimicking qualia.
>
> I posed a thematic question in my 1992 opus
> "Understanding the Integral Universe".
>
> "What of a single celled animus like an amoeba or paramecium?
> Does it 'feel' itself? Does it sense the subtle variations ...
Just to throw a point of perspective into this
conversation about mimicking qualia.
I posed a thematic question in my 1992 opus
"Understanding the Integral Universe".
"What of a single celled animus like an amoeba or paramecium?
Does it 'feel' itself? Does it sense the subtle variations
in ...
Colin Geoffrey Hales wrote:
> Stathis wrote:
> I can understand that, for example, a computer simulation of a storm is
> not a storm, because only a storm is a storm and will get you wet. But
> perhaps counterintuitively, a model of a brain can be closer to the real
> thing than a model of a storm ...
Well this is fascinating! I tend to think that Brent's 'simplistic'
approach of setting up oscillating EM fields of specific frequencies at
specific locations is more likely to be good evidence of EM involvement
in qualia, because the victim, I mean experimental subject, can relate
what is happening ...
Brent Meeker wrote:
> Stathis Papaioannou wrote:
> >
> > Colin Hales writes:
> >
> >>> I understand your conclusion, that a model of a brain
> >>> won't be able to handle novelty like a real brain,
> >>> but I am trying to understand the nuts and
> >>> bolts of how the model is going to fail. For
> >>> example, you can say that perpetual motion ...
Colin Geoffrey Hales wrote:
> >
> > I understand your conclusion, that a model of a brain
> > won't be able to handle novelty like a real brain,
> > but I am trying to understand the nuts and
> > bolts of how the model is going to fail. For
> > example, you can say that perpetual motion
> > machines are impossible because they disobey ...
Brent Meeker writes:
[Colin]
> >> So I guess my proclamations about models are all contingent on my own
> >> view of things ... and I could be wrong. Only time will tell. I have good
> >> physical grounds to doubt that modelling can work and I have a way of
> >> testing it. So at least it can be ...
Stathis Papaioannou wrote:
>
> Colin Hales writes:
>
>>> I understand your conclusion, that a model of a brain
>>> won't be able to handle novelty like a real brain,
>>> but I am trying to understand the nuts and
>>> bolts of how the model is going to fail. For
>>> example, you can say that perpetual motion ...
Colin Hales writes:
> > I understand your conclusion, that a model of a brain
> > won't be able to handle novelty like a real brain,
> > but I am trying to understand the nuts and
> > bolts of how the model is going to fail. For
> > example, you can say that perpetual motion
> > machines are impossible because they disobey ...
>
> I understand your conclusion, that a model of a brain
> won't be able to handle novelty like a real brain,
> but I am trying to understand the nuts and
> bolts of how the model is going to fail. For
> example, you can say that perpetual motion
> machines are impossible because they disobey
> the ...
>
> So the EM fields account for the experiences that accompany the brain
> processes. A kind of epiphenomenon.
>
> So why don't my experiences change when I'm in an MRI?
>
I haven't been through the detail - I hope to verify this in my
simulations to come but ...
As far as I am aware MRI magnetic ...
Colin Hales writes:
> Stathis wrote:
> I can understand that, for example, a computer simulation of a storm is
> not a storm, because only a storm is a storm and will get you wet. But
> perhaps counterintuitively, a model of a brain can be closer to the real
> thing than a model of a storm. We don't ...
Colin Geoffrey Hales wrote:
>> So your theory is that the electromagnetic field has an ability to learn
>> which is not reflected in QED - it's some hitherto unknown aspect of the
>> field and it doesn't show up in the field violating Maxwell's equations
>> or QED predictions? And further this aspect ...
>
> So your theory is that the electromagnetic field has an ability to learn
> which is not reflected in QED - it's some hitherto unknown aspect of the
> field and it doesn't show up in the field violating Maxwell's equations
> or QED predictions? And further this aspect of the EM field is able to ...
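For reference, 'violating Maxwell's equations' has a concrete meaning here: the measured fields would have to depart from the classical constraints, written below in standard differential SI form (this is the textbook statement, not anything specific to Colin's proposal):

\nabla \cdot \mathbf{E} = \rho / \varepsilon_0
\nabla \cdot \mathbf{B} = 0
\nabla \times \mathbf{E} = -\partial \mathbf{B} / \partial t
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \, \partial \mathbf{E} / \partial t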
Colin Geoffrey Hales wrote:
> Stathis wrote:
> I can understand that, for example, a computer simulation of a storm is
> not a storm, because only a storm is a storm and will get you wet. But
> perhaps counterintuitively, a model of a brain can be closer to the real
> thing than a model of a storm ...
Stathis wrote:
I can understand that, for example, a computer simulation of a storm is
not a storm, because only a storm is a storm and will get you wet. But
perhaps counterintuitively, a model of a brain can be closer to the real
thing than a model of a storm. We don't normally see inside a person ...
Brent said:
> Of course they describe things - they aren't the things themselves.
> But the question is whether the description is complete. Is there
> anything about EM fields that is not described by QED?
Absolutely HEAPS! Everything that they are made of and how the components
interact to make ...
Colin Geoffrey Hales wrote:
>> So you are saying the special something which causes
>> consciousness and which functionalism has ignored
>> is the electric field around the neuron/astrocyte.
>> But electric fields were well understood even a
>> hundred years ago, weren't they? Why couldn't
>> a neuron be simulated by something ...
Colin,
I can understand that, for example, a computer simulation of a storm is
not a storm, because only a storm is a storm and will get you wet. But
perhaps counterintuitively, a model of a brain can be closer to the real
thing than a model of a storm. We don't normally see inside a person ...
>
> So you are saying the special something which causes
> consciousness and which functionalism has ignored
> is the electric field around the neuron/astrocyte.
> But electric fields were well understood even a
> hundred years ago, weren't they? Why couldn't
> a neuron be simulated by something ...
Colin Hales writes:
> There's a whole axis of modelling orthogonal to the soma membrane which
> gets statistically abstracted out by traditional Hodgkin/Huxley models.
> The neuron becomes geometry-less (except for when the HH model is made
> into 'cable'/compartmental equivalents for longitudinal ...
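For concreteness, the 'geometry-less' abstraction Colin refers to is the single-compartment (point) Hodgkin-Huxley model: the whole membrane collapsed to one isopotential node. A minimal sketch in Python, using the standard squid-axon parameters; the forward-Euler step and constant drive current are simplifying assumptions for illustration only:

import math

# Single-compartment Hodgkin-Huxley neuron: no spatial geometry,
# just one membrane patch. Standard squid-axon parameters.
C = 1.0                                  # membrane capacitance, uF/cm^2
gNa, gK, gL = 120.0, 36.0, 0.3           # max conductances, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387       # reversal potentials, mV

# Voltage-dependent rate functions for the gating variables m, h, n.
am = lambda V: 0.1 * (V + 40.0) / (1.0 - math.exp(-(V + 40.0) / 10.0))
bm = lambda V: 4.0 * math.exp(-(V + 65.0) / 18.0)
ah = lambda V: 0.07 * math.exp(-(V + 65.0) / 20.0)
bh = lambda V: 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))
an = lambda V: 0.01 * (V + 55.0) / (1.0 - math.exp(-(V + 55.0) / 10.0))
bn = lambda V: 0.125 * math.exp(-(V + 65.0) / 80.0)

V, m, h, n = -65.0, 0.053, 0.596, 0.317  # resting initial state
dt, I_inj = 0.01, 10.0                   # time step (ms); injected uA/cm^2

for step in range(int(50.0 / dt)):       # simulate 50 ms
    I_ion = (gNa * m**3 * h * (V - ENa)  # Na+ current
             + gK * n**4 * (V - EK)      # K+ current
             + gL * (V - EL))            # leak current
    V += dt * (I_inj - I_ion) / C        # membrane equation
    m += dt * (am(V) * (1 - m) - bm(V) * m)
    h += dt * (ah(V) * (1 - h) - bh(V) * h)
    n += dt * (an(V) * (1 - n) - bn(V) * n)
    if step % 500 == 0:
        print(f"t={step * dt:5.1f} ms  V={V:7.2f} mV")

Note how nothing in this state vector knows anything about dendrites, soma shape, or field geometry; that is the abstraction being criticised.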
Stathis Papaioannou wrote:
>
> Brent Meeker writes:
>> Stathis Papaioannou wrote:
>>> Brent Meeker writes:
>>>
>>>> I would say that many complex mechanical systems react to "pain" in a
>>>> way similar to simple animals. For example, aircraft have automatic
>>>> shut downs and fire extinguishers ...
... be pathetic. It would have no clue where it was and learn nothing
looking remotely normal. Meanwhile Marvin inside can do perfectly good
'zombie room' science.
RE: Computer Pain
There's a whole axis of modelling orthogonal to the soma membrane which
gets statistically abstracted out by ...
Yes Stathis, you are right, 'noxious stimulus' and
'experience' are indeed separable - but - if you want to
do an analysis comparing them, it's important to identify
global parameters and potential analogs.
My last post's example tried to address those components.
I've seen stress diagrams of diff...
> Hi Stathis/Jamie et al.
> I've been busy elsewhere in self-preservation mode deleting emails
> madly ... frustrating, with so many threads left hanging ... oh well ... but
> I couldn't go past this particular dialog ...
Brent Meeker writes:
>
> Stathis Papaioannou wrote:
> >
> > Brent Meeker writes:
> >
> >> I would say that many complex mechanical systems react to "pain" in a
> >> way similar to simple animals. For example, aircraft have automatic
> >> shut downs and fire extinguishers. They can change the flight controls
> >> to reduce stress ...
Jamie Rose writes:
> Stathis,
>
> As I was reading your comments this morning, an example
> crossed my mind that might fit your description of in-place
> code lines that monitor 'dysfunction' and exist in-situ as
> a 'pain' alert .. that would be error-evaluating 'check-sum'
> computations ...
Hi Stathis/Jamie et al.
I've been busy elsewhere in self-preservation mode deleting emails
madly ... frustrating, with so many threads left hanging ... oh well ... but
I couldn't go past this particular dialog.
I am having trouble accepting that you actually believe the below to be the case!
Lines of code ...
Stathis Papaioannou wrote:
>
> Brent Meeker writes:
>
>> I would say that many complex mechanical systems react to "pain" in a way
>> similar to simple animals. For example, aircraft have automatic shut downs
>> and fire extinguishers. They can change the flight controls to reduce
>> stress on structures ...
Stathis,
As I was reading your comments this morning, an example
crossed my mind that might fit your description of in-place
code lines that monitor 'dysfunction' and exist in-situ as
a 'pain' alert .. that would be error-evaluating 'check-sum'
computations.
In a functional way, parallel check-sums ...
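A minimal sketch of the kind of in-place check-sum 'pain' monitor Jamie is describing, in Python; the class name, the CRC choice, and the corruption demo are illustrative assumptions, not anything from his post:

import zlib

class ChecksumPainMonitor:
    """Guards a block of 'memory' (here, a bytes buffer) with a CRC.
    A mismatch on re-check is surfaced as a 'pain' signal."""
    def __init__(self, buffer: bytearray):
        self.buffer = buffer
        self.expected = zlib.crc32(bytes(buffer))  # checksum at install time

    def check(self) -> bool:
        """Re-evaluate the checksum in place; 'pain' == detected corruption."""
        actual = zlib.crc32(bytes(self.buffer))
        if actual != self.expected:
            self.on_pain(actual)
            return False
        return True

    def on_pain(self, actual: int) -> None:
        # In the analogy, this is the alert the rest of the system
        # must respond to (re-fetch, quarantine, shut down).
        print(f"PAIN: corruption detected (crc {actual:#010x} != {self.expected:#010x})")

# Usage: corrupt one byte and the monitor 'hurts'.
mem = bytearray(b"stable stored state")
monitor = ChecksumPainMonitor(mem)
assert monitor.check()          # healthy
mem[3] ^= 0xFF                  # simulated bit-flip damage
monitor.check()                 # prints the pain alert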
Brent Meeker writes:
> I would say that many complex mechanical systems react to "pain" in a way
> similar to simple animals. For example, aircraft have automatic shut downs
> and fire extinguishers. They can change the flight controls to reduce stress
> on structures. Whether they feel the ...
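Brent's aircraft example reduces to a protective control reflex, which is easy to state in code. A toy Python sketch; the threshold and the proportional back-off are invented for illustration, and real flight-control envelope protection is far more involved:

# Toy 'nociceptive' flight-control reflex: when measured stress on a
# structure exceeds a limit, the controller reduces the commanded load.
STRESS_LIMIT = 0.8   # fraction of rated structural load (illustrative)

def control_step(commanded_g: float, measured_stress: float) -> float:
    """Return the g-load actually allowed this cycle."""
    if measured_stress > STRESS_LIMIT:
        # Protective withdrawal: back off in proportion to the overload,
        # the functional analogue of pulling a hand off a hot stove.
        relief = (measured_stress - STRESS_LIMIT) / (1.0 - STRESS_LIMIT)
        return commanded_g * max(0.0, 1.0 - relief)
    return commanded_g

print(control_step(4.0, 0.5))   # within limits: command passes through
print(control_step(4.0, 0.95))  # overstressed: command is attenuated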
> Stathis,
>
> The reason for lack of responses is that your idea
> goes directly to illuminating why AI systems - as
> promulgated under current designs of software
> running in hardware matrices - CANNOT emulate living systems ...
James N Rose wrote:
> Stathis,
>
> The reason for lack of responses is that your idea
> goes directly to illuminating why AI systems - as
> promulgated under current designs of software
> running in hardware matrices - CANNOT emulate living
> systems. It's an issue that AI advocates intuitively
> and scrupulously AVOID ...
Stathis,
The reason for lack of responses is that your idea
goes directly to illuminating why AI systems - as
promulgated under current designs of software
running in hardware matrices - CANNOT emulate living
systems. It's an issue that AI advocates intuitively
and scrupulously AVOID.
"Pain" in ...
No responses yet to this question. It seems to me a straightforward
consequence of computationalism that we should be able to write a program
which, when run, will experience pain, and I suspect that this would be a
substantially simpler program than one demonstrating general intelligence. It ...
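For what it's worth, the functional skeleton such a program is usually imagined to have is very small: a scalar damage signal the system is built to reduce, modulating everything else it does. Whether running it (or any elaboration of it) would *experience* anything is exactly the point in dispute. A Python sketch, all names invented:

import random

class Agent:
    """Minimal 'pain' architecture: a damage signal whose reduction
    drives behaviour. Functional pain, not (necessarily) felt pain."""
    def __init__(self):
        self.pain = 0.0

    def sense(self, damage: float) -> None:
        self.pain += damage          # nociception: damage raises the signal

    def act(self) -> str:
        if self.pain > 0.5:
            self.pain *= 0.5         # withdrawal behaviour reduces exposure
            return "withdraw"
        self.pain = max(0.0, self.pain - 0.05)  # slow recovery otherwise
        return "explore"

agent = Agent()
for step in range(5):
    agent.sense(random.uniform(0.0, 0.4))
    print(step, agent.act(), round(agent.pain, 2))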