Re: computer pain

2006-12-17 Thread James N Rose
Brent Meeker wrote: > That notion may fit comfortably with your presumptive ideas about 'memory' -- computer stored, special-neuron stored, and similar. But the universe IS ITSELF 'memory storage' from the start. Operational rules of performance -- the laws of nature, so to

RE: computer pain

2006-12-17 Thread Colin Geoffrey Hales
> Colin, > You have described a way in which our perception may be more than can be explained by the sense data. However, how does this explain the response to novelty? I can come up with a plan or theory to deal with a novel situation if it is simply described to me. I don't have to

Re: computer pain

2006-12-17 Thread 1Z
Colin Geoffrey Hales wrote: > What I expect to happen is that the field configuration I find emerging in the guts of the chips will be different, depending on the object, even though the sensory measurement is identical. The different field configurations will correspond to the different o

RE: computer pain

2006-12-17 Thread Stathis Papaioannou
ually perceive anything. Writers, philosophers, mathematicians can all be creative without perceiving anything. Stathis Papaioannou > Date: Mon, 18 Dec 2006 10:54:05 +1100 > From: [EMAIL PROTECTED] > Subject: RE: computer pain > To

Re: computer pain

2006-12-17 Thread Brent Meeker
James N Rose wrote: > Brent Meeker wrote: >> If consciousness is the creation of an inner narrative to be stored in long-term memory then there are levels of consciousness. The amoeba forms no memories and so is not conscious at all. A dog forms memories and even has some un

RE: computer pain

2006-12-17 Thread Colin Geoffrey Hales
Stathis said: > If you present an object with "identical sensory measurements" but get different results in the chip, then that means what you took as "sensory measurements" was incomplete. For example, blind people might be able to sense the presence of someone who silently walks into the room du

RE: computer pain

2006-12-17 Thread Stathis Papaioannou
Colin Hales writes: > Stathis said <> >> and Colin has said that he does not believe that philosophical zombies can exist. >> Hence, he has to show not only that the computer model will lack the 1st person experience, but also lack the 3rd person observable behaviour of the

RE: computer pain

2006-12-17 Thread Stathis Papaioannou
apaioannou > Date: Mon, 18 Dec 2006 07:42:38 +1100 > From: [EMAIL PROTECTED] > Subject: RE: computer pain > To: everything-list@googlegroups.com > Stathis said >> I'll let Colin answer, but it seems to me he must say that some aspect of b

RE: computer pain

2006-12-17 Thread Stathis Papaioannou
. Stathis > Date: Mon, 18 Dec 2006 07:17:10 +1100 > From: [EMAIL PROTECTED] > Subject: RE: computer pain > To: everything-list@googlegroups.com >> I'm not sure of the details of your experiments, but wouldn

Re: computer pain

2006-12-17 Thread James N Rose
Brent Meeker wrote: > If consciousness is the creation of an inner narrative to be stored in long-term memory then there are levels of consciousness. The amoeba forms no memories and so is not conscious at all. A dog forms memories and even has some understanding of symbols (gestures,

RE: computer pain

2006-12-17 Thread Colin Geoffrey Hales
Stathis said <> > and Colin has said that he does not believe that philosophical zombies can exist. > Hence, he has to show not only that the computer model will lack the 1st person experience, but also lack the 3rd person observable behaviour of the real thing; and the latter can only be

RE: computer pain

2006-12-17 Thread Colin Geoffrey Hales
Stathis said > I'll let Colin answer, but it seems to me he must say that some aspect of brain physics deviates from what the equations tell us (and deviates in an unpredictable way, otherwise it would just mean that different equations are required) to be consistent. If not, the

RE: computer pain

2006-12-17 Thread Colin Geoffrey Hales
> I'm not sure of the details of your experiments, but wouldn't the most direct way to prove what you are saying be to isolate just that physical process which cannot be modelled? For example, if it is EM fields, set up an appropriately brain-like configuration of EM fields, introduce

Re: computer pain

2006-12-17 Thread Brent Meeker
James N Rose wrote: > Just to throw a point of perspective into this conversation about mimicking qualia. I posed a thematic question in my 1992 opus "Understanding the Integral Universe". "What of a single celled animus like an amoeba or paramecium? Does it 'feel' itself? Does

Re: computer pain

2006-12-17 Thread James N Rose
Just to throw a point of perspective into this conversation about mimicking qualia. I posed a thematic question in my 1992 opus "Understanding the Integral Universe". "What of a single celled animus like an amoeba or paramecium? Does it 'feel' itself? Does it sense the subtle variations in i

Re: computer pain

2006-12-17 Thread 1Z
Colin Geoffrey Hales wrote: > Stathis wrote: I can understand that, for example, a computer simulation of a storm is not a storm, because only a storm is a storm and will get you wet. But perhaps counterintuitively, a model of a brain can be closer to the real thing than a model of a stor

Re: computer pain

2006-12-17 Thread Mark Peaty
Well this is fascinating! I tend to think that Brent's 'simplistic' approach of setting up oscillating EM fields of specific frequencies at specific locations is more likely to be good evidence of EM involvement in qualia, because the victim, I mean experimental subject, can relate what is hap

Re: computer pain

2006-12-17 Thread 1Z
Brent Meeker wrote: > Stathis Papaioannou wrote: >> Colin Hales writes: >>> I understand your conclusion, that a model of a brain won't be able to handle novelty like a real brain, but I am trying to understand the nuts and bolts of how the model is going to fail. For

Re: computer pain

2006-12-17 Thread 1Z
Colin Geoffrey Hales wrote: >> I understand your conclusion, that a model of a brain won't be able to handle novelty like a real brain, but I am trying to understand the nuts and bolts of how the model is going to fail. For example, you can say that perpetual motion machi

RE: computer pain

2006-12-17 Thread Stathis Papaioannou
Brent Meeker writes: [Colin] >> So I guess my proclamations about models are all contingent on my own view of things...and I could be wrong. Only time will tell. I have good physical grounds to doubt that modelling can work and I have a way of testing it. So at least it can be

Re: computer pain

2006-12-16 Thread Brent Meeker
Stathis Papaioannou wrote: > Colin Hales writes: >>> I understand your conclusion, that a model of a brain won't be able to handle novelty like a real brain, but I am trying to understand the nuts and bolts of how the model is going to fail. For example, you can say that perp

RE: computer pain

2006-12-16 Thread Stathis Papaioannou
Colin Hales writes: >> I understand your conclusion, that a model of a brain won't be able to handle novelty like a real brain, but I am trying to understand the nuts and bolts of how the model is going to fail. For example, you can say that perpetual motion machines are imp

RE: computer pain

2006-12-16 Thread Colin Geoffrey Hales
> I understand your conclusion, that a model of a brain won't be able to handle novelty like a real brain, but I am trying to understand the nuts and bolts of how the model is going to fail. For example, you can say that perpetual motion machines are impossible because they disobey t

Re: computer pain

2006-12-16 Thread Colin Geoffrey Hales
> So the EM fields account for the experiences that accompany the brain processes. A kind of epiphenomenon. > So why don't my experiences change when I'm in an MRI? I haven't been through the detail - I hope to verify this in my simulations to come but... As far as I am aware MRI magne

RE: computer pain

2006-12-16 Thread Stathis Papaioannou
Colin Hales writes: > Stathis wrote: I can understand that, for example, a computer simulation of a storm is not a storm, because only a storm is a storm and will get you wet. But perhaps counterintuitively, a model of a brain can be closer to the real thing than a model of a storm. We d

Re: computer pain

2006-12-16 Thread Brent Meeker
Colin Geoffrey Hales wrote: >> So your theory is that the electromagnetic field has an ability to learn which is not reflected in QED - it's some hitherto unknown aspect of the field and it doesn't show up in the field violating Maxwell's equations or QED predictions? And further this as

Re: computer pain

2006-12-15 Thread Colin Geoffrey Hales
> So your theory is that the electromagnetic field has an ability to learn which is not reflected in QED - it's some hitherto unknown aspect of the field and it doesn't show up in the field violating Maxwell's equations or QED predictions? And further this aspect of the EM field is able to ef
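
For reference, the classical constraints being appealed to in this exchange are just Maxwell's equations (microscopic form, SI units); the question being pressed is how any learning-like aspect of the field could exist without showing up as a violation of these or of their QED refinement:

    \begin{align}
      \nabla \cdot \mathbf{E} &= \rho/\varepsilon_0, &
      \nabla \cdot \mathbf{B} &= 0, \\
      \nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t}, &
      \nabla \times \mathbf{B} &= \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}.
    \end{align}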

Re: computer pain

2006-12-15 Thread Brent Meeker
Colin Geoffrey Hales wrote: > Stathis wrote: I can understand that, for example, a computer simulation of a storm is not a storm, because only a storm is a storm and will get you wet. But perhaps counterintuitively, a model of a brain can be closer to the real thing than a model of a storm

RE: computer pain

2006-12-15 Thread Colin Geoffrey Hales
Stathis wrote: I can understand that, for example, a computer simulation of a storm is not a storm, because only a storm is a storm and will get you wet. But perhaps counterintuitively, a model of a brain can be closer to the real thing than a model of a storm. We don't normally see inside a perso

Re: computer pain

2006-12-15 Thread Colin Geoffrey Hales
Brent said: > Of course they describe things - they aren't the things themselves. But the question is whether the description is complete. Is there anything about EM fields that is not described by QED? Absolutely HEAPS! Everything that they are made of and how the components interact to mak

Re: computer pain

2006-12-15 Thread Brent Meeker
Colin Geoffrey Hales wrote: >> So you are saying the special something which causes consciousness and which functionalism has ignored is the electric field around the neuron/astrocyte. But electric fields were well understood even a hundred years ago, weren't they? Why couldn't a ne

RE: computer pain

2006-12-15 Thread Stathis Papaioannou
Colin, I can understand that, for example, a computer simulation of a storm is not a storm, because only a storm is a storm and will get you wet. But perhaps counterintuitively, a model of a brain can be closer to the real thing than a model of a storm. We don't normally see inside a perso

RE: computer pain

2006-12-15 Thread Colin Geoffrey Hales
> So you are saying the special something which causes consciousness and which functionalism has ignored is the electric field around the neuron/astrocyte. But electric fields were well understood even a hundred years ago, weren't they? Why couldn't a neuron be simulated by something l

RE: computer pain

2006-12-14 Thread Stathis Papaioannou
Colin Hales writes: > There's a whole axis of modelling orthogonal to the soma membrane which gets statistically abstracted out by traditional Hodgkin/Huxley models. The neuron becomes geometry-less (except for when the HH model is made into 'cable'/compartmental equivalents for longitudi
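
For context on the 'geometry-less' point Colin is making: the standard Hodgkin-Huxley formulation treats the whole neuron as a single isopotential compartment, so everything orthogonal to the membrane (geometry, surrounding fields) is collapsed into lumped conductances. A minimal sketch in Python, using the textbook squid-axon parameters (illustrative only, not tied to any particular simulator):

    # Minimal, geometry-less Hodgkin-Huxley point neuron (classic squid-axon
    # parameters). The whole neuron is one isopotential compartment, so spatial
    # structure and extracellular fields do not appear anywhere in the model.
    import math

    C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3      # uF/cm^2, mS/cm^2
    E_Na, E_K, E_L = 50.0, -77.0, -54.4              # reversal potentials, mV

    def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - math.exp(-(V + 40.0) / 10.0))
    def beta_m(V):  return 4.0 * math.exp(-(V + 65.0) / 18.0)
    def alpha_h(V): return 0.07 * math.exp(-(V + 65.0) / 20.0)
    def beta_h(V):  return 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))
    def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - math.exp(-(V + 55.0) / 10.0))
    def beta_n(V):  return 0.125 * math.exp(-(V + 65.0) / 80.0)

    def simulate(I_inj=10.0, dt=0.01, t_max=50.0):
        """Forward-Euler integration of the membrane equation for a constant
        injected current I_inj (uA/cm^2). Returns a list of (t, V) samples."""
        V, m, h, n = -65.0, 0.05, 0.6, 0.32
        trace, t = [], 0.0
        while t < t_max:
            I_Na = g_Na * m**3 * h * (V - E_Na)
            I_K = g_K * n**4 * (V - E_K)
            I_L = g_L * (V - E_L)
            V += dt * (I_inj - I_Na - I_K - I_L) / C_m
            m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
            h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
            n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
            trace.append((t, V))
            t += dt
        return trace

Compartmental or 'cable' models chain many such units along the neuron's length, which restores longitudinal geometry but still abstracts away the field structure Colin is pointing at.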

Re: computer pain

2006-12-14 Thread Brent Meeker
Stathis Papaioannou wrote: > Brent Meeker writes: >> Stathis Papaioannou wrote: >>> Brent Meeker writes: >>>> I would say that many complex mechanical systems react to "pain" in a way similar to simple animals. For example, aircraft have automatic shut downs and fire extinguish

RE: computer pain

2006-12-14 Thread Colin Geoffrey Hales
be pathetic. It would have no clue where it was and learn nothing looking remotely normal. Meanwhile Marvin inside can do perfectly good 'zombie room' science. RE: Computer Pain There's a whole axis of modelling orthogonal to the soma membrane which gets statistically abstracted out by

Re: computer pain

2006-12-14 Thread James N Rose
Yes Stathis, you are right, 'noxious stimulus' and 'experience' are indeed separable - but - if you want to do a comparative analysis, it's important to identify global parameters and potential analogs. My last post's example tried to address those components. I've seen stress diagrams of diff

RE: computer pain

2006-12-14 Thread Stathis Papaioannou
> From: [EMAIL PROTECTED] > Subject: RE: computer pain > To: everything-list@googlegroups.com > Hi Stathis/Jamie et al. > I've been busy elsewhere in self-preservation mode deleting emails madly... frustrating, with so many threads left hanging... oh well... but

RE: computer pain

2006-12-14 Thread Stathis Papaioannou
Brent Meeker writes: > Stathis Papaioannou wrote: >> Brent Meeker writes: >>> I would say that many complex mechanical systems react to "pain" in a way similar to simple animals. For example, aircraft have automatic shut downs and fire extinguishers. They can change t

RE: computer pain

2006-12-14 Thread Stathis Papaioannou
Jamie Rose writes: > Stathis, > As I was reading your comments this morning, an example crossed my mind that might fit your description of in-place code lines that monitor 'dysfunction' and exist in-situ as a 'pain' alert ... that would be error evaluating 'check-sum' computations.

RE: computer pain

2006-12-13 Thread Colin Geoffrey Hales
Hi Stathis/Jamie et al. I've been busy elsewhere in self-preservation mode deleting emails madly... frustrating, with so many threads left hanging... oh well... but I couldn't go past this particular dialog. I am having trouble accepting that you actually believe the below to be the case! Lines of cod

Re: computer pain

2006-12-13 Thread Brent Meeker
Stathis Papaioannou wrote: > Brent Meeker writes: >> I would say that many complex mechanical systems react to "pain" in a way similar to simple animals. For example, aircraft have automatic shut downs and fire extinguishers. They can change the flight controls to reduce stress

Re: computer pain

2006-12-13 Thread James N Rose
Stathis, As I was reading your comments this morning, an example crossed my mind that might fit your description of in-place code lines that monitor 'dysfunction' and exist in-situ as a 'pain' alert ... that would be error evaluating 'check-sum' computations. In a functional way, parallel check-s
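
A rough sketch of the in-place self-check Jamie seems to have in mind, with a failed checksum mapped onto a 'pain' flag (all names are invented for illustration, and nothing here bears on whether such a signal is felt):

    # Illustrative only: a process that checksums its own state and raises a
    # "pain" signal when verification fails, in the spirit of the check-sum
    # analogy. No claim is made that such a signal is experienced.
    import hashlib

    class SelfCheckingStore:
        def __init__(self, data: bytes):
            self.data = data
            self._checksum = self._digest(data)
            self.pain = 0.0  # scalar "distress" level, purely a label

        @staticmethod
        def _digest(data: bytes) -> str:
            return hashlib.sha256(data).hexdigest()

        def verify(self) -> bool:
            """In-situ integrity check: a mismatch bumps the 'pain' level."""
            ok = self._digest(self.data) == self._checksum
            if not ok:
                self.pain += 1.0  # 'noxious stimulus' detected
            return ok

    store = SelfCheckingStore(b"synaptic weights v1")
    store.data = b"synaptic weights v1 (corrupted)"  # simulate damage
    store.verify()
    print(store.pain)  # 1.0 -- the dysfunction was detected and flagged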

RE: computer pain

2006-12-13 Thread Stathis Papaioannou
Brent Meeker writes: > I would say that many complex mechanical systems react to "pain" in a way similar to simple animals. For example, aircraft have automatic shut downs and fire extinguishers. They can change the flight controls to reduce stress on structures. Whether they feel th
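
The protective reflex Brent describes can be caricatured in a few lines; the sensor names and thresholds below are invented, not any real avionics interface, and the sketch captures only the behavioural loop, which is exactly where the 'whether they feel it' question begins:

    # Toy "protective reflex" loop in the spirit of the aircraft example.
    # All names and thresholds are made up for illustration; this makes no
    # claim about felt pain, only about damage-limiting behaviour.
    STRESS_LIMIT = 0.8   # normalised structural-load threshold
    FIRE_LIMIT = 150.0   # engine-bay temperature threshold, degrees C

    def protective_response(structural_load: float, engine_temp: float) -> list:
        """Map 'noxious' sensor readings to damage-limiting actions."""
        actions = []
        if structural_load > STRESS_LIMIT:
            actions.append("soften control-surface gains to reduce airframe stress")
        if engine_temp > FIRE_LIMIT:
            actions.append("shut down engine")
            actions.append("discharge fire extinguisher")
        return actions

    print(protective_response(structural_load=0.93, engine_temp=40.0))
    # ['soften control-surface gains to reduce airframe stress']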

RE: computer pain

2006-12-13 Thread Stathis Papaioannou
st@googlegroups.com > Subject: Re: computer pain > Stathis, > The reason for lack of responses is that your idea goes directly to illuminating why AI systems - as promulgated under current designs of software running in hardware matrices - CANNOT emulate li

Re: computer pain

2006-12-13 Thread Brent Meeker
James N Rose wrote: > Stathis, > The reason for lack of responses is that your idea goes directly to illuminating why AI systems - as promulgated under current designs of software running in hardware matrices - CANNOT emulate living systems. It's an issue that AI advocates intuitively

Re: computer pain

2006-12-12 Thread James N Rose
Stathis, The reason for lack of responses is that your idea goes directly to illuminating why AI systems - as promulgated under current designs of software running in hardware matrices - CANNOT emulate living systems. It's an issue that AI advocates intuitively and scrupulously AVOID. "Pain" in

RE: computer pain

2006-12-12 Thread Stathis Papaioannou
No responses yet to this question. It seems to me a straightforward consequence of computationalism that we should be able to write a program which, when run, will experience pain, and I suspect that this would be a substantially simpler program than one demonstrating general intelligence. It
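
To make the question concrete, the kind of program usually imagined here is something like the toy below: a scalar 'nociceptive' signal that the agent is wired to reduce. It implements only the functional role of pain; whether running it would involve any experience at all is precisely what the thread goes on to dispute.

    # A deliberately trivial "pain-avoiding" agent, to make the question
    # concrete. It implements only the functional role of pain (a signal the
    # system works to reduce); whether anything is experienced is the open issue.
    import random

    def noxious_level(position: float) -> float:
        """Toy 'tissue damage' signal: grows as the agent nears the hazard at 0."""
        return max(0.0, 1.0 - abs(position))

    def run(steps: int = 20) -> None:
        position = 0.1  # start close to the hazard
        for _ in range(steps):
            pain = noxious_level(position)
            if pain > 0.0:
                # withdrawal reflex: move away from whatever raises the signal
                position += 0.2 if position >= 0 else -0.2
            else:
                position += random.uniform(-0.1, 0.1)  # idle wandering
        print(f"final position {position:.2f}, pain {noxious_level(position):.2f}")

    run()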
