YKY> On 6/19/07, Eric Baum <[EMAIL PROTECTED]> wrote:
>> The modern feature is that whole peoples have chosen to reproduce
>> at half replacement level. In case you haven't thought about the
>> implications of that, that means their genes, for example, are
>> vanishing from the pool by a factor of
On 6/19/07, Eric Baum <[EMAIL PROTECTED]> wrote:
The modern feature is that whole peoples have chosen to reproduce at
half replacement level. In case you haven't thought about the
implications of that, that means their genes, for example, are
vanishing from the pool by a factor of 2 every 20 years
Charles> N.B.: People have practiced birth control as far back as we
Charles> have information. Look into the story of Oedipus Rex. Study
Charles> the histories of the Polynesians. The only modern feature is
Charles> that we are now allowing the practice to occur before the
Charles> investment
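Spelling out the arithmetic in the factor-of-2 claim above: "half replacement
level" means each generation leaves roughly half the offspring needed to
replace itself, so the share of such genes in the pool halves once per
generation. Taking the roughly 20-year generation time used above (Eric's
figure, not an independent demographic estimate), the fraction remaining
after t years is about

    f(t) ≈ f(0) * 2^(-t/20)

i.e. a 32-fold reduction after a century.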
Eric Baum wrote:
...
Evolution does not produce optimal programs, only very good ones.
Also, on most complex problems the optimal solution will not do what
might be thought the optimal thing on every instance. A simple example
is the max flow problem, in which the optimal f
Eric Baum wrote:
>> ... I claim that it is the very fact that you are making decisions
>> about whether to suppress pain for higher goals that is the reason
>> you are conscious of pain. Your consciousness is the computation of
>> a top-level decision making module (or perhaps system). If you were
On 6/18/07, Charles D Hixson <[EMAIL PROTECTED]> wrote:
Consider a terminal cancer patient.
It's not the actual weighing that causes consciousness of pain, it's the
implementation which normally allows such weighing. This, in my
opinion, *is* a design flaw. Your original statement is a more use
Eric Baum wrote:
Josh> On Saturday 16 June 2007 07:20:27 pm Matt Mahoney wrote:
--- Bo Morgan <[EMAIL PROTECTED]> wrote:
...
...
I claim that it is the very fact that you are making decisions about
whether to suppress pain for higher goals that is the reason you are
conscious of pain. Your co
Eric: The difference between nondeterministic computation and deterministic
computation is a source of random numbers... Certainly, for modelling
purposes, it may well be fruitful to think
about the mind as running a non-deterministic program. I'm all in
favor of that. Definitely, when building
On Sunday 17 June 2007 07:53:38 am Eric Baum wrote:
> I claim that it is the very fact that you are making decisions about
> whether to suppress pain for higher goals that is the reason you are
> conscious of pain. Your consciousness is the computation of a
> top-level decision making module (or pe
The difference between nondeterministic computation and deterministic
computation is a source of random numbers. It's a deep question in CS theory
whether this makes any difference-- or whether you can simulate a
nondeterministic computation using a pseudorandom number
generator. The difference is
I would claim that the specific nature of any quale, such as the
various nuanced pain sensations, depends on (in fact, is the same thing
as) the code being run / computation being performed when the quale
is perceived. I therefore don't
find it at all surprising that insects perceive pain differently,
Eric: I claim that it is the very fact that you are making decisions about
whether to suppress pain for higher goals that is the reason you are
conscious of pain. Your consciousness is the computation of a
top-level decision making module (or perhaps system). If you were not
making decisions weighing
Josh> On Saturday 16 June 2007 07:20:27 pm Matt Mahoney wrote:
>> --- Bo Morgan <[EMAIL PROTECTED]> wrote:
>>
>> >
>> > I haven't kept up with this thread. But I wanted to counter the idea
>> > of a simple ordering of painfulness.
Josh>
>> Can you give me an example?
>>
Josh> Anyone who
On Saturday 16 June 2007 07:20:27 pm Matt Mahoney wrote:
>
> --- Bo Morgan <[EMAIL PROTECTED]> wrote:
>
> >
> > I haven't kept up with this thread. But I wanted to counter the idea of a
> > simple ordering of painfulness.
>
> Can you give me an example?
>
Anyone who has played a compet
--- Bo Morgan <[EMAIL PROTECTED]> wrote:
>
> I haven't kept up with this thread. But I wanted to counter the idea of a
> simple ordering of painfulness.
>
> A simple ordering of painfulness is one way to think about pain that might
> work in some simple systems, where resources are allocated
I haven't kept up with this thread. But I wanted to counter the idea of a
simple ordering of painfulness.
A simple ordering of painfulness is one way to think about pain that might
work in some simple systems, where resources are allocated in a serial
fashion, but may not work in systems wher
--- Jiri Jelinek <[EMAIL PROTECTED]> wrote:
> Eric,
>
> I'm not 100% sure if someone/something other than me feels pain, but
> considerable similarities between my and other humans'
>
> - architecture
> - [triggers of] internal and external pain related responses
> - independent descriptions of s
Eric,
I'm not 100% sure if someone/something other than me feels pain, but
considerable similarities between my and other humans'
- architecture
- [triggers of] internal and external pain related responses
- independent descriptions of subjective pain perceptions which
correspond in certain ways w
Jiri,
you are blind when it comes to my pain too.
In fact, you are blind when it comes to many sensations within your own
brain. Cut your corpus callosum, and the other half will have sensations
that you are blind to. Do you think they are not there now, before you
cut it?
>> If you use your br
If you use your brain as the read-write head in
a Turing machine in a "chinese room", "you" won't understand what's
going on, although understanding may very well take place. (cf chapter
3 of WIT?). Similarly, if you use your brain as the r-w head in a
Turing machine to run a program that feels pa
Jiri> Eric,
>> > Right. IMO roughly the same problem when processed by a
>> > computer..
>>
>> Why should you expect running a pain program on a computer to make
>> you feel pain any more than when I feel pain?
Jiri> I don't. The thought was: If we don't feel pain when processing
Jiri> software
Eric,
> Right. IMO roughly the same problem when processed by a
> computer..
Why should you expect running a pain program on a computer to make you
feel pain any more than when I feel pain?
I don't. The thought was: If we don't feel pain when processing
software in our pain-enabled minds, why
--- Lukasz Stafiniak <[EMAIL PROTECTED]> wrote:
> http://www.goertzel.org/books/spirit/uni3.htm --> VIRTUAL ETHICS
The book chapter describes the need for ethics and cooperation in virtual
worlds, but does not address the question of whether machines can feel pain.
If you feel pain, you will ins
On 6/14/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
I don't believe this addresses the issue of machine pain. Ethics is a complex
function which evolves to increase the reproductive success of a society, for
example, by banning sexual practices that don't lead to reproduction. Ethics
also evol
--- Lukasz Stafiniak <[EMAIL PROTECTED]> wrote:
> On 6/14/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> >
> > I would avoid deleting all the files on my hard disk, but it has nothing
> to do
> > with pain or empathy.
> >
> > Let us separate the questions of pain and ethics. There are two
> indepe
On Thursday 14 June 2007 11:59:47 am Jiri Jelinek wrote:
> Well, if that's the case, shouldn't game makers stop making realistic
> computer games where human characters get hurt? ;-)
I very confidently expect that when (in the not too distant future) the NPCs
evoke pain as well as, say, a video of
Jiri> Eric,
>> Running similar code at a conscious level won't generate your
^^
The key word here was "your".
Jiri> sensation of pain because it's not called by the right routines
Jiri> and returning th
I was just playing with some thoughts on
potential security implications associated with the speculation of
qualia being produced as a side-effect of certain algorithmic
complexity on VNA.
Which is, in many ways, pretty similar to my assumption that consciousness
will be produced as a side-effe
On Thursday 14 June 2007 07:19:18 am Mark Waser wrote:
> Oh. You're stuck on qualia (and zombies). I haven't seen a good
> compact argument to convince you (and e-mail is too low band-width and
> non-interactive to do one of the longer ones). My apologies.
The best one-liner I know is, "P
James,
> determine for some reason that the physical is truly missing something
Look at twin particles = just another example of something missing in
the world as we can see it.
> Is it good enough to act and think and reason as if you have
> experienced the feeling.
For AGI - yes. Why not (?).
Eric,
> Running similar code at a conscious level won't generate your
> sensation of pain because it's not called by the right routines and
> returning the right format results to the right calling instructions
> in your homunculus program.
Right. IMO roughly the same problem when processed by a compu
Eric,
> zombie that makes the same decisions as a human would be evaluating
> similar code and would thus essentially have the same pain.
Well, if that's the case, shouldn't game makers stop making realistic
computer games where human characters get hurt? ;-)
Regards,
Jiri
Mark,
> Oh. You're stuck on qualia (and zombies)
Sort of, but not really. There is no need for qualia in order to
develop powerful AGI. I was just playing with some thoughts on
potential security implications associated with the speculation of
qualia being produced as a side-effect of certain a
James> Do you know those 10-15 mentioned hard items? I agree with
James> your following thoughts on the matter.
Actually, I saw a posting where you had the same (or at least a very
similar) quote from Jackson, pain, itchiness, startling at loud
noises, smelling rose, etc.
Do you know those 10-15 mentioned hard items?
I agree with your following thoughts on the matter.
We have to separate the mystical or spiritual from the physical, or determine
for some reason that the physical is truly missing something, that there is
something more than that is required for li
Jiri> Matt,
>> Here is a program that feels pain.
Jiri> I got the logic, but no pain when processing the code in my
Jiri> mind.
This is Frank Jackson's "Mary" fallacy, which I also debunk in WIT? Ch
14.
Running similar code at a conscious level won't generate your
sensation of pain because its
Jiri> James, Frank Jackson (in "Epiphenomenal Qualia") defined qualia
Jiri> as "...certain features of the bodily sensations especially, but
Jiri> also of certain perceptual experiences, which no amount of
Jiri> purely physical information includes.. :-)
One of the biggest problems with the philo
Jiri> Mark,
>> VNA..can simulate *any* substrate.
Jiri> I don't see any good reason for assuming that it would be
Jiri> anything more than a zombie.
Jiri> http://plato.stanford.edu/entries/zombies/
Zombie is another concept which seems to make perfect intuitive sense,
but IMO is not actually wel
- Original Message -
From: <[EMAIL PROTECTED]>
To:
Sent: Wednesday, June 13, 2007 6:26 PM
Subject: Re: [agi] Pure reason is a disease.
Mark,
VNA..can simulate *any* substrate.
I don't see any good reason for assuming that it would be anything
more than a zombie.
http://plato.stanford.edu/entries/zombies/
unless
On 6/14/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
I would avoid deleting all the files on my hard disk, but it has nothing to do
with pain or empathy.
Let us separate the questions of pain and ethics. There are two independent
questions.
1. What mental or computational states correspond to
--- Lukasz Stafiniak <[EMAIL PROTECTED]> wrote:
> On 6/13/07, Lukasz Stafiniak <[EMAIL PROTECTED]> wrote:
> > On 6/13/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > >
> > > If yes, then how do you define pain in a machine?
> > >
> > A pain in a machine is the state in the machine that a person
>
Mark,
VNA..can simulate *any* substrate.
I don't see any good reason for assuming that it would be anything
more than a zombie.
http://plato.stanford.edu/entries/zombies/
unless you believe that there is some other magic involved
I would not call it magic, but we might have to look beyond
On 6/13/07, Lukasz Stafiniak <[EMAIL PROTECTED]> wrote:
On 6/13/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
>
> If yes, then how do you define pain in a machine?
>
A pain in a machine is the state in the machine that a person
empathizing with the machine would avoid putting the machine into,
othe
On 6/13/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
If yes, then how do you define pain in a machine?
A pain in a machine is the state in the machine that a person
empathizing with the machine would avoid putting the machine into,
other things being equal (that is, when there is no higher goal
--- James Ratcliff <[EMAIL PROTECTED]> wrote:
> Which compiler did you use for Human OS V1.0?
> Didn't realize we had a CPP compiler out already
The purpose of my little pain-feeling program is to point out some of the
difficulties in applying ethics-for-humans to machines. The program h
Which compiler did you use for Human OS V1.0?
Didn't realize we had a CPP compiler out already
Jiri Jelinek <[EMAIL PROTECTED]> wrote: Matt,
>Here is a program that feels pain.
I got the logic, but no pain when processing the code in my mind.
Maybe you should mention in the pain.cpp desc
Matt,
> Here is a program that feels pain.
I got the logic, but no pain when processing the code in my mind.
Maybe you should mention in the pain.cpp description that it needs to
be processed for long enough - so whatever is gonna process it, it
will eventually get to the 'I don't "feel" like do
feels pain, it is the entity
that the VNA is simulating that is feeling the pain.
Mark
- Original Message -
From: "Jiri Jelinek" <[EMAIL PROTECTED]>
To:
Sent: Monday, June 11, 2007 2:50 AM
Subject: Re: [agi] Pure reason is a disease.
Mark,
Again, simulation
Sure, until we give an AGI rights :}
Quote: I stand here today and will not abide the abusing of AGI rights!
Derek Zahn <[EMAIL PROTECTED]> wrote: Matt Mahoney writes:
> Below is a program that can feel pain. It is
Yeah, I looked a bit on the wiki about "qualia" but was unable to find
anything concrete enough to comment on; it seems to be some magical fluffery.
"bodily sensations" = input from touch stimuli
"perceptual experiences" = input information (data)
both of these we have and can process...
the las
And here's the human pseudocode:
1. Hold Knife above flame until red.
2. Place knife on arm.
3. a. Accept Pain sensation
b. Scream or respond as necessary
4. Press knife harder into skin.
5. Goto 3, until 6.
6. Pass out from pain
Matt Mahoney <[EMAIL PROTECTED]> wrote: Below is a program t
--- Derek Zahn <[EMAIL PROTECTED]> wrote:
> Matt Mahoney writes:
> > Below is a program that can feel pain. It is a simulation of a programmable
> > 2-input logic gate that you train using reinforcement conditioning.
> Is it ethical to compile and run this program?
Well, that is a good question.
On Monday 11 June 2007 03:22:04 pm Matt Mahoney wrote:
> /* pain.cpp - A program that can feel pleasure and pain.
> ...
Ouch! :-)
Josh
Here is a program that feels pain. It is a simulation of a 2-input logic gate
that you train by reinforcement learning. It "feels" in the sense that it
adjusts its behavior to avoid negative reinforcement from the user.
/* pain.cpp - A program that can feel pleasure and pain.
The program simul
Matt Mahoney writes:
> Below is a program that can feel pain. It is a simulation of a programmable
> 2-input logic gate that you train using reinforcement conditioning.
Is it ethical to compile and run this program?
Below is a program that can feel pain. It is a simulation of a programmable
2-input logic gate that you train using reinforcement conditioning.
/* pain.cpp
This program simulates a programmable 2-input logic gate.
You train it by reinforcement conditioning. You provide a pair of
input bits (0
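The pain.cpp listing is cut off in the archive. Purely as a sketch of the
mechanism described in the post, a programmable 2-input gate whose output
probabilities are nudged by reward and punishment, something like the
following would behave as advertised. The learning rule, the learning rate,
the I/O format, and every identifier below are assumptions, not Matt's
original code.

// Sketch of the described mechanism (not the original pain.cpp).
// A programmable 2-input logic gate trained by reinforcement:
// it adjusts its output probabilities to avoid negative reinforcement.
#include <cstdlib>
#include <iostream>

int main() {
    double p[2][2] = {{0.5, 0.5}, {0.5, 0.5}};  // P(output = 1 | inputs a, b)
    const double rate = 0.2;                    // assumed learning rate
    std::srand(1);

    int a, b;
    std::cout << "enter two input bits (0/1), then reinforcement (+1/-1)\n";
    while (std::cin >> a >> b && (a == 0 || a == 1) && (b == 0 || b == 1)) {
        int out = (std::rand() < p[a][b] * RAND_MAX) ? 1 : 0;
        std::cout << "output: " << out << "  reinforcement? ";
        int r;
        if (!(std::cin >> r)) break;
        // Punishment pushes the gate away from the output it just gave;
        // reward pulls it further toward that output.
        double target = (r > 0) ? out : 1 - out;
        p[a][b] += rate * (target - p[a][b]);
    }
    return 0;
}

The program "feels" only in the thin sense quoted elsewhere in the thread:
it adjusts its behavior to avoid negative reinforcement from the user.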
James,
Frank Jackson (in "Epiphenomenal Qualia") defined qualia as
"...certain features of the bodily sensations especially, but also of
certain perceptual experiences, which no amount of purely physical
information includes.. :-)
If it walks like a human, talks like a human, then for all those
Two different responses to this type of argument.
Once you "simulate" something to the point that we can't tell the difference
in any way, then it IS that something for almost all intents and purposes as
far as the tests you have go.
If it walks like a human, talks like a human, then for
Mark,
Again, simulation - sure, why not. On a VNA (von Neumann architecture) - I
don't think so - IMO not advanced enough to support qualia. Yes, I do
believe qualia exists (= I do not agree with all Dennett's views, but
I think his views are important to consider.) I wrote tons of pro
software (usin
> For feelings - like pain - there is a problem. But I don't feel like
> spending much time explaining it little by little through many emails.
> There are books and articles on this topic.
Indeed there are and they are entirely unconvincing. Anyone who writes
something can get it published.
I
Mark,
Could you specify some of those good reasons (i.e. why a sufficiently
large/fast enough von Neumann architecture isn't sufficient substrate
for a sufficiently complex mind to be conscious and feel -- or, at
least, to believe itself to be conscious and believe itself to feel
For being [/b
Yep. It's clear that modelling others in a social context was at least one of
the strong evolutionary drivers to human-level cognition. Reciprocal altruism
(in, e.g. bats) is strongly correlated with increased brain size (compared to
similar animals without it, e.g. other bats).
It's clearly to
On Jun 5, 2007, at 9:17 AM, J Storrs Hall, PhD wrote:
On Tuesday 05 June 2007 10:51:54 am Mark Waser wrote:
It's my belief/contention that a sufficiently complex mind will be
conscious
and feel -- regardless of substrate.
Sounds like Mike the computer in Moon is a Harsh Mistress
(Heinlei
On 6/3/07, Jiri Jelinek <[EMAIL PROTECTED]> wrote:
>Further, prove that pain (or more preferably sensation in general) isn't an
emergent property of sufficient complexity.
Talking about the von Neumann architecture - I don't see how increases
in the complexity of rules used for switching Boolean val
Mark Waser writes:
> I think that morality (aka Friendliness) is directly on-topic for *any* AGI
> initiative; however, it's actually even more apropos for the approach that
> I'm taking.
> A very important part of what I'm proposing is attempting to deal with the
> fact that no two humans ag
On 6/5/07, Mark Waser <[EMAIL PROTECTED]> wrote:
> Decisions are seen as increasingly moral to the extent that they enact
> principles assessed as promoting an increasing context of increasingly
> coherent values over increasing scope of consequences.
Or another question . . . . if I'm analyzi
> Decisions are seen as increasingly moral to the extent that they enact
> principles assessed as promoting an increasing context of increasingly
> coherent values over increasing scope of consequences.
Or another question . . . . if I'm analyzing an action based upon the criteria
specified above
>> Just a gentle suggestion: If you're planning to unveil a major AGI
>> initiative next month, focus on that at the moment.
I think that morality (aka Friendliness) is directly on-topic for *any* AGI
initiative; however, it's actually even more apropos for the approach that I'm
taking.
>> As
Mark Waser writes:
> BTW, with this definition of morality, I would argue that it is a very rare
> human that makes moral decisions any appreciable percent of the time
Just a gentle suggestion: If you're planning to unveil a major AGI initiative
next month, focus on that at the moment. T
those that do have ingrained it as reflex -- so do those reflexes count as
moral decisions? Or are they not moral since they're not conscious decisions
at the time of choice?:-).
Mark
- Original Message -
From: "Jef Allbright" <[EMAIL PROTECTED]>
To:
On 6/5/07, Mark Waser <[EMAIL PROTECTED]> wrote:
> I would not claim that agency requires consciousness; it is necessary
> only that an agent acts on its environment so as to minimize the
> difference between the external environment and its internal model of
> the preferred environment
OK.
> M
I would not claim that agency requires consciousness; it is necessary
only that an agent acts on its environment so as to minimize the
difference between the external environment and its internal model of
the preferred environment
OK.
Moral agency, however, requires both agency and self-awaren
On 6/5/07, Mark Waser <[EMAIL PROTECTED]> wrote:
> I do think its a misuse of "agency" to ascribe moral agency to what is
> effectively only a tool. Even a human, operating under duress, i.e.
> as a tool for another, should be considered as having diminished or no
> moral agency, in my opinion
> I do think its a misuse of "agency" to ascribe moral agency to what is
> effectively only a tool. Even a human, operating under duress, i.e.
> as a tool for another, should be considered as having diminished or no
> moral agency, in my opinion.
So, effectively, it sounds like agency requires bo
On 6/5/07, Mark Waser <[EMAIL PROTECTED]> wrote:
> Isn't it indisputable that agency is necessarily on behalf of some
> perceived entity (a self) and that assessment of the "morality" of any
> decision is always only relative to a subjective model of "rightness"?
I'm not sure that I should dive
Isn't it indisputable that agency is necessarily on behalf of some
perceived entity (a self) and that assessment of the "morality" of any
decision is always only relative to a subjective model of "rightness"?
I'm not sure that I should dive into this but I'm not the brightest
sometimes . . . .
On 6/5/07, Mark Waser <[EMAIL PROTECTED]> wrote:
> I think a system can get arbitrarily complex without being conscious --
> consciousness is a specific kind of model-based, summarizing,
> self-monitoring
> architecture.
Yes. That is a good clarification of what I meant rather than what I said
I think a system can get arbitrarily complex without being conscious --
consciousness is a specific kind of model-based, summarizing,
self-monitoring
architecture.
Yes. That is a good clarification of what I meant rather than what I said.
That said, I think consciousness is necessary
but no
On Tuesday 05 June 2007 10:51:54 am Mark Waser wrote:
> It's my belief/contention that a sufficiently complex mind will be conscious
> and feel -- regardless of substrate.
Sounds like Mike the computer in Moon is a Harsh Mistress (Heinlein). Note,
btw, that Mike could be programmed in Loglan (pr
To get any further with "feelings" you again have to have a better definition
and examples of what you are dealing with.
In humans, most "feelings" and emotions are brought about by chemical changes
in the body yes? Then from there it becomes "knowledge" in the brain, which we
use to make deci
Your brain can be simulated on a large/fast enough von Neumann
architecture.
From the behavioral perspective (which is good enough for AGI) - yes,
but that's not the whole story when it comes to the human brain. In our
brains, information not only "is" and "moves" but also "feels".
It's my belief/c
Hi Mark,
Your brain can be simulated on a large/fast enough von Neumann architecture.
From the behavioral perspective (which is good enough for AGI) - yes,
but that's not the whole story when it comes to the human brain. In our
brains, information not only "is" and "moves" but also "feels". From
What component do you have that can't exist in
a von Neumann architecture?
Brain :)
Your brain can be simulated on a large/fast enough von Neumann architecture.
Agreed, your PC cannot feel pain. Are you sure, however, that an entity
hosted/simulated on your PC doesn't/can't?
If the hardw
Mark,
I agree that one cannot guarantee that his AGI source code + some
potentially dangerous data are not gonna end up in the wrong hands (if
that's what you're getting at). But when that happens, how exactly are
your security controls gonna help? I mean your built-in "layered
defense strategy" / mora
I think it is a serious mistake for anyone to say that machines cannot
in principle experience real feelings.
We are complex machines, so yes, machines can, but my PC cannot, even
though it can power AGI.
Agreed, your PC cannot feel pain. Are you sure, however, that an
nsation before you get complex enough to be generally
intelligent.
Mark
- Original Message -
From: "Jiri Jelinek" <[EMAIL PROTECTED]>
To:
Sent: Saturday, May 26, 2007 4:20 AM
Subject: Re: [agi] Pure reason is a disease.
Mark,
If Google came along and offered you
Richard,
I think it is a serious mistake for anyone to say that machines cannot
in principle experience real feelings.
We are complex machines, so yes, machines can, but my PC cannot, even
though it can power AGI.
Regards,
Jiri
Mark,
If Google came along and offered you $10 million for your AGI, would you
give it to them?
No, I would sell services.
How about the Russian mob for $1M and your life and the
lives of your family?
How about FBI? No? So maybe selling him a messed up version for $2M
and then hiring a ski
You possibly already know this and are simplifying for the sake of
simplicity, but chemicals are not simply global environmental
settings.
Chemicals/hormones/peptides etc. are spatial concentration gradients
across the entire brain, which are much more difficult to emulate in
software than a sing
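To make the contrast concrete, here is a deliberately crude C++ sketch: a
single global "serotonin level" versus a spatial concentration field that has
to be stored per region and diffused over time. The 8x8 grid, the diffusion
constant, and the update rule are invented for illustration; nothing in the
thread specifies them.

#include <array>
#include <iostream>

// Crude illustration: a neuromodulator as one global number versus as a
// per-region concentration that spreads spatially.
constexpr int N = 8;                                  // assumed 8x8 grid
using Field = std::array<std::array<double, N>, N>;

// One diffusion step: each region relaxes toward the mean of its neighbours.
void diffuse(Field& c, double k) {
    Field next = c;
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j) {
            double sum = 0.0; int n = 0;
            if (i > 0)     { sum += c[i - 1][j]; ++n; }
            if (i < N - 1) { sum += c[i + 1][j]; ++n; }
            if (j > 0)     { sum += c[i][j - 1]; ++n; }
            if (j < N - 1) { sum += c[i][j + 1]; ++n; }
            next[i][j] = c[i][j] + k * (sum / n - c[i][j]);
        }
    c = next;
}

int main() {
    double global_level = 1.0;          // "global environmental setting" view
    std::cout << "global view: serotonin = " << global_level
              << " everywhere\n";

    Field c{};                          // gradient view: per-region state
    c[0][0] = 1.0;                      // release at one site
    for (int t = 0; t < 50; ++t) diffuse(c, 0.2);
    std::cout << "gradient view: near release site " << c[1][1]
              << ", far corner " << c[N - 1][N - 1] << "\n";
}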
On 5/25/07, Mark Waser <[EMAIL PROTECTED]> wrote:
> Sophisticated logical
> structures (at least in our bodies) are not enough for actual
> feelings. For example, to feel pleasure, you also need things like
> serotonin, acetylcholine, noradrenaline, glutamate, enkephalins and
> endorphins. World
Jiri> Note that some people suffer from rare
Jiri> disorders that prevent them from the sensation of pain
Jiri> (e.g. congenital insensitivity to pain).
What that tells you is that the sensation you feel is genetically
programmed. Break the program, you break (or change) the sensation.
Run the
Note that some people suffer from rare disorders that prevent them
from the sensation of pain (e.g. congenital insensitivity to pain).
the pain info doesn't even make it to the brain because of
malfunctioning nerve cells which are responsible for transmitting the
pain signals (caused by genetic
Josh> I think that people have this notion that because emotions are
Josh> so unignorable and compelling subjectively, they must be
Josh> complex. In fact the body's contribution, in an information
Josh> theoretic sense, is tiny -- I'm sure I way overestimate it with
Josh> the 1%.
Emotions
Mark,
I cannot hit everything now, so at least one part:
Are you *absolutely positive* that "real pain and real
feelings" aren't an emergent phenomenon of sufficiently complicated and
complex feedback loops? Are you *really sure* that a sufficiently
sophisticated AGI won't experience pain?
E
On Wednesday 23 May 2007 06:34:29 pm Mike Tintner wrote:
> My underlying argument, though, is that your (or any) computational model
> of emotions, if it does not also include a body, will be fundamentally
> flawed both physically AND computationally.
Does everyone here know what an ICE is in
Eric,
The point is simply that you can only fully simulate emotions with a body as
well as a brain. And emotions while identified by the conscious brain are
felt with the body
I don't find it at all hard to understand - I fully agree - that emotions
are generated as a result of computations
P.S. Eric, I haven't forgotten your question to me, & will try to address it
in time - the answer is complex.
Mike> Eric Baum: What is Thought [claims that] feelings are
Mike> explainable by a computational model.
Mike> Feelings/ emotions are generated by the brain's computations,
Mike> certainly. But they are physical/ body events. Does your Turing
Mike> machine have a body other than that of some kind
On May 23, 2007, at 3:02 PM, Mike Tintner wrote:
Feelings/ emotions are generated by the brain's computations,
certainly. But they are physical/ body events. Does your Turing
machine have a body other than that of some kind of computer box?
And does it want to dance when it hears emotionall
>> AGIs (at least those that could run on current computers) cannot
>> really get excited about anything. It's like when you represent the
>> pain intensity with a number. No matter how high the number goes,
>> it doesn't really hurt. Real feelings - that's the key difference
>> between us and the