On 2/9/2013 4:39 PM, Craig Weinberg wrote:


On Saturday, February 9, 2013 6:52:46 PM UTC-5, Brent wrote:

    On 2/9/2013 3:39 PM, Craig Weinberg wrote:


    On Saturday, February 9, 2013 6:29:54 PM UTC-5, Brent wrote:

        On 2/9/2013 3:08 PM, Craig Weinberg wrote:
        > Evolution would have no need for generating values, since values are a subjective motivation.

        "Subjective motivation" is just a quantitative value seen from the 
inside.


    Why would quantitative values have an inside though? The only reason that we might presume that is because we are looking at it retrospectively. If you turn it around though, and assume quantitative mechanisms can exist without awareness, then there is no possibility of any interior experience being generated. How and why would such a thing arise?


        > All evolution would have to do is simply impose a script that assigns a high priority to protecting one's own children and one's own life.

        And that's what happened, and that's what you feel as love of life and love of children.


    I understand why that makes sense to you, but you are making that up by taking the undeniable existence of love and drawing a straight line to what you presume, unquestionably, to be the cause. It's an unfalsifiable misconception which begs the question. Let's say you wanted to make a computer program that did not feel anything, but just reproduced and survived. Are you suggesting that is impossible?

    Yes.  Just like a philosophical zombie is impossible because intelligence entails consciousness; goals and purposes (like survival) plus intelligence entail values and emotions.


It's circular reasoning. You are assuming that function is intelligence,

It's how I recognize intelligence - and so do you.

and then projecting your own human goals, purposes and consciousness onto that function. Then, realizing that your own consciousness doesn't make any sense as far as assisting function in any way

I don't 'realize' that - and neither do you. It's just another of your unsupported assumptions.

, so you affirm the consequent by concluding that there can't be a philosophical zombie. In reality, every machine that human beings have ever built is potentially a philosophical zombie; it's entirely up to the beholder, who determines how deeply they subscribe to the pathetic fallacy.


    Are you saying that whenever a sufficiently complex machine is programmed to avoid specific conditions, that avoidance conjures an experience of pain out of nowhere?

    Pain and pleasure.


Can you explain why that would happen and how it could happen? 1+1 = pain?




        > Like any computer program, a quantitative equivalence which is unsentimental and unconscious would always be more effective.

        Unsentimental, maybe.  But not unemotional.  For example, rage is very useful in defense of one's children.


    No it isn't. You are only looking at it retrospectively. The effectiveness of rage is not in the experience of rage, it is in the boost of strength, endurance, aggressive behavior, etc. All of that could be engineered without inventing some kind of ridiculous 'emotional state' as a theatrical presentation.

    That's what you say.  But what do you think is an emotional state except the boost in adrenaline, the focus on the objective, etc?


I think that an emotional state is a sensory-motor experience in which we participate directly. Adrenaline is a substance, it has no emotional qualities. A dead person's body could be filled with adrenaline and there would be no emotion there.

    You are simply imagining the two can be separated because you have different words and viewpoints to describe them.


No, I am observing that there are different words for them because they have absolutely nothing in common except a spatiotemporal correlation.


    Look at it prospectively instead. You are trying to make an effective replicator. Why would you ever need to do anything but optimize its behaviors?

    You wouldn't, but that would entail it having values and emotions.


Values and emotions don't exist yet. That's what I mean by looking at it prospectively. You have to justify the creation of 'values and emotions', but you can't. You can only claim blindness to the obvious difference between a machine acting rapidly and forcefully, and an experience of anger and strength. It may not be your fault. I don't know if I have ever come across someone with the Western orientation who is able to shift their perception. It's a foreground-background shift, which you may not be wired to be able to do, in which case I apologize for expecting you to be able to do that.

And I apologize for expecting you to be able to imagine that implementing intelligence would entail value and emotion.

Brent
