Matt: But perhaps you
find the idea of reducing the human mind to computation offensive?

Yes, I do. There are the human implications I started to point out. But far more important here are the mechanistic ones - it is indeed a massive reduction of a complex systemic operation where emotions are concerned. These AI simulations of emotion leave out so much (such as self, body and actual emotions - but hey, who cares about *those*?), and the bits they do copy - the valuation of behaviour conducted by the brain - they get fundamentally wrong. You have emotions about things that you CAN'T value other than extremely crudely and analogically, and that are strictly non-comparable. That's what the whole system is designed for. What are the mathematical values expressed when you have conflicting emotions about whether to masturbate or do your work? Masturbate = how many units of utility? Work = how many units?

How much do you like marshmallow ice cream, and how much crème caramel? Well, about this much and about that much (can you see how wide my arms are stretched each time?). Well, maybe it's about this much and that much (changing my arm width again).
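To make the target of this criticism concrete: the scalar-utility reduction in question amounts to scoring every activity on one comparable number line and picking the maximum. A minimal sketch (all names and numbers are invented for illustration - exactly the kind of single-axis scoring the argument says emotions resist):

```python
# The reductive picture being criticized: strictly non-comparable
# activities forced onto one shared utility axis.
# Values are arbitrary illustrative assumptions, not measurements.
activities = {
    "work": 7.0,           # "Work = how many units?"
    "eat_ice_cream": 6.5,  # passive consumption, scored on the same axis
    "exercise": 6.8,       # active production, scored on the same axis
}

def choose(utilities):
    """Pick the activity with the highest scalar utility."""
    return max(utilities, key=utilities.get)

print(choose(activities))  # -> work
```

The sketch decides instantly and unambiguously - which is precisely the author's complaint: real emotional valuation is crude, analogical and conflicted, not a tidy argmax over commensurable numbers.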

As always, AI gets the development and evolution of mind completely the wrong way round. The emotional system we and animals have is primary. It's a crude, non-numerical value system, among other things, which works very well, considering. Putting numbers to it is an evolutionarily belated afterthought - an occasional help, but also a hindrance.

Anyway, you don't seem to be contesting the point - what you & other AI-ers are doing is adding a value system to your computers, not an emotional system.

P.S. Emotions are designed to evaluate a complex psychoeconomy of activities - the many, very different and strictly non-comparable activities that every animal and human being engages in - work, hunting, foraging, eating, sex, grooming, cuddling, sightseeing etc. To some extent you *can* compare emotions to a currency, and to some extent you *can't* - because what are often at stake are two fundamentally different kinds, or vast categories, of emotion about two fundamentally different kinds of activity. The positive emotions you get from activities like reading mags, watching TV, eating ice cream etc. are of a fundamentally different kind to those you get from work, exercise, thinking about AI and other active activities - because the activities are fundamentally, physically different - passive consumption vs active production. And just to complicate matters, the emotions you get from sex are a mixture of both kinds. (A lot of this is to do with our divided/conflicted autonomic nervous system - the basis of emotions.) Got all that in your AGI?

P.P.S. Ben: As far as I can see, these comments apply to your AGI approach to emotions re your post - you too seem to be talking about a pure value system in practice rather than a true emotional system - but I may have misread.

-----
This list is sponsored by AGIRI: http://www.agiri.org/email