Re: [agi] The Function of Emotions is Torture

2007-12-12 Thread Mike Tintner
Let me extend half an apology if I overstepped. I say "half" because my criticism was not so much personal as directed against a whole approach - of many AI-ers generally - which really is extraordinarily callow. There is a simply vast amount of human misery and suffering, which centres on people…

Re[2]: [agi] CyberLover passing Turing Test

2007-12-12 Thread Dennis Gorelik
Bryan, >> To my taste, testing with clueless judges is a more appropriate >> approach. It makes the test less biased. > How can they judge when they don't know what they are judging? Surely, > when they hang out for some cyberlovin', they are not scanning for > intelligence. Our mostly in-bred stupidi…

Re: [agi] The Function of Emotions is Torture

2007-12-12 Thread Mike Dougherty
On Dec 12, 2007 9:27 PM, Mike Tintner <[EMAIL PROTECTED]> wrote: > It also shows a very limited understanding of emotions. What do you hope to convey by making comments like this? I often wonder how arrogance and belittling others for their opinions have ever made a positive contribution to a crea…

Re: [agi] The Function of Emotions is Torture

2007-12-12 Thread Benjamin Goertzel
Mike, in case you're curious, I wrote down my theory of emotions here: http://www.goertzel.org/dynapsyc/2004/Emotions.htm (an early version of text that later became a chapter in The Hidden Pattern). Among the conclusions my theory of emotions leads to are, as stated there: * AI systems…

Re: [agi] The Function of Emotions is Torture

2007-12-12 Thread Mike Tintner
I don't think you've answered my point - which perhaps wasn't put well enough. All you propose, as far as I can see, is to apply *values* to behaviour - to apply positive and negative figures to behaviours considered beneficial or detrimental, and thus affect the system's further behaviour - r…
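The scheme Tintner is summarizing - attach positive or negative numeric values to behaviours, adjust them by outcome, and let the values bias what the system does next - can be sketched in a few lines. This is a minimal illustrative sketch, not anyone's actual architecture; the behaviour names, learning rate, and softmax selection rule are all assumptions.

```python
import math
import random

# Illustrative behaviour values; all names here are assumptions.
values = {"explore": 0.0, "rest": 0.0, "signal": 0.0}

def update(values, behaviour, outcome, lr=0.3):
    """Nudge a behaviour's value toward a positive or negative outcome."""
    values[behaviour] += lr * (outcome - values[behaviour])

def choose(values, temperature=1.0):
    """Softmax selection: higher-valued behaviours are picked more often."""
    weights = [(b, math.exp(v / temperature)) for b, v in values.items()]
    r = random.uniform(0, sum(w for _, w in weights))
    for b, w in weights:
        r -= w
        if r <= 0:
            return b
    return weights[-1][0]

update(values, "explore", +1.0)   # beneficial outcome raises the value
update(values, "signal", -1.0)    # detrimental outcome lowers it
print(max(values, key=values.get))  # explore is now the favoured behaviour
```

Whatever one thinks of it as a theory of emotion, note that nothing in this loop "feels" anything: it is exactly the value-weighting Tintner is objecting to.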

Re: [agi] The Function of Emotions is Torture

2007-12-12 Thread Matt Mahoney
--- Mike Tintner <[EMAIL PROTECTED]> wrote: > Matt: I don't believe that the ability to feel pleasure and pain depends on > > consciousness. That is just a circular definition. > > http://en.wikipedia.org/wiki/Philosophical_zombie > > Richard: It is not circular. Consciousness and pleasure/pai…

Re: Distributed search (was RE: Hacker intelligence level [WAS Re: [agi] Funding AGI research])

2007-12-12 Thread Mike Dougherty
On 12/12/07, James Ratcliff <[EMAIL PROTECTED]> wrote: > This would allow a large amount of knowledge to be extracted in a > distributed manner, keeping track of the quality of information gathered > from each person as a trust metric, and many facts would be gathered and > checked for truth. …

Re: Hacker intelligence level [WAS Re: [agi] Funding AGI research]

2007-12-12 Thread Vladimir Nesov
On Dec 13, 2007 12:09 AM, James Ratcliff <[EMAIL PROTECTED]> wrote: > Mainly as a primer ontology / knowledge representation data set for an AGI > to work with. > Having a number of facts known without having to be typed in about many > frames and connections between frames gives an AGI a good…

Re: Re[2]: [agi] CyberLover passing Turing Test

2007-12-12 Thread Matt Mahoney
--- Dennis Gorelik <[EMAIL PROTECTED]> wrote: > Bryan, > > >> If CyberLover works as described, it will qualify as one of the first > >> computer programs ever written that is actually passing the Turing > >> Test. > > > I thought the Turing Test involved fooling/convincing judges, not > > clue…

RE: Distributed search (was RE: Hacker intelligence level [WAS Re: [agi] Funding AGI research])

2007-12-12 Thread James Ratcliff
I had been thinking about something along these lines, though not yet worded as you have in this message. What I would be most interested in at this point is a knowledge-gathering system somewhere along these lines, where the main AGI could be centralized/clustered or distributed, but where que…
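The mechanism Ratcliff describes - distributed contributors, a per-person trust metric, and facts accepted once enough trusted people have asserted them - can be sketched as below. This is a hypothetical sketch of the idea, not anything specified on the list; the class name, threshold, and trust-update constants are all assumptions.

```python
from collections import defaultdict

class FactPool:
    """Trust-weighted fact gathering: each contributor has a trust score,
    and a fact is accepted when its contributors' combined trust clears
    a threshold. All numeric choices here are illustrative assumptions."""

    def __init__(self, accept_threshold=1.5):
        self.accept_threshold = accept_threshold
        self.trust = defaultdict(lambda: 0.5)   # per-contributor trust score
        self.support = defaultdict(float)       # fact -> accumulated trust

    def contribute(self, person, fact):
        """Record that `person` asserted `fact`, weighted by their trust."""
        self.support[fact] += self.trust[person]

    def confirm(self, person, correct):
        """Adjust a contributor's trust after one of their facts is checked."""
        if correct:
            self.trust[person] = min(1.0, self.trust[person] + 0.1)
        else:
            self.trust[person] = max(0.0, self.trust[person] - 0.2)

    def accepted(self):
        """Facts whose combined contributor trust clears the threshold."""
        return {f for f, s in self.support.items() if s >= self.accept_threshold}

pool = FactPool()
pool.contribute("alice", "a kettle is found in a kitchen")
pool.contribute("bob", "a kettle is found in a kitchen")
pool.contribute("carol", "a kettle is found in a kitchen")
pool.contribute("dave", "kettles are alive")
print(pool.accepted())  # only the well-supported fact clears the threshold
```

A centralized or clustered AGI could sit behind the `accepted()` view while contributions arrive from distributed clients, which matches the split described above.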

Re: Hacker intelligence level [WAS Re: [agi] Funding AGI research]

2007-12-12 Thread James Ratcliff
Mainly as a primer ontology / knowledge-representation data set for an AGI to work with. Having a number of facts known, without having to be typed in, about many frames and the connections between frames gives an AGI a good booster to start with. Take a simple set of common words in a house ch…

[agi] The Function of Emotions is Torture

2007-12-12 Thread Mike Tintner
Matt: I don't believe that the ability to feel pleasure and pain depends on consciousness. That is just a circular definition. http://en.wikipedia.org/wiki/Philosophical_zombie Richard: It is not circular. Consciousness and pleasure/pain are both subjective issues. They can be resolved together…

Re: An information theoretic measure of reinforcement (was RE: [agi] AGI and Deity)

2007-12-12 Thread Richard Loosemore
Matt Mahoney wrote: --- Richard Loosemore <[EMAIL PROTECTED]> wrote: I have to say that this is only one interpretation of what it would mean for an AGI to experience something, and I for one believe it has no validity at a…

RE: [agi] AGI and Deity

2007-12-12 Thread James Ratcliff
Whether it conceives of a god learning by itself is really a moot point, as it will be interacting, learning, and living in a human world, so it WILL be exposed to all manner of religions and beliefs... What it makes of "faith" and the thoughts of God at that point will be interesting. Another di…

[agi] Article on current scientific uses of P2P computing

2007-12-12 Thread Ed Porter
The following article is relevant to much of the prior P2P discussion on this list. http://www.economist.com/printedition/displaystory.cfm?story_id=10202635 It is a discussion of many of the ways P2P computing is being used for scientific research. Ed Porter

Re: [agi] CyberLover passing Turing Test

2007-12-12 Thread Bryan Bishop
On Wednesday 12 December 2007, Dennis Gorelik wrote: > To my taste, testing with clueless judges is a more appropriate > approach. It makes the test less biased. How can they judge when they don't know what they are judging? Surely, when they hang out for some cyberlovin', they are not scanning for in…

Re: [agi] CyberLover passing Turing Test

2007-12-12 Thread Vladimir Nesov
By this standard, ELIZA passed the Turing test 40 years ago. On Dec 12, 2007 4:47 AM, Dennis Gorelik <[EMAIL PROTECTED]> wrote: > http://blog.pmarca.com/2007/12/checking-in-on.html > === > If CyberLover works as described, it will qualify as one of the first > computer programs ever written that is act…
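Nesov's point - that canned keyword-triggered replies with no understanding can sustain a conversation with an inattentive or otherwise-motivated judge - is the whole of the ELIZA trick. A minimal ELIZA-style responder looks like this; the rules below are illustrative assumptions, not Weizenbaum's actual script.

```python
import re

# Keyword-pattern rules: first match wins, captured text is echoed back.
# These particular rules are invented for illustration.
RULES = [
    (re.compile(r"\bI need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"\b(mother|father)\b", re.I), "Tell me more about your family."),
]
DEFAULT = "Please, go on."

def respond(utterance):
    """Return a canned reply triggered by the first matching keyword rule."""
    for pattern, template in RULES:
        m = pattern.search(utterance)
        if m:
            return template.format(*m.groups())
    return DEFAULT

print(respond("I need someone to talk to"))  # Why do you need someone to talk to?
print(respond("What do you think?"))         # Please, go on.
```

Nothing here scans for intelligence on either side, which is exactly why "clueless judges" make the test easy to pass.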