Re: [agi] SAT, SMT and AGI

2008-01-21 Thread Vladimir Nesov
On Jan 21, 2008 6:17 AM, Ben Goertzel [EMAIL PROTECTED] wrote: So, people do have a practically useful way of cheating problems in NP now. Problem with AGI is, we don't know how to program it even given computers with infinite computational power. Well, that is wrong IMO: AIXI and the
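For readers who haven't met it, AIXI is Hutter's definition of an optimal but incomputable agent; the usual action-selection formula (quoted from memory, so check Hutter's book for the exact notation) is

    a_k := \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m}
           (r_k + \cdots + r_m)
           \sum_{q \,:\, U(q, a_1 \ldots a_m) = o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}

where U is a universal Turing machine, q ranges over environment programs, and \ell(q) is the length of q. The formula pins down what an AGI should compute given unlimited resources, but the inner sums are incomputable, which is exactly the point under dispute here.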

Re: [agi] SAT, SMT and AGI

2008-01-21 Thread Pei Wang
On Jan 20, 2008 10:17 PM, Ben Goertzel [EMAIL PROTECTED] wrote: So, people do have a practically useful way of cheating problems in NP now. Problem with AGI is, we don't know how to program it even given computers with infinite computational power. Well, that is wrong IMO: AIXI and the

Re: [agi] SAT, SMT and AGI

2008-01-21 Thread Lukasz Kaiser
SMT in particular seems to have deep potential applicability. To add a little background, SMT is under heavy development at Microsoft and there are plans to apply it widely. The basic progress is that the new version of Z3 is an order of magnitude faster than the last one, and is even a
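For readers unfamiliar with SMT, a minimal sketch of what a solver like Z3 does, using the present-day z3-solver Python bindings (which postdate this 2008 thread); the constraints themselves are made up for illustration:

    from z3 import Int, Solver, sat

    x, y = Int('x'), Int('y')
    s = Solver()
    # Arbitrary linear-arithmetic constraints over the integers
    s.add(x > 0, y > 0, x + 2 * y == 7, x < y)
    if s.check() == sat:
        print(s.model())   # e.g. [y = 3, x = 1]
    else:
        print("unsatisfiable")

The "modulo theories" part is what lets the solver reason natively about arithmetic, arrays, bit-vectors and so on, rather than pure Boolean clauses.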

Re: [agi] SAT, SMT and AGI

2008-01-21 Thread Ben Goertzel
As far as I know, little or no work has been done yet to integrate probabilistic reasoning with these solvers, and it will probably not be easy to do so while keeping things efficient. I don't think it will be easy, but what's intriguing is that it seems like it might be feasible-though-difficult
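One concrete, if partial, analogue of mixing probabilistic weight with an SMT solver is MaxSMT: modern Z3 exposes an Optimize interface where soft constraints carry weights that can play the role of negative log priors. This is only a sketch of the flavour of such an integration, not the probabilistic reasoning Ben has in mind, and the weights below are invented:

    from z3 import Optimize, Bool, Not, Or, Implies, sat

    rain, sprinkler, wet = Bool('rain'), Bool('sprinkler'), Bool('wet')
    opt = Optimize()
    # Hard logical knowledge: the grass is wet only if it rained or the sprinkler ran
    opt.add(Implies(wet, Or(rain, sprinkler)))
    # Soft constraints whose weights act like -log priors (illustrative numbers)
    opt.add_soft(Not(rain), 3)        # rain is a priori unlikely
    opt.add_soft(Not(sprinkler), 1)   # sprinkler is somewhat unlikely
    opt.add(wet)                      # observed evidence
    if opt.check() == sat:
        print(opt.model())            # cheapest explanation: sprinkler on, no rain

Keeping this efficient at scale, and making the weights behave like genuine probabilities rather than ad hoc costs, is where the difficulty lies.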

Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-21 Thread Matt Mahoney
--- Samantha Atkins [EMAIL PROTECTED] wrote: In http://www.mattmahoney.net/singularity.html I discuss how a singularity will end the human race, but without judgment whether this is good or bad. Any such judgment is based on emotion. Really? I can think of arguments why this

Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-21 Thread Mark Waser
For example, hunger is an emotion, but the desire for money to buy food is not. Hunger is a sensation, not an emotion. The sensation is unpleasant and you have a hard-coded goal to get rid of it. Further, desires tread pretty close to the line of emotions if not actually crossing over . . . .

Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-21 Thread Richard Loosemore
Matt Mahoney wrote: --- Samantha Atkins [EMAIL PROTECTED] wrote: In http://www.mattmahoney.net/singularity.html I discuss how a singularity will end the human race, but without judgment whether this is good or bad. Any such judgment is based on emotion. Really? I can think of arguments why

Re: [agi] SAT, SMT and AGI

2008-01-21 Thread Pei Wang
If I know you are against X, while X is not one of the s_i, but some general description of it, how can you use the formula? If the knowledge in a data compressor is all at the level of letter string, how can it use the knowledge about the theme of a paper to compress it better? Pei For

[agi] SAT, SMT and AGI

2008-01-21 Thread Jim Bromer
On Jan 20, 2008 2:34 PM, Jim Bromer [EMAIL PROTECTED] wrote: I am disappointed because the question of how a polynomial time solution of logical satisfiability might affect AGI is very important to me. Ben wrote: Well, feel free to start a new thread on that topic, then ;-) In fact, I will do

Re: [agi] SAT, SMT and AGI

2008-01-21 Thread Matt Mahoney
--- Pei Wang [EMAIL PROTECTED] wrote: If I know you are against X, while X is not one of the s_i, but some general description of it, how can you use the formula? If you were compressing the message on topic X I {agree|disagree} and you are predicting bit 2 (after compressing bits 7 through 3
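To make the arithmetic concrete (a toy illustration in Python, not the formula Matt is referring to): an ideal arithmetic coder spends about -log2(p) bits on a symbol it predicted with probability p, so any higher-level knowledge that sharpens p translates directly into fewer bits.

    import math

    def code_length_bits(p):
        """Bits an ideal arithmetic coder spends on a symbol predicted with probability p."""
        return -math.log2(p)

    # If prior knowledge of the writer's stance on topic X says "disagree" with p = 0.9:
    print(code_length_bits(0.9))   # ~0.15 bits when the prediction is right
    print(code_length_bits(0.1))   # ~3.32 bits when the model is surprised

This is why knowing the theme of a paper, or a writer's position on X, can in principle help a compressor even though the model ultimately emits predictions at the bit level.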

Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-21 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt, This usage of emotion is idiosyncratic and causes endless confusion. You're right. I didn't mean for the discussion to devolve into a disagreement over definitions. As for your larger point, I continue to vehemently disagree with your