Re: Singularity Outcomes [WAS Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-27 Thread Vladimir Nesov
On Jan 27, 2008 5:32 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> Software correctness is undecidable -- the halting problem reduces to it.
> Computer security isn't going to be magically solved by AGI. The problem will
> actually get worse, because complex systems are harder to get right.
> C
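
The undecidability claim is the standard reduction from the halting problem.
A minimal sketch in Python, assuming a hypothetical oracle is_correct() and
an illustrative spec string (neither can exist in general; that is the point
of the argument):

def is_correct(program_source: str, spec: str) -> bool:
    """Hypothetical oracle: True iff program_source meets spec."""
    raise NotImplementedError("no such total decider exists")

def halts(q_source: str, x: str) -> bool:
    """Would decide whether the function q defined in q_source halts on x."""
    # Build P: define q, run q(x), then print "ok".  P meets the spec
    # 'eventually prints "ok"' exactly when q halts on x, so deciding
    # P's correctness would decide halting for (q, x).
    p_source = q_source + f'\nq({x!r})\nprint("ok")\n'
    return is_correct(p_source, 'eventually prints "ok"')

Note this only rules out a fully general decider; particular programs can
still be proven correct, which is where the thread's disagreement lives.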

Re: Singularity Outcomes [WAS Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-27 Thread Richard Loosemore
Matt Mahoney wrote:
> If I hired you as a security analyst to find flaws in a piece of software,
> and I didn't tell you what I was going to do with the information, how
> would you know?

This is so silly it is actually getting quite amusing... :-)

So, you are positing a situation in which I am an

Re: Singularity Outcomes [WAS Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-27 Thread Randall Randall
I pulled in some extra context from earlier messages to illustrate an
interesting event, here.

On Jan 27, 2008, at 12:24 PM, Richard Loosemore wrote:
> --- Richard Loosemore <[EMAIL PROTECTED]> wrote:
> Matt Mahoney wrote:
> Suppose you ask the AGI to examine some operating system or server softwar

Re: Singularity Outcomes [WAS Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-27 Thread Matt Mahoney
--- Vladimir Nesov <[EMAIL PROTECTED]> wrote:
> On Jan 27, 2008 5:32 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> >
> > Software correctness is undecidable -- the halting problem reduces to it.
> > Computer security isn't going to be magically solved by AGI. The problem
> > will actually get w

Re: Singularity Outcomes [WAS Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-27 Thread Vladimir Nesov
On Jan 28, 2008 1:15 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> --- Vladimir Nesov <[EMAIL PROTECTED]> wrote:
> >
> > On Jan 27, 2008 5:32 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > >
> > > Software correctness is undecidable -- the halting problem reduces to it.
> > > Computer security isn't

Re: Singularity Outcomes [WAS Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-27 Thread William Pearson
On 27/01/2008, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> --- Vladimir Nesov <[EMAIL PROTECTED]> wrote:
> >
> > On Jan 27, 2008 5:32 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > >
> > > Software correctness is undecidable -- the halting problem reduces to it.
> > > Computer security isn't going to

Re: [agi] MindForth achieves True AI functionality

2008-01-27 Thread Stephen Reed
Richard, thanks for the book tip on Eysenck & Keane. I just ordered it
from Amazon. Cheers.
-Steve

Stephen L. Reed
Artificial Intelligence Researcher
http://texai.org/blog
http://texai.org
3008 Oak Crest Ave.
Austin, Texas, USA 78704
512.791.7860

- Original Message From: Richard Loosemo

Re: Singularity Outcomes [WAS Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-27 Thread Matt Mahoney
--- Vladimir Nesov <[EMAIL PROTECTED]> wrote:
> You don't NEED intrusion detection if intrusion cannot be done. If
> your software doesn't read anything from outside, it's not possible to
> attack it. If it reads that data and correctly does nothing with it,
> it's not possible to attack it. If it
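
A toy sketch of the quoted claim: a program whose only response to
untrusted input is to match it against a closed whitelist "correctly
does nothing" with everything else (the command set and names here are
illustrative assumptions, not from the thread's software):

ALLOWED = {"status", "ping"}          # the entire external interface

def handle(raw: bytes) -> str:
    try:
        cmd = raw.decode("ascii").strip()
    except UnicodeDecodeError:
        return "rejected"             # malformed bytes have no effect
    if cmd not in ALLOWED:
        return "rejected"             # unexpected input has no effect
    return "ok: " + cmd               # fixed responses, no interpretation

The hard part, as the disagreement in this thread shows, is establishing
that nontrivial real-world parsers and libraries actually behave this way.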

Re: Singularity Outcomes [WAS Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-27 Thread Vladimir Nesov
On Jan 28, 2008 4:53 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> --- Vladimir Nesov <[EMAIL PROTECTED]> wrote:
> > You don't NEED intrusion detection if intrusion cannot be done. If
> > your software doesn't read anything from outside, it's not possible to
> > attack it. If it reads that data and

Re: Singularity Outcomes [WAS Re: [agi] OpenMind, MindPixel founders both commit suicide

2008-01-27 Thread Ben Goertzel
> Google already knows more than any human,

This is only true, of course, for specific interpretations of the word
"know" ... and NOT for the standard ones...

> and can retrieve the information faster, but it can't launch a
> singularity.

Because, among other reasons, it is not an intelligence