Re: Re: [singularity] Defining the Singularity

2006-10-30 Thread Lúcio de Souza Coelho
On 10/27/06, Matt Mahoney <[EMAIL PROTECTED]> wrote: (...) Orwell's 1984 predicted a world where a totalitarian government watched your every move. What he failed to predict is that it would happen in a democracy. People want surveillance. You want cameras in businesses for better security.

Re: [singularity] Defining the Singularity

2006-10-27 Thread Matt Mahoney
- Original Message From: Richard Loosemore <[EMAIL PROTECTED]> To: singularity@v2.listbox.com Sent: Friday, October 27, 2006 10:30:52 AM Subject: Re: [singularity] Defining the Singularity Matt, This is a textbook example of the way that all discussions of the consequences of a singularity tend to go. What

Re: [singularity] Defining the Singularity

2006-10-27 Thread Richard Loosemore
don't cooperate... Vinge describes the singularity as the end of the human era. I think your nervousness is justified. -- Matt Mahoney, [EMAIL PROTECTED] - Original Message From: deering <[EMAIL PROTECTED]> To: singularity@v2.listbox.com Sent: Thursday, October 26, 2006 7:

Re: [singularity] Defining the Singularity

2006-10-26 Thread Kaj Sotala
On 10/27/06, deering <[EMAIL PROTECTED]> wrote: All this talk about trying to make a SAI Friendly makes me very nervous. You're giving a superhumanly powerful being a set of motivations without an underlying rationale. That's a religion. The only rational thing to do is to build an SAI without

Re: [singularity] Defining the Singularity

2006-10-26 Thread Matt Mahoney
October 26, 2006 7:56:06 PM Subject: Re: [singularity] Defining the Singularity All this talk about trying to make a SAI Friendly makes me very nervous. You're giving a superhumanly powerful being a set of motivations without an underlying rationale. That's a religion. The only rati

Re: Re: [singularity] Defining the Singularity

2006-10-26 Thread Lúcio de Souza Coelho
On 10/26/06, deering <[EMAIL PROTECTED]> wrote: (...) The only rational thing to do is to build an SAI without any preconceived ideas of right and wrong, and let it figure it out for itself. What makes you think that protecting humanity is the greatest good in the universe? (...) Hundreds of t

Re: [singularity] Defining the Singularity

2006-10-26 Thread Richard Loosemore
Matt Mahoney wrote: - Original Message From: Starglider <[EMAIL PROTECTED]> To: singularity@v2.listbox.com Sent: Thursday, October 26, 2006 4:21:45 AM Subject: Re: [singularity] Defining the Singularity What I'm not sure about is that you gain anything from 'neura

Re: [singularity] Defining the Singularity

2006-10-26 Thread Richard Loosemore
deering wrote: All this talk about trying to make a SAI Friendly makes me very nervous. You're giving a superhumanly powerful being a set of motivations without an underlying rationale. That's a religion. Your comments are a little baffling: what do you mean by giving it motivations withou

Re: [singularity] Defining the Singularity

2006-10-26 Thread deering
All this talk about trying to make a SAI Friendly makes me very nervous.  You're giving a superhumanly powerful being a set of motivations without an underlying rationale.  That's a religion.   The only rational thing to do is to build an SAI without any preconceived ideas of right and wrong

Re: [singularity] Defining the Singularity

2006-10-26 Thread Matt Mahoney
- Original Message From: Starglider <[EMAIL PROTECTED]> To: singularity@v2.listbox.com Sent: Thursday, October 26, 2006 4:21:45 AM Subject: Re: [singularity] Defining the Singularity >What I'm not sure about is that you gain anything from 'neural' or >'

Re: Re: [singularity] Defining the Singularity

2006-10-26 Thread Ben Goertzel
Hi, About hybrid/integrative architectures, Michael Wilson said: I'd agree that it looks good when you first start attacking the problem. Classic ANNs have some demonstrated competencies, classic symbolic AI has some different demonstrated competencies, as do humans and existing non-AI software.

Re: [singularity] Defining the Singularity

2006-10-26 Thread Starglider
Matt Mahoney wrote: >> 'Access to' isn't the same thing as 'augmented with' of course, but I'm >> not sure exactly what you mean by this (and I'd rather wait for you to >> explain than guess). > > I was referring to one possible implementation of AGI consisting of part > neural > or brainlike imp

Re: [singularity] Defining the Singularity

2006-10-25 Thread Matt Mahoney
- Original Message From: Starglider <[EMAIL PROTECTED]> To: singularity@v2.listbox.com Sent: Wednesday, October 25, 2006 2:32:27 PM Subject: Re: [singularity] Defining the Singularity >All AGIs implemented on general purpose computers will have access to >'conve

Re: [singularity] Defining the Singularity

2006-10-25 Thread Starglider
My apologies for the duplication of my previous post; I thought my mail client failed to send the original, but actually it just dropped the echo from the server. Matt Mahoney wrote: > Michael Wilson wrote: >> Hybrid approaches (e.g. what Ben's probably envisioning) are almost certainly >> better

Re: [singularity] Defining the Singularity

2006-10-25 Thread Matt Mahoney
- Original Message From: Starglider <[EMAIL PROTECTED]> To: singularity@v2.listbox.com Sent: Tuesday, October 24, 2006 2:54:47 PM Subject: Re: [singularity] Defining the Singularity >Hybrid approaches (e.g. what Ben's probably envisioning) are almost certainly >bett

Re: [singularity] Defining the Singularity

2006-10-25 Thread Richard Loosemore
Starglider wrote: I have no wish to rehash the fairly futile and extremely disruptive discussion of Loosemore's assertions that occurred on the SL4 mailing list. I am willing to address the implicit questions/assumptions about my own position. You may not have noticed that at the end of my pre

Re: [singularity] Defining the Singularity

2006-10-24 Thread Starglider
I'll try and avoid a repeat of the lengthy, fairly futile and extremely disruptive discussion of Loosemore's assertions that occurred on the SL4 mailing list. I am willing to address the implicit questions/assumptions about my own position. Richard Loosemore wrote: > The contribution of complex

Re: Re: [singularity] Defining the Singularity

2006-10-24 Thread Ben Goertzel
Loosemore wrote: > The motivational system of some types of AI (the types you would > classify as tainted by complexity) can be made so reliable that the > likelihood of them becoming unfriendly would be similar to the > likelihood of the molecules of an Ideal Gas suddenly deciding to split > int

Re: [singularity] Defining the Singularity

2006-10-24 Thread Starglider
I have no wish to rehash the fairly futile and extremely disruptive discussion of Loosemore's assertions that occurred on the SL4 mailing list. I am willing to address the implicit questions/assumptions about my own position. Richard Loosemore wrote: > The contribution of complex systems science i

Re: [singularity] Defining the Singularity

2006-10-24 Thread Richard Loosemore
Starglider wrote: You know my position on 'complex systems science'; yet to do anything useful, unlikely to ever help in AGI, would create FAI-incompatible systems even if it could. And you know my position is that this is completely wrong. For the sake of those who do not know about this d

Re: [singularity] Defining the Singularity

2006-10-23 Thread Starglider
On 23 Oct 2006 at 13:26, Ben Goertzel wrote: > Whereas, my view is that it is precisely the effective combination of > probabilistic logic with complex systems science (including the notion of > emergence) that will lead to, finally, a coherent and useful theoretical > framework for designing an

Re: [singularity] Defining the Singularity

2006-10-23 Thread Ben Goertzel
Though I have remained often-publicly opposed to emergence and 'fuzzy' design since first realising what the true consequences (of the heavily enhanced-GA-based system I was working on at the time) were, as far as I know I haven't made that particular mistake again. Whereas, my view is that it is preci

Re: [singularity] Defining the Singularity

2006-10-23 Thread Starglider
On 23 Oct 2006 at 12:59, Ben Goertzel wrote: >>> Ditto with just about anything else that's at all innovative -- e.g. was >>> Einstein's General Relativity a fundamental new breakthrough, or just a >>> tweak on prior insights by Riemann and Hilbert? >> >> I wonder if this is a sublime form of

Re: [singularity] Defining the Singularity

2006-10-23 Thread Ben Goertzel
Hi, > Ditto with just about anything else that's at all innovative -- e.g. was > Einstein's General Relativity a fundamental new breakthrough, or just a > tweak on prior insights by Riemann and Hilbert? I wonder if this is a sublime form of irony for a horribly naïve and arrogant analogy to GR I drew

Re: [singularity] Defining the Singularity

2006-10-23 Thread Samantha Atkins
On Oct 23, 2006, at 7:39 AM, Ben Goertzel wrote: Michael, I think your summary of the situation is in many respects accurate; but, an interesting aspect you don't mention has to do with the disclosure of technical details... In the case of Novamente, we have sufficient academic credibil

Re: [singularity] Defining the Singularity

2006-10-23 Thread Starglider
On 23 Oct 2006 at 10:39, Ben Goertzel wrote: > In the case of Novamente, we have sufficient academic credibility and know-how > that we could easily publish a raft of journal papers on the details of > Novamente's design and preliminary experimentation. That bumps your objective success probabil

Re: [singularity] Defining the Singularity

2006-10-23 Thread Ben Goertzel
Michael, I think your summary of the situation is in many respects accurate; but, an interesting aspect you don't mention has to do with the disclosure of technical details... In the case of Novamente, we have sufficient academic credibility and know-how that we could easily publish a raft of journal

Re: [singularity] Defining the Singularity

2006-10-23 Thread Ben Goertzel
I think Mark's observation is correct. Anti-aging is far easier to fund than AGI because there are a lot more people interested in preserving their own lives than in creating AGI. Furthermore, the M-prize money is to fund a **prize**, not directly to fund research on some particular project...

Re: [singularity] Defining the Singularity

2006-10-23 Thread Starglider
On 22 Oct 2006 at 17:22, Samantha Atkins wrote: > It is a lot easier I imagine to find many people willing and able to > donate on the order of $100/month indefinitely to such a cause than to > find one or a few people to put up the entire amount. I am sure that has > already been kicked around.  W

Re: [singularity] Defining the Singularity

2006-10-22 Thread Mark Nuzzolilo II
Well, there is funding like in the Methuselah Mouse project. I am one of "the 300" myself. With enough interested people it should not be that hard to raise $5 million even on a very long term project. Most of us seem to think that conquering aging will take longer than AGI but there are

Re: [singularity] Defining the Singularity

2006-10-22 Thread Ben Goertzel
Hi, I know you must be frustrated with fund raising, but investor reluctance is understandable from the perspective that for decades now there has always been someone who said we're N years from full blown AI, and then N years passed with nothing but narrow AI progress. Of course, someone will end up

Re: [singularity] Defining the Singularity

2006-10-22 Thread Ben Goertzel
Japan, despite a lot of interest back in 5th Generation computer days, seems to have a difficult time innovating in advanced software. I am not sure why. I talked recently, at an academic conference, with the guy who directs robotics research labs within ATR, the primary Japanese government resea

RE: [singularity] Defining the Singularity

2006-10-22 Thread Peter Voss
rmal sense) problem.   Peter   From: Ben Goertzel [mailto:[EMAIL PROTECTED] Sent: Sunday, October 22, 2006 3:36 PM To: singularity@v2.listbox.com Subject: Re: [singularity] Defining the Singularity   Hi,   I know you must be frustrated with fund raising, but investor relunct

Re: [singularity] Defining the Singularity

2006-10-22 Thread Chuck Esterbrook
On 10/22/06, Ben Goertzel <[EMAIL PROTECTED]> wrote: > I know you must be frustrated with fund raising, but investor > reluctance is understandable from the perspective that for decades > now there has always been someone who said we're N years from full > blown AI, and then N years passed with

Re: [singularity] Defining the Singularity

2006-10-22 Thread Chuck Esterbrook
On 10/22/06, Ben Goertzel <[EMAIL PROTECTED]> wrote: This particular potential investor is "still thinking about it" ... he's currently on vacation and will discuss further when he gets back. Of course this was an unusual conversation due to the amputation theme (and the amount of wine being con

Re: [singularity] Defining the Singularity

2006-10-22 Thread Samantha Atkins
On Oct 22, 2006, at 11:32 AM, Ben Goertzel wrote: Hi, Mike Deering wrote: If you really were interested in working on the Singularity you would be designing your education plan around getting a job at the NSA. The NSA has the budget, the technology, the skill set, and the motivation to build the S

Re: [singularity] Defining the Singularity

2006-10-22 Thread Ben Goertzel
Hi, Mike Deering wrote: If you really were interested in working on the Singularity you would be designing your education plan around getting a job at the NSA.  The NSA has the budget, the technology, the skill set, and the motivation to build the Singularity.  Everyone else, universities, priva

Re: [singularity] Defining the Singularity

2006-10-22 Thread Samantha Atkins
Well, there is funding like in the Methuselah Mouse project.  I am one of "the 300" myself.   With enough interested people it should not be that hard to raise $5 million even on a very long term project.  Most of us seem to think that conquering aging will take longer than AGI but there are fairl

Re: [singularity] Defining the Singularity

2006-10-22 Thread deering
All this talk about proving something before doing it is beside the point.  We, as a species, as a government, as scientists, as individuals, never prove anything before we try it.  We just don't.  Think of the many examples of new stuff we have done.  Have we proved any of them would be saf

Re: [singularity] Defining the Singularity

2006-10-22 Thread Starglider
Samantha Atkins wrote: > Of late I feel a lot of despair because I see lots of brilliant people > seemingly mired in endlessly rehashing what-ifs, arcane philosophical > points and willing to put off actually creating greater than human > intelligence and transhuman tech indefinitely until they can

Re: [singularity] Defining the Singularity

2006-10-21 Thread Samantha Atkins
On Oct 20, 2006, at 2:14 AM, Michael Anissimov wrote: Sometimes, Samantha, it seems like you have little faith in any possible form of intelligence, and that the only way for one to be safe/happy is to be isolated from everything. I sometimes get this impression from libertarians (not to say that I'm

Re: [singularity] Defining the Singularity

2006-10-21 Thread Russell Wallace
On 10/22/06, Samantha Atkins <[EMAIL PROTECTED]> wrote: I am sorry you seem to have utterly missed the point of my remarks. The world heads straight to destruction and all of us toward true death while we daydream of the wonders a superintelligent being we do not know how to build could do if it was to

Re: [singularity] Defining the Singularity

2006-10-21 Thread Samantha Atkins
On Oct 20, 2006, at 2:14 AM, Michael Anissimov wrote: Samantha, Considering the state of the world today I don't see how changes sufficient to be really helpful can be anything but disruptive of the status quo. Being non-disruptive per se is a non-goal. Ah, that's what it seems like! But

Re: [singularity] Defining the Singularity

2006-10-21 Thread deering
Michael Anissimov, as usual, we are speaking past each other.  I actually agree with you much more than you give me credit for.  With the exponential pace of technological advancement and the new powers inherent in superhuman intelligence, and molecular engineering, the future can't help but

Re: [singularity] Defining the Singularity

2006-10-20 Thread Michael Anissimov
Samantha, Considering the state of the world today I don't see how changes sufficient to be really helpful can be anything but disruptive of the status quo. Being non-disruptive per se is a non-goal. Ah, that's what it seems like! But I think we'd be surprised at the good a superintelligenc

Re: [singularity] Defining the Singularity

2006-10-18 Thread Samantha Atkins
On Oct 17, 2006, at 2:45 PM, Michael Anissimov wrote: Mike, On 10/10/06, deering <[EMAIL PROTECTED]> wrote: Going beyond the definition of Singularity we can make some educated guesses about the most likely conditions under which the Singularity will occur. Due to technological synergy,

Re: [singularity] Defining the Singularity

2006-10-17 Thread Michael Anissimov
Mike, On 10/10/06, deering <[EMAIL PROTECTED]> wrote: Going beyond the definition of Singularity we can make some educated guesses about the most likely conditions under which the Singularity will occur. Due to technological synergy, the creation of STHI will happen coincident with the achievem

RE: [singularity] Defining the Singularity

2006-10-11 Thread Bruce LaDuke
Singularity is a word that someone made up with a definition/meaning that someone advanced from an existing knowledge context and somewhere along the way society accepted some definition and that term itself. You could just as well call it zoombalabala if society would have accepted it (there a

Re: [singularity] Defining the Singularity

2006-10-10 Thread deering
The word 'Singularity' in the futurism context, rather than the mathematical or science context, is a label for something in the real world.  Something that someone noticed and said, "Hey, we need a word for this!"  That someone was Vernor Vinge and what he noticed was that technology, due i

Re: [singularity] Defining the Singularity

2006-10-10 Thread Samantha  Atkins
hmm. Someone will please give me a gentle nudge when something is discussed here of actual import to achieving singularity. In the meantime think I will take a siesta. - samantha On Oct 10, 2006, at 1:01 PM, Richard Leis wrote: The general consensus also depends on the context for which it is being u

Re: [singularity] Defining the Singularity

2006-10-10 Thread Bruce LaDuke
-Original Message Follows From: "Ben Goertzel" <[EMAIL PROTECTED]> Reply-To: singularity@v2.listbox.com To: singularity@v2.listbox.com Subject: Re: [singularity] Defining the Singularity Date: Tue, 10 Oct 2006 13:24:17 -0400 Indeed... What we are running into here is simply the pove

Re: [singularity] Defining the Singularity

2006-10-10 Thread Richard Leis
The general consensus also depends on the context for which it is being used.  When discussing the Singularity among AGI professionals, "smarter-than-human intelligence" can probably be assumed.  Perhaps it would be more useful for the commentator to describe which "Singularity" they are discussing

Re: [singularity] Defining the Singularity

2006-10-10 Thread Nathan Barna
Michael makes a good point that it's intellectually permissible to argue ad nauseam over side claims but that it's still important to have a general consensus on an explicit description of the very idea that would allow almost every literate person to elicit the concept of the Singularity in the f

Re: [singularity] Defining the Singularity

2006-10-10 Thread Lúcio de Souza Coelho
On 10/10/06, Hank Conn <[EMAIL PROTECTED]> wrote: (...) My problem with Michael's original definition was the statement about producing a genetically engineered child that was smarter-than-human, and allowing that to be defined as the Singularity. I think in order for a point in this recursive se

Re: [singularity] Defining the Singularity

2006-10-10 Thread Hank Conn
On 10/10/06, Ben Goertzel <[EMAIL PROTECTED]> wrote: Hank, On 10/10/06, Hank Conn <[EMAIL PROTECTED]> wrote: > The all-encompassing definition of the Singularity is the point at which an > intelligence gains the ability to recursively self-improve the underlying > computational processes of its intell

Re: [singularity] Defining the Singularity

2006-10-10 Thread Ben Goertzel
On the other hand (to add a little levity to the conversation), a very avid 2012-ite I knew last year informed me that "You should just mix eight ounces of Robitussin with eight ounces of vodka and drink it fast -- you'll find your own private Singularity, right there!!" ;-pp On 10/10/06, Lúci

Re: [singularity] Defining the Singularity

2006-10-10 Thread Lúcio de Souza Coelho
On 10/10/06, BillK <[EMAIL PROTECTED]> wrote: (...) If next year a quad-core pc becomes a self-improving AI in a basement in Atlanta, then disappears an hour later into another dimension, then so far as the rest of the world is concerned, the Singularity never happened. (...) Yep, I also tend to

Re: [singularity] Defining the Singularity

2006-10-10 Thread Ben Goertzel
Indeed... What we are running into here is simply the poverty of compact formal definitions. AI researchers long ago figured out that it's difficult to create a compact formal definition of "chair" or "arch" or "table"... Ditto for "Singularity", not surprisingly... This doesn't mean compact def

Re: [singularity] Defining the Singularity

2006-10-10 Thread BillK
On 10/10/06, Ben Goertzel wrote: But from the perspective of deeper understanding, I don't see why it's critical to agree on a single definition, or that there be a compact and crisp definition. It's a complex world and these are complex phenomena we're talking about, as yet dimly understood.

Re: [singularity] Defining the Singularity

2006-10-10 Thread Ben Goertzel
Hank, On 10/10/06, Hank Conn <[EMAIL PROTECTED]> wrote: The all-encompassing definition of the Singularity is the point at which an intelligence gains the ability to recursively self-improve the underlying computational processes of its intelligence. I already have that ability -- I'm just ver

Re: [singularity] Defining the Singularity

2006-10-10 Thread Hank Conn
"A single genetically engineered child born with a substantially smarter-than-human IQ would constitute a Singularity"   That is a flaw in your definition.   The all-encompassing definition of the Singularity is the point at which an intelligence gains the ability to recursively self-improve the und

Re: [singularity] Defining the Singularity

2006-10-10 Thread Ben Goertzel
Hi, The reason that so many in the intellectual community see Singularity discussion as garbage is because there is so little definitional consensus that it's close to impossible to determine what's actually being discussed. I doubt this... I think the reason that Singularity discussion is di

[singularity] Defining the Singularity

2006-10-10 Thread Michael Anissimov
The Singularity definitions being presented here are incredibly confusing and contradictory. If I were a newcomer to the community and saw this thread, I'd say that this word "Singularity" is so poorly defined, it's useless. Everyone is talking past each other. As Nick Hay has pointed out, the