Re: [agi] The Singularity Forum

2018-06-16 Thread Matt Mahoney via AGI
It's just math. https://en.wikipedia.org/wiki/Central_limit_theorem When you add independent random variables, their means and variances add, and the distribution of the sum tends to a Gaussian curve. When you multiply instead of add, the same thing happens to the logs of the variables, so the product tends to a log-normal distribution. On Fri, Jun 15, 2018 at
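
A minimal sketch of the point above, assuming NumPy and SciPy are acceptable: a sum of i.i.d. variables is roughly Gaussian, while a product is roughly Gaussian only after taking logs (i.e. it is roughly log-normal).

import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(0)
x = rng.uniform(0.5, 2.0, size=(100_000, 30))   # 30 positive i.i.d. terms per trial

sums = x.sum(axis=1)          # central limit theorem applies directly
products = x.prod(axis=1)     # log(product) = sum of logs, so the CLT applies to the log

print("skew of sums:         ", skew(sums))              # near 0: symmetric, bell-shaped
print("skew of products:     ", skew(products))          # large: heavily right-skewed
print("skew of log(products):", skew(np.log(products)))  # near 0 again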

Re: [agi] The Singularity Forum

2018-06-15 Thread Matt Mahoney via AGI
On Thu, Jun 14, 2018 at 8:04 PM Mark Nuzz via AGI wrote: > > The Singularity analogy was never intended to imply infinite power. Rather, it > represents a point at which understanding and predictability break down and > become impossible. Agreed. Vinge called it an "event horizon" on our

Re: [agi] The Singularity Forum

2018-06-15 Thread Matt Mahoney via AGI
On Thu, Jun 14, 2018 at 10:40 PM Steve Richfield via AGI wrote: > > In the space of real world "problems", I suspect the distribution of > difficulty follows the Zipf function, like pretty much everything else does. A Zipf distribution is a power law distribution. The reason that power law
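
A minimal sketch, assuming NumPy and a hypothetical exponent s = 1.5, of why a Zipf distribution is a power law: its probabilities p(k) ∝ k^(-s) lie on a straight line of slope -s in log-log space.

import numpy as np

s = 1.5                        # hypothetical Zipf exponent
ranks = np.arange(1, 1001)
weights = ranks ** (-s)
p = weights / weights.sum()    # Zipf probabilities over ranks 1..1000

slope, _ = np.polyfit(np.log(ranks), np.log(p), deg=1)
print(slope)                   # ≈ -1.5: the power-law exponent is recovered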

Re: [agi] The Singularity Forum

2018-06-15 Thread johnrose
The patent affirms what I was saying - the app/server sees that others in the same movie theater have dimmed their screens, so it dims the screen for that user. Not AGI... just a db query add-on to a location service... "As another example, a Service node may reference an application that controls user device

Re: [agi] The Singularity Forum

2018-06-15 Thread johnrose
Kimera - I just looked at this a little and, translating: they have a working "AGI" that currently does AI, real AGI is on the roadmap, but they need more funding for marketing, partnerships, and development. Apparently about 80% of their "team/advisers" are non-engineers. The ICO whitepaper page 18

Re: [agi] The Singularity Forum

2018-06-14 Thread Steve Richfield via AGI
In the space of real world "problems", I suspect the distribution of difficulty follows the Zipf function, like pretty much everything else does. The curious thing about the Zipf function is the structure of its extreme tail - it is finite, it drops off fast, and it doesn't encompass much of the

Re: [agi] The Singularity Forum

2018-06-14 Thread Mark Nuzz via AGI
The Singularity analogy was never intended to imply infinite power. Rather, it represents a point at which understanding and predictability break down and become impossible. On Jun 14, 2018 3:59 PM, "Matt Mahoney via AGI" wrote: > Vinge: when humans produce superhuman AI then so can it, only

Re: [agi] The Singularity Forum

2018-06-14 Thread Matt Mahoney via AGI
Vinge: when humans produce superhuman AI then so can it, only faster. A singularity in mathematics is a point where a function (like intelligence over time) goes to infinity. That can't happen in a universe with finite computing power and finite memory. Or by singularity do you mean when AI makes
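
As an illustrative example of that mathematical sense of the word (a sketch, not a claim about any particular growth model), take hyperbolic growth:

  dx/dt = x^2,  x(0) = x_0   =>   x(t) = x_0 / (1 - x_0 t),

which reaches infinity at the finite time t = 1/x_0. With bounded memory and compute, a real growth curve has to level off before any such asymptote.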

Re: [agi] The Singularity Forum

2018-06-14 Thread Steve Richfield via AGI
Matt, My own view is that a human-based singularity is MUCH closer. The problem is NOT a shortage of GFLOPS or suitable software, but rather a repairable problem in our wetware. Sure, a silicon solution might eventually be faster, but why simply wait until then? Apparently, I failed to

Re: [agi] The Singularity Forum

2018-06-14 Thread Matt Mahoney via AGI
The singularity list (and SL4) died years ago. The singularity has been 30 years away for decades now. I guess we got tired of talking about it.

Re: [agi] The Singularity Forum

2018-06-14 Thread MP via AGI
They’ve done demos for Intel in the past IIRC. But the secrecy (and yes, I’m aware how irritating it is) stems from the fact that they haven’t patented it yet and are afraid of their secrets being stolen. BUT I can show you the high level architecture and tell you guys now the core system is basically a

Re: [agi] The Singularity Forum

2018-06-14 Thread Ben Goertzel
Kimera ... I mean they seem like smart people but the rhetoric associated with the project is sufficiently overblown to make me not want to pay attention... On Thu, Jun 14, 2018 at 3:10 PM, MP via AGI wrote: > Speaking of which, anyone here heard of Kimera Systems and their so-called > AGI

Re: [agi] The Singularity Forum

2018-06-14 Thread MP via AGI
Speaking of which, anyone here heard of Kimera Systems and their so-called AGI Nigel? It seems they’re touting a blockchain-powered causal inference engine as this all-encompassing intelligence system. I’m still on the fence. I’m pretty "in" with the company as it is, but even the CEO Mounir

Re: [agi] The Singularity Forum

2018-06-13 Thread MP via AGI
That’s a sad scenario, Ben. On Wed, Jun 13, 2018 at 10:01 PM, Ben Goertzel wrote: > I guess whoever was paying the bills for that list (KurzweilAI?) got bored > and stopped paying it...? On Thu, Jun 14, 2018 at 6:12 AM, Steve Richfield > via AGI wrote: > I tried

[agi] The Singularity Forum

2018-06-13 Thread Steve Richfield via AGI
I tried posting on the Singularity forum, but it bounced. What is the story here? Steve