Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Eric Burton
I kind of feel this way too. It should be easy to get neural nets embedded in VR to achieve the intelligence of, say, magpies or finches. But the same approaches you might use, top-down ones, may not scale to human level. Given a 100x increase in workstation capacity, I don't see why we can't start

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Terren Suydam
comments below... --- On Sat, 8/23/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: > The last post by Eliezer provides handy imagery for this point (http://www.overcomingbias.com/2008/08/mirrors-and-pai.html). You can't have an AI of perfect emptiness, without any goals at all, becaus

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Mike Tintner
Terren:> Just wanted to add something, to bring it back to feasibility of embodied/unembodied approaches. Using the definition of embodiment I described, it needs to be said that it is impossible to specify the goals of the agent, because in so doing, you'd be passing it information in an unemb

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Vladimir Nesov
On Sat, Aug 23, 2008 at 11:38 PM, Terren Suydam <[EMAIL PROTECTED]> wrote: > Just wanted to add something, to bring it back to feasibility of embodied/unembodied approaches. Using the definition of embodiment I described, it needs to be said that it is impossible to specify the goals of the

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Terren Suydam
Just wanted to add something, to bring it back to feasibility of embodied/unembodied approaches. Using the definition of embodiment I described, it needs to be said that it is impossible to specify the goals of the agent, because in so doing, you'd be passing it information in an unembodied way

Re: [agi] I Made a Mistake

2008-08-23 Thread Terren Suydam
No worries, that's why I heartily advocate doing exactly what you did, but not sending it. It's a lesson I've learned the hard way more times than I care to admit. --- On Sat, 8/23/08, Eric Burton <[EMAIL PROTECTED]> wrote: > Thanks Terren, I shouldn't have got angry so fast. One thing I wor

Re: [agi] I Made a Mistake

2008-08-23 Thread Eric Burton
Thanks Terren, I shouldn't have got angry so fast. One thing I worry about constantly when going places or discussing anything is the quality of discourse. On 8/23/08, Terren Suydam <[EMAIL PROTECTED]> wrote: > Eric, You lower the quality of this list with comments like that. It's the kind o

Re: [agi] I Made a Mistake

2008-08-23 Thread Terren Suydam
Eric, You lower the quality of this list with comments like that. It's the kind of thing that got people wondering a month ago whether moderation is necessary on this list. If we're all adults, moderation shouldn't be necessary. Jim, do us all a favor and don't respond to that, as tempting as

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Terren Suydam
Yeah, that's where the misunderstanding is... "low level input" is too fuzzy a concept. I don't know if this is the accepted mainstream definition of embodiment, but this is how I see it. The thing that distinguishes an embodied agent from an unembodied one is whether the agent is given pre-st

Re: [agi] I Made a Mistake

2008-08-23 Thread Eric Burton
Stupid fundamentalist troll garbage. On 8/22/08, Jim Bromer <[EMAIL PROTECTED]> wrote: > I just discovered that I made a very obvious blunder on my theory about Logical Satisfiability last November. It was a "what was I thinking" kind of error. No sooner did I discover this error a couple

Re: Information theoretic approaches to AGI (was Re: [agi] The Necessity of Embodiment)

2008-08-23 Thread Eric Burton
> These have profound impacts on AGI design. First, AIXI is (provably) not computable, which means there is no easy shortcut to AGI. Second, universal intelligence is not computable because it requires testing in an infinite number of environments. Since there is no other well accepted test
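For context: the universal intelligence measure cited in the quoted passage is, presumably, Legg and Hutter's; a rough sketch of the usual definition (notation approximate, not taken from the thread) is

    \Upsilon(\pi) := \sum_{\mu \in E} 2^{-K(\mu)} \, V^{\pi}_{\mu},

where E is the class of computable environments, V^{\pi}_{\mu} is the expected total reward agent \pi earns in environment \mu, and K(\mu) is the Kolmogorov complexity of \mu. The sum ranges over infinitely many environments and K is itself incomputable, which is the sense in which the measure cannot be evaluated directly.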

Re: Information theoretic approaches to AGI (was Re: [agi] The Necessity of Embodiment)

2008-08-23 Thread Jim Bromer
On Sat, Aug 23, 2008 at 7:00 AM, William Pearson <[EMAIL PROTECTED]> wrote: > 2008/8/23 Matt Mahoney <[EMAIL PROTECTED]>: >> Valentina Poletti <[EMAIL PROTECTED]> wrote: >>> I was wondering why no-one had brought up the information-theoretic aspect of this yet. >> It has been studied. For e

Re: Information theoretic approaches to AGI (was Re: [agi] The Necessity of Embodiment)

2008-08-23 Thread William Pearson
2008/8/23 Matt Mahoney <[EMAIL PROTECTED]>: > Valentina Poletti <[EMAIL PROTECTED]> wrote: >> I was wondering why no-one had brought up the information-theoretic aspect of this yet. > It has been studied. For example, Hutter proved that the optimal strategy of a rational goal seeking agent
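For context: the Hutter result being quoted is presumably the AIXI optimality argument; a rough sketch of the standard AIXI action-selection rule (notation approximate, not taken from the thread) is

    a_k := \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m} \big[ r_k + \cdots + r_m \big] \sum_{q \,:\, U(q, a_{1:m}) = o_{1:m} r_{1:m}} 2^{-\ell(q)},

where U is a universal Turing machine, q ranges over programs consistent with the interaction history up to the horizon m, and \ell(q) is the length of q. The expectimax over all consistent programs is what makes the strategy optimal in Hutter's sense, and also what makes it incomputable, as noted in the quoted Mahoney passage above.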