[agi] The Necessity of Embodiment

2008-08-08 Thread Mike Tintner
Bob: > As a roboticist I can say that a physical body resembling that of a human isn't really all that important. You can build the most sophisticated humanoid possible, but the problems still boil down to how such a machine should be intelligently directed by its software. What embodiment does

Re: [agi] The Necessity of Embodiment

2008-08-08 Thread Bob Mottram
2008/8/8 Mike Tintner <[EMAIL PROTECTED]>: > Now my v. garbled understanding (& please comment) is that those Carnegie > Mellon starfish robots show that such an integrated whole self is both > possible - and perhaps vital - for robots too. Yes I agree with the idea of understanding others throug

[agi] The Necessity of Embodiment..P.S.

2008-08-08 Thread Mike Tintner
The other thing about an embodied perspective is that I have a hunch it may be necessary to understand even inanimate matter from a scientific and not just a personal perspective. (I mention this in case there are any physicists/chemists around who care to comment). IOW we won't understand how

Re: [agi] The Necessity of Embodiment

2008-08-08 Thread Ben Goertzel
I agree w/ Motters ... mirroring is an abstract dynamical process, not a specific biological mechanism ... and is not even specifically tied to embodiment, although the two do work naturally together... I think that embodied (physically or virtually) AI's with in-built bias toward mirroring are th

Re: [agi] The Necessity of Embodiment

2008-08-08 Thread Mike Tintner
Ben : I agree w/ Motters ... mirroring is an abstract dynamical process, not a specific biological mechanism ... and is not even specifically tied to embodiment, although the two do work naturally together... Abstracted.. from what? And by what? (Think about it). Or do you think there re

Re: [agi] The Necessity of Embodiment

2008-08-08 Thread Ben Goertzel
Mike, that all depends on what the meaning of "is" is ... ;-) On Fri, Aug 8, 2008 at 10:27 AM, Mike Tintner <[EMAIL PROTECTED]>wrote: > > > Ben : I agree w/ Motters ... mirroring is an abstract dynamical process, > not a specific biological mechanism ... and is not even specifically tied to > emb

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Brad Paulsen
Mike Tintner wrote: Bob: > As a roboticist I can say that a physical body resembling that of a human isn't really all that important. You can build the most sophisticated humanoid possible, but the problems still boil down to how such a machine should be intelligently directed by its software

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Mike Tintner
Brad: Sigh. Your point of view is heavily biased by the unspoken assumption that AGI must be Turing-indistinguishable from humans. That it must be AGHI. Brad, Literally: "what on earth are you talking about?" What other than human intelligence - symbol & sign-using intelligence - is there?

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Ben Goertzel
On Sat, Aug 9, 2008 at 7:35 AM, Mike Tintner <[EMAIL PROTECTED]>wrote: > Brad: > Sigh. Your point of view is heavily biased by the unspoken assumption that > AGI > must be Turing-indistinguishable from humans. That it must be AGHI. > > Brad, > > Literally: "what on earth are you talking about?"

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Mike Tintner
Ben, I clearly understood/understand this. My point is: are you guys' notions of non-human intelligence anything more than sci-fi fantasy as opposed to serious invention? To be the latter, you must have some half-concrete ideas - however skimpy - of what such intelligence might entail and be d

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Ben Goertzel
On Sat, Aug 9, 2008 at 9:30 AM, Mike Tintner <[EMAIL PROTECTED]>wrote: > Ben, > > I clearly understood/understand this. My point is: are you guys' notions > of non-human intelligence anything more than sci-fi fantasy as opposed to > serious invention? To be the latter, you must have some half-co

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Charles Hixson
Brad Paulsen wrote: ... Sigh. Your point of view is heavily biased by the unspoken assumption that AGI must be Turing-indistinguishable from humans. That it must be AGHI. This is not necessarily a bad idea, it's just the wrong idea given our (lack of) understanding of general intelligence.

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Mike Tintner
Ben, I expressed myself badly. Clearly AGI-ers have ideas & systems, like you, for AGI. But, I suggest, if you examine them, these are all actually humanoid - clear adaptations of human intelligence. Nothing wrong with that. It's just that AGI-ers often *talk* as if they are developing, or cou

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Eric Burton
>But, I suggest, if you examine them, these are all actually humanoid - clear >adaptations >of human intelligence. Nothing wrong with that. It's just that AGI-ers often >*talk* as if >they are developing, or could develop, a truly non-human intelligence - a >brain that >could think in *fundament

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Ben Goertzel
Mike, This is partly just a matter of quibbling over word usage. OpenCogPrime has many commonalities with, and many differences from, the human brain-mind. Among the key differences -- OCP has a coherent, top-down goal system -- OCP uses probabilistic inference at a foundational level, rather than ha

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Mike Tintner
The real big deal is: how would a non-human brain/robot *think* differently? It's easy to envisage radically different substrates. Here, for example, is a pure sci-fi form of radically different thinking. Imagine a creation that could not just entertain images of things, but could physically be

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Ben Goertzel
yes, that kind of mind would surely think very differently than a human, just as would a mind that knew only mathematics ... but, from a practical perspective, it seems more useful to think about minds that are roughly similar to human minds, yet better adapted to existing computer hardware, and la

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Eric Burton
Yes. An electronic mind need never forget important facts. It'd enjoy instant recall and on-demand instantaneous binary-precision arithmetic and all the other upshots of the substrate. On the other hand it couldn't take, say, morphine!

Re: [agi] The Necessity of Embodiment

2008-08-09 Thread Eric Burton
It's me again: electronic minds of any architecture would also have superior extensibility and open-endedness compared to biological ones. The behaviours embarked on by such a mind could be incomprehensible to the humans its mind was modelled on. I'm sure I'm right about this.

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Brad Paulsen
Charles, I don't think I've misunderstood what Turing was proposing. At least not any more than the thousands of other people who have written about Turing and his test over the decades: http://en.wikipedia.org/wiki/Turing_test http://www.zompist.com/turing.html (Twelve reasons to toss the T

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Mike Tintner
Eric: Yes. An electronic mind need never forget important facts. It'd enjoy instant recall and on-demand instantaneous binary-precision arithmetic and all the other upshots of the substrate. On the other hand it couldn't take, say, morphine! It would though, presumably, have major problems

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Mike Tintner
Ben: but, from a practical perspective, it seems more useful to think about minds that are roughly similar to human minds, yet better adapted to existing computer hardware, and lacking humans' most severe ethical and motivational flaws Well a) I think that we now agree that you are engaged in a b

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
On Sun, Aug 10, 2008 at 9:02 AM, Mike Tintner <[EMAIL PROTECTED]> wrote: > Ben: but, from a practical perspective, it seems more useful to think > about minds that are roughly similar to human minds, yet better adapted to > existing computer hardware, and lacking humans' most severe ethical and > mo

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Mike Tintner
Ben, Obviously an argument too massive to be worth pursuing in detail. But just one point - your arguments are essentially specialist focussing on isolated anatomical rather than cognitive features, (and presumably we (science) don't yet have the general, systemic overview necessary to apprecia

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
Agree that the human mind/brain has evolved to work reasonably effectively in a holistic way, in spite of the obvious limitations of various of its components... To give a more cognitive example of a needless limitation of the human mind: why can't we just remember a few hundred numbers in seque

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread William Pearson
2008/8/10 Mike Tintner <[EMAIL PROTECTED]>: > Just as you are in a rational, specialist way picking off isolated features, > so, similarly, rational, totalitarian thinkers used to object to the crazy, > contradictory complications of the democratic, "conflict" system of > decisionmaking by contrast

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread wannabe
Interesting conversation. I wanted to suggest something about how an AGI might be qualitatively different from human. One possible difference could be an overriding thoroughness. People generally don't put in the effort to consider all the possibilities in the decisions they make, but computers

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
> > And I've said it before, but it bears repeating in this context. Real > intelligence requires that mistakes be made. And that's at odds with > regular programming, because you are trying to write programs that don't > make mistakes, so I have to wonder how serious people really would be > abo

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread wannabe
me: >> And I've said it before, but it bears repeating in this context. Real >> intelligence requires that mistakes be made. And that's at odds with >> regular programming, because you are trying to write programs that don't >> make mistakes, so I have to wonder how serious people really would be

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
yes. This is one of the reasons why I like "virtual world and game AI" as a commercial vehicle for the popularization and monetization of early-stage AGI's. No one cares that much if a game AI occasionally does something dumb. It may even be considered charmingly funny. Much more so than if the

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Mike Tintner
Will, Maybe I should have explained the distinction more fully. A totalitarian system is one with an integrated system of decisionmaking, and unified goals. A "democratic", "conflict" system is one that takes decisions with opposed, conflicting philosophies and goals (a la Democratic vs Republi

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
On Sun, Aug 10, 2008 at 5:52 PM, Mike Tintner <[EMAIL PROTECTED]> wrote: > Will, > > Maybe I should have explained the distinction more fully. A totalitarian > system is one with an integrated system of decisionmaking, and unified > goals. A "democratic", "conflict" system is one that takes decision

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread William Pearson
2008/8/10 Mike Tintner <[EMAIL PROTECTED]>: > Will, > > Maybe I should have explained the distinction more fully. A totalitarian > system is one with an integrated system of decisionmaking, and unified > goals. A "democratic", "conflict" system is one that takes decisions with > opposed, conflicting

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Mike Tintner
Ben: By true rationality I simply mean making judgments in accordance with probability theory based on one's goals and the knowledge at one's disposal. Which is not applicable to AGI problems, which are wicked and ill-structured, and where you cannot calculate probabilities, and are not sure of
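The notion of rationality Ben invokes here is the standard Bayesian decision-theoretic one: update beliefs with Bayes' rule, then choose the action with the highest expected utility. A toy sketch (my illustration, not code from the thread; all hypothesis names and numbers are made up):

```python
# Toy Bayesian decision rule: posterior update, then expected-utility choice.
# Everything below is illustrative, not from the mailing-list discussion.

def posterior(prior, likelihood, evidence):
    """Bayes' rule over a discrete hypothesis space."""
    unnorm = {h: prior[h] * likelihood[h][evidence] for h in prior}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

def best_action(post, utility):
    """Action maximizing expected utility E[U(action, hypothesis)]."""
    return max(utility, key=lambda a: sum(post[h] * utility[a][h] for h in post))

prior = {"rain": 0.3, "dry": 0.7}
likelihood = {"rain": {"clouds": 0.9}, "dry": {"clouds": 0.2}}
utility = {"umbrella": {"rain": 1.0, "dry": 0.0},
           "no umbrella": {"rain": -1.0, "dry": 0.5}}

post = posterior(prior, likelihood, "clouds")
action = best_action(post, utility)
```

Mike's objection, in these terms, is that for wicked, ill-structured problems none of these tables -- the hypothesis space, the likelihoods, the utilities -- can be written down in advance.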

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
> Or even simpler problems, like : how were you to handle the angry Richard > recently? Your response, and I quote: "Aaargh!" (as in "how on earth do I > calculate my probabilities and Bayes?" and "which school of psychological > thought is relevant here?") Now you're talking AGI. There is no ratio

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Mike Tintner
Will: thought you meant rational as applied to the system builder :P Consistency of systems is overrated, as far as I am concerned. Consistency is only important if the lack of it is ever exploited. A system that alters itself to be consistent after the fact is sufficient. Do you remember when I

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread John LaMuth
- Original Message - From: Ben Goertzel To: agi@v2.listbox.com Sent: Sunday, August 10, 2008 8:00 AM Subject: Re: [agi] The Necessity of Embodiment ... the best approaches are 1) wait till the brain scientists scan the brain well enough that, by combining appropriate

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread Ben Goertzel
Sent:* Sunday, August 10, 2008 8:00 AM > *Subject:* Re: [agi] The Necessity of Embodiment > > > ... the best approaches are > > 1) wait till the brain scientists scan the brain well enough that, by > combining appropriate neurocognitive theory w/ brain scan results, we can >

Re: [agi] The Necessity of Embodiment

2008-08-10 Thread John LaMuth
expedited. John L www.emotionchip.net - Original Message - From: Ben Goertzel To: agi@v2.listbox.com Sent: Sunday, August 10, 2008 5:58 PM Subject: Re: [agi] The Necessity of Embodiment I am aware of textbook neuroscience, but it really does not tell you enough to let you emulate the br

Re: [agi] The Necessity of Embodiment

2008-08-11 Thread William Pearson
2008/8/11 Mike Tintner <[EMAIL PROTECTED]>: > Will: thought you meant rational as applied to the system builder :P > Consistency of systems is overrated, as far as I am concerned. > Consistency is only important if the lack of it is ever exploited. A > system that alters itself to be consistent af

Re: [agi] The Necessity of Embodiment

2008-08-11 Thread Mike Tintner
Ben/MT: Cog sci treats humans as if we are rational, consistent thinkers/computers. No, it just doesn't. This is an egregious oversimplification and mis-analysis of the cognitive science community and its research and ideas. Look at the heuristics and biases literature, for one thing... a

Re: [agi] The Necessity of Embodiment

2008-08-11 Thread Jim Bromer
On Sun, Aug 10, 2008 at 6:43 PM, Ben Goertzel <[EMAIL PROTECTED]> wrote: > You'll be convinced in time, for instance if an OpenCogPrime instance starts > refuting your arguments on this mailing list! That is not very likely to occur if the 'magic' of agi is absolutely dependent on embodiment, and

Re: [agi] The Necessity of Embodiment

2008-08-11 Thread Jim Bromer
I was probably too confrontational with my last message. I know that there are no solid reasons to believe that some kind of embodiment is absolutely necessary for the advancement of agi. However, there is nothing wrong with making that effort if someone wants to, just as there is nothing wrong w

Re: [agi] The Necessity of Embodiment

2008-08-13 Thread Jim Bromer
There is another reason why embodied agi is useful. That is because the challenge will provide some discipline for the programmer who might otherwise never confront the structural problems that I believe are fundamental to the problem of developing genuine agi. Jim Bromer

Re: [agi] The Necessity of Embodiment

2008-08-13 Thread James Ratcliff
embodiment? ___ James Ratcliff - http://falazar.com Looking for something... --- On Wed, 8/13/08, Jim Bromer <[EMAIL PROTECTED]> wrote: From: Jim Bromer <[EMAIL PROTECTED]> Subject: Re: [agi] The Necessity of Embodiment To: agi@v2.listbox.com Dat

Re: [agi] The Necessity of Embodiment

2008-08-13 Thread Jim Bromer
__ > James Ratcliff - http://falazar.com > Looking for something... > > --- On Wed, 8/13/08, Jim Bromer <[EMAIL PROTECTED]> wrote: > > From: Jim Bromer <[EMAIL PROTECTED]> > Subject: Re: [agi] The Necessity of Embodiment > To: agi@v2.listbox.com > Date: Wed

Re: [agi] The Necessity of Embodiment

2008-08-13 Thread Jonathan El-Bizri
On Wed, Aug 13, 2008 at 5:04 AM, Jim Bromer <[EMAIL PROTECTED]> wrote: > There is another reason why embodied agi is useful. That is because > the challenge will provide some discipline for the programmer who > might otherwise never confront the structural problems that I believe > are fundamen

Re: [agi] The Necessity of Embodiment

2008-08-13 Thread Jim Bromer
On Wed, Aug 13, 2008 at 3:16 PM, Jonathan El-Bizri <[EMAIL PROTECTED]> wrote: > On Wed, Aug 13, 2008 at 5:04 AM, Jim Bromer <[EMAIL PROTECTED]> wrote: >> There is another reason why embodied agi is useful. That is because >> the challenge will provide some discipline for the programmer who >> mi

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Mike Tintner
Jim: I know that there are no solid reasons to believe that some kind of embodiment is absolutely necessary for the advancement of agi. I want to concentrate on one dimension of this: precisely the "solid" dimension. My guess would be that this is a dimension of AGI that has been barely thought

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Bob Mottram
2008/8/14 Mike Tintner <[EMAIL PROTECTED]>: > What it comes down to is: what can you learn about any object[s] from flat > drawings of them? Cardboard cutouts? This is essentially the same problem as in computer vision. The objects that you're looking at are three dimensional, but a camera image

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Jim Bromer
This is also a problem in animal vision. Each eye is 2-D. (That is not entirely true, but from a practical point of view it is true.) As far as flat land or Hollywood land, we only live on the earth, so that means that you can't understand anything about space right? Well, your ideas about the u

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Mike Tintner
Jim: This is also a problem in animal vision. Each eye is 2-D. (That is not entirely true, but from a practical point of view it is true.) As far as flat land or Hollywood land, we only live on the earth, so that means that you can't understand anything about space right? Logic running wild, Jim

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Ben Goertzel
On Thu, Aug 14, 2008 at 6:59 AM, Mike Tintner <[EMAIL PROTECTED]>wrote: > Jim:I know that > there are no solid reasons to believe that some kind of embodiment is > absolutely necessary for the advancement of agi. > > I want to concentrate on one dimension of this: precisely the "solid" > dimension

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Vladimir Nesov
Having information about all the details of 3D scenes leaves the agent about as limited as having only 2D camera snapshots, or verbal descriptions, if it is not able to extract a language of causal models from this information. Static description of a scene, however precise, is no use if you can no

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Mike Tintner
Ben: as discussed already ad nauseum, I do not think that robust perception/action is necessarily the best place to start in making an AGI. However, our current work on embodying Novamente and OpenCog does involve 3D virtual worlds ... and, of course, my planned work with Xiamen University using

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Ben Goertzel
Well I am definitely a philosopher-scientist and not a PR guy ;-) Perhaps the confusion is just that I don't think there is any one exclusively correct approach. I think 3D robotic or virtual embodiment are very convenient approaches so I am following these paths w/ OpenCog and Novamente (with an

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Bob Mottram
2008/8/14 Mike Tintner <[EMAIL PROTECTED]>: > But - correct me - when you engineer the 3D shape, you are merely applying > previous, existing knowledge about other objects to do so - which is a useful > but narrow AI function. You are not actually discovering anything new about > this particular obj

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Jim Bromer
Of course I have considered these issues before. On Thu, Aug 14, 2008 at 8:41 AM, Mike Tintner <[EMAIL PROTECTED]> wrote: > Jim:This is also a problem in animal vision. Each eye is 2-D. (That is > not entirely true, but from a practical point of view it is true.) > As far as flat land or hollyw

Re: [agi] The Necessity of Embodiment

2008-08-14 Thread Jim Bromer
> Sorry if my phrasing or tone were "off", I probably shoulda got more sleep > last night! > > I am not at all frustrated by discussions of the specific role that 3D > perception and visualization plays in human or humanlike cognition. Very > interesting, worthwhile topic! > > I get frustrated by

Re: [agi] The Necessity of Embodiment

2008-08-21 Thread Valentina Poletti
Sorry if I'm commenting a little late to this: just read the thread. Here is a question. I assume we all agree that intelligence can be defined as the ability to achieve goals. My question concerns the establishment of those goals. As human beings we move in a world of limitations (life span, ethic

Re: [agi] The Necessity of Embodiment

2008-08-21 Thread Vladimir Nesov
On Thu, Aug 21, 2008 at 5:33 PM, Valentina Poletti <[EMAIL PROTECTED]> wrote: > Sorry if I'm commenting a little late to this: just read the thread. Here is > a question. I assume we all agree that intelligence can be defined as the > ability to achieve goals. My question concerns the establishment

Re: [agi] The Necessity of Embodiment

2008-08-22 Thread Valentina Poletti
Thanks Vlad, I read all that stuff plus other Eliezer papers. They don't answer my question: I am asking what is the use of a non-embodied AGI, given it would necessarily have a different goal system from that of humans, I'm not asking how to make any AGI friendly - that is extremely difficult. O

Re: [agi] The Necessity of Embodiment

2008-08-22 Thread Valentina Poletti
Jim, I was wondering why no-one had brought up the information-theoretic aspect of this yet. Are you familiar at all with the mathematics behind such a description of AGI? I think it is key so I'm glad someone else is studying that as well. --- agi Archi

Re: [agi] The Necessity of Embodiment

2008-08-22 Thread Vladimir Nesov
On Fri, Aug 22, 2008 at 3:23 PM, Valentina Poletti <[EMAIL PROTECTED]> wrote: > Thanks Vlad, I read all that stuff plus other Eliezer papers. They don't > answer my question: I am asking what is the use of a non-embodied AGI, given > it would necessarily have a different goal system from that of hu

Re: [agi] The Necessity of Embodiment

2008-08-22 Thread Terren Suydam
She's not asking about the kind of embodiment, she's asking what's the use of a non-embodied AGI. Your quotation, dealing as it does with low-level input, is about embodied AGI. --- On Fri, 8/22/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: > > Thanks Vlad, I read all that stuff plus other Elie

Re: [agi] The Necessity of Embodiment

2008-08-22 Thread Vladimir Nesov
On Fri, Aug 22, 2008 at 5:35 PM, Terren Suydam <[EMAIL PROTECTED]> wrote: > > She's not asking about the kind of embodiment, she's asking what's the use of > a non-embodied AGI. Your quotation, dealing as it does with low-level input, > is about embodied AGI. > I believe "non-embodied" meant to

RE: [agi] The Necessity of Embodiment

2008-08-22 Thread Derek Zahn
By "embodied" I think people usually mean a dense sensory connection (with a feedback loop) to the physical world. The feedback could be as simple as aiming a camera. However, it seems to me that an AI program connected to YouTube could maybe have a dense enough link to the "real" world to cha
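Derek's camera-aiming example is the essential loop: the agent's action changes what it senses next. A minimal sketch of that feedback loop (my illustration; the world, names, and greedy policy are all made up):

```python
# Sense-act feedback loop: a camera pans along a 1-D strip of brightness
# values, and aiming it (the action) determines the next sensory input.
# Purely illustrative of Derek's point, not an architecture from the thread.

class PanCamera:
    def __init__(self, scene, position=0):
        self.scene = scene          # brightness value at each position
        self.position = position

    def sense(self):
        return self.scene[self.position]

    def act(self, delta):
        # The feedback: acting changes what sense() returns next time.
        self.position = max(0, min(len(self.scene) - 1, self.position + delta))

def seek_brightest(cam, steps=10):
    """Greedy hill-climbing: sense, pan right, step back if it got darker."""
    for _ in range(steps):
        here = cam.sense()
        cam.act(+1)
        if cam.sense() < here:
            cam.act(-1)
            break
    return cam.position

final = seek_brightest(PanCamera([1, 3, 5, 4, 2]))
```

On this reading, an unembodied system is one where the loop is cut: its inputs arrive regardless of anything it does.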

Re: [agi] The Necessity of Embodiment

2008-08-22 Thread Vladimir Nesov
On Fri, Aug 22, 2008 at 5:49 PM, Vladimir Nesov <[EMAIL PROTECTED]> wrote: > On Fri, Aug 22, 2008 at 5:35 PM, Terren Suydam <[EMAIL PROTECTED]> wrote: >> >> She's not asking about the kind of embodiment, she's asking what's the use >> of a non-embodied AGI. Your quotation, dealing as it does with

Re: [agi] The Necessity of Embodiment

2008-08-22 Thread Valentina Poletti
Ben, Being one of those big-headed children myself.. I have just a peculiar comment. You probably know this, but human intelligence is not limited to the size of the human skull. That is why communication and social skills are such important keys to intelligence. An individual by himself can do ver

Re: [agi] The Necessity of Embodiment

2008-08-22 Thread Jim Bromer
On Fri, Aug 22, 2008 at 7:30 AM, Valentina Poletti <[EMAIL PROTECTED]> wrote: > Jim, > I was wondering why no-one had brought up the information-theoretic aspect > of this yet. Are you familiar at all with the mathematics behind such a > description of AGI? I think it is key so I'm glad someone el

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Terren Suydam
Yeah, that's where the misunderstanding is... "low level input" is too fuzzy a concept. I don't know if this is the accepted mainstream definition of embodiment, but this is how I see it. The thing that distinguishes an embodied agent from an unembodied one is whether the agent is given pre-st

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Terren Suydam
Just wanted to add something, to bring it back to feasibility of embodied/unembodied approaches. Using the definition of embodiment I described, it needs to be said that it is impossible to specify the goals of the agent, because in so doing, you'd be passing it information in an unembodied way

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Vladimir Nesov
On Sat, Aug 23, 2008 at 11:38 PM, Terren Suydam <[EMAIL PROTECTED]> wrote: > > Just wanted to add something, to bring it back to feasibility of > embodied/unembodied approaches. Using the definition of embodiment > I described, it needs to be said that it is impossible to specify the goals > of the

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Mike Tintner
Terren:> Just wanted to add something, to bring it back to feasibility of embodied/unembodied approaches. Using the definition of embodiment I described, it needs to be said that it is impossible to specify the goals of the agent, because in so doing, you'd be passing it information in an unemb

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Terren Suydam
comments below... --- On Sat, 8/23/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: > The last post by Eliezer provides handy imagery for this > point ( > http://www.overcomingbias.com/2008/08/mirrors-and-pai.html > ). You > can't have an AI of perfect emptiness, without any > goals at all, > becaus

Re: [agi] The Necessity of Embodiment

2008-08-23 Thread Eric Burton
I kind of feel this way too. It should be easy to get neural nets embedded in VR to achieve the intelligence of say magpies, or finches. But the same approaches you might use, top-down ones, may not scale to human level. Given a 100x increase in workstation capacity I don't see why we can't start

Re: [agi] The Necessity of Embodiment

2008-08-24 Thread Vladimir Nesov
On Sun, Aug 24, 2008 at 7:28 AM, Terren Suydam <[EMAIL PROTECTED]> wrote: > > --- On Sat, 8/23/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: > >> But you >> can have an AI that has a bootstrapping mechanism that >> tells it where >> to look for goal content, tells it to absorb it and embrace >> it.

Re: [agi] The Necessity of Embodiment

2008-08-24 Thread Terren Suydam
--- On Sun, 8/24/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: > What do you mean by "does not structure"? What do > you mean by fully or > not fully embodied? I've already discussed what I mean by embodiment in a previous post, the one that immediately preceded the post you initially responded

Re: [agi] The Necessity of Embodiment

2008-08-24 Thread Vladimir Nesov
On Sun, Aug 24, 2008 at 5:51 PM, Terren Suydam <[EMAIL PROTECTED]> wrote: > >> Did you read CFAI? At least it dispels the mystique and >> ridicule of >> "provable" Friendliness and shows what kind of >> things are relevant for >> its implementation. You don't really want to fill the >> universe wit

Re: [agi] The Necessity of Embodiment

2008-08-24 Thread wannabe
Valentina wrote: >Sorry if I'm commenting a little late to this: just read the thread. Here >is a question. I assume we all agree that intelligence can be defined as >the ability to achieve goals. My question concerns the establishment of >those goals. As human beings we move in a world of limitati

Re: [agi] The Necessity of Embodiment

2008-08-25 Thread Valentina Poletti
In other words, Vladimir, you are suggesting that an AGI must be at some level controlled by humans, therefore not 'fully-embodied', in order to prevent non-friendly AGI as the outcome. Therefore humans must somehow be able to control its goals, correct? Now, what if controlling those goals woul

Re: [agi] The Necessity of Embodiment

2008-08-25 Thread Vladimir Nesov
On Mon, Aug 25, 2008 at 1:07 PM, Valentina Poletti <[EMAIL PROTECTED]> wrote: > In other words, Vladimir, you are suggesting that an AGI must be at some > level controlled by humans, therefore not 'fully-embodied' in order to > prevent non-friendly AGI as the outcome. Controlled in Friendliness

Re: [agi] The Necessity of Embodiment

2008-08-25 Thread Valentina Poletti
On 8/25/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: > > On Mon, Aug 25, 2008 at 1:07 PM, Valentina Poletti <[EMAIL PROTECTED]> > wrote: > > In other words, Vladimir, you are suggesting that an AGI must be at some > > level controlled from humans, therefore not 'fully-embodied' in order to > > pre

Re: [agi] The Necessity of Embodiment

2008-08-25 Thread Vladimir Nesov
On Mon, Aug 25, 2008 at 6:23 PM, Valentina Poletti <[EMAIL PROTECTED]> wrote: > > On 8/25/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: >> >> Why would anyone suggest creating a disaster, as you pose the question? >> > > Also agree. As far as you know, has anyone, including Eliezer, suggested any >

Re: [agi] The Necessity of Embodiment

2008-08-25 Thread Terren Suydam
Hi Vlad, Thanks for taking the time to read my article and pose excellent questions. My attempts at answers below. --- On Sun, 8/24/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: > On Sun, Aug 24, 2008 at 5:51 PM, Terren Suydam > What is the point of building general intelligence if all > it doe

Re: [agi] The Necessity of Embodiment

2008-08-25 Thread William Pearson
2008/8/25 Terren Suydam <[EMAIL PROTECTED]>: > > --- On Sun, 8/24/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: >> On Sun, Aug 24, 2008 at 5:51 PM, Terren Suydam >> wrong. This ability might be an end in itself, the whole >> point of >> building an AI, when considered as applying to the dynamics >>

Re: [agi] The Necessity of Embodiment

2008-08-25 Thread Terren Suydam
Hi Will, I don't doubt that provable-friendliness is possible within limited, well-defined domains that can be explicitly defined and hard-coded. I know chess programs will never try to kill me. I don't believe however that you can prove friendliness within a framework that has the robustness

Re: [agi] The Necessity of Embodiment

2008-08-25 Thread Eric Burton
Is friendliness really so context-dependent? Do you have to be human to act friendly, to the exclusion of acting busy, greedy, angry, etc.? I think friendliness is a trait we project onto things pretty readily, implying it's wired at some fundamental level. It comes from the social circuits, it's abou

Re: [agi] The Necessity of Embodiment

2008-08-25 Thread Terren Suydam
Eric, We're talking Friendliness (capital F), a convention suggested by Eliezer Yudkowsky, that signifies the sense in which an AI does no harm to humans. Yes, it's context dependent. "Do no harm" is the mantra within the medical community, but clearly there are circumstances in which you do a

Re: [agi] The Necessity of Embodiment

2008-08-26 Thread Vladimir Nesov
On Tue, Aug 26, 2008 at 7:53 AM, Terren Suydam <[EMAIL PROTECTED]> wrote: > > Or take any number of ethical dilemmas, in which it's ok to steal food if it's > to feed your kids. Or killing ten people to save twenty. etc. How do you > define > Friendliness in these circumstances? Depends on the con

Re: [agi] The Necessity of Embodiment

2008-08-26 Thread Terren Suydam
Are you saying Friendliness is not context-dependent? I guess I'm struggling to understand what a "conceptual dynamics" would mean that isn't dependent on context. The AGI has to act, and at the end of the day, its actions are our only true measure of its Friendliness. So I'm not sure what it

Re: [agi] The Necessity of Embodiment

2008-08-26 Thread Vladimir Nesov
On Mon, Aug 25, 2008 at 11:09 PM, Terren Suydam <[EMAIL PROTECTED]> wrote: > > --- On Sun, 8/24/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: >> On Sun, Aug 24, 2008 at 5:51 PM, Terren Suydam >> What is the point of building general intelligence if all >> it does is >> takes the future from us and

Re: [agi] The Necessity of Embodiment

2008-08-26 Thread Vladimir Nesov
On Tue, Aug 26, 2008 at 8:05 PM, Terren Suydam <[EMAIL PROTECTED]> wrote: > > Are you saying Friendliness is not context-dependent? I guess I'm > struggling to understand what a "conceptual dynamics" would mean > that isn't dependent on context. The AGI has to act, and at the end of the > day, its

Re: [agi] The Necessity of Embodiment

2008-08-26 Thread Terren Suydam
If Friendliness is an algorithm, it ought to be a simple matter to express what the goal of the algorithm is. How would you define Friendliness, Vlad? --- On Tue, 8/26/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: > It is expressed in individual decisions, but it isn't > these decisions > themse

Re: [agi] The Necessity of Embodiment

2008-08-26 Thread Vladimir Nesov
On Tue, Aug 26, 2008 at 8:54 PM, Terren Suydam <[EMAIL PROTECTED]> wrote: > > If Friendliness is an algorithm, it ought to be a simple matter to express > what the goal of the algorithm is. How would you define Friendliness, Vlad? > The algorithm doesn't need to be simple. The actual Friendly AI that

Re: [agi] The Necessity of Embodiment

2008-08-26 Thread Terren Suydam
Friendliness is not objective. Therefore, it cannot be expressed formally. It can only be approximated, with error. --- On Tue, 8/26/08, Vladimir Nesov <[EMAIL PROTECTED]> wrote: > From: Vladimir Nesov <[EMAIL PROTECTED]> > Subject: Re: [agi] The Necessity of Embodiment > To: agi@v2.list

Re: [agi] The Necessity of Embodiment

2008-08-26 Thread Vladimir Nesov
On Tue, Aug 26, 2008 at 9:54 PM, Terren Suydam <[EMAIL PROTECTED]> wrote: > > I didn't say the algorithm needs to be simple, I said the goal of > the algorithm ought to be simple. What are you trying to compute? > > Your answer is, "what is the right thing to do?" > > The obvious next question is,

Re: [agi] The Necessity of Embodiment

2008-08-26 Thread Valentina Poletti
Vlad, Terren and all, reading your interesting discussion, this saying popped into my mind... admittedly it has little to do with AGI, but you might get the point anyhow: An old lady used to walk down a street every day, and in a tree by that street a bird sang beautifully; the sound made her happy

Re: [agi] The Necessity of Embodiment

2008-08-26 Thread Terren Suydam
It doesn't matter what I do with the question. It only matters what an AGI does with it. I'm challenging you to demonstrate how Friendliness could possibly be specified in the formal manner that is required to *guarantee* that an AI whose goals derive from that specification would actually "d
