On Thu, Nov 27, 2014 at 08:03:27AM +0800, Ben Goertzel via AGI wrote:
> It's clear that prediction is one of the important capabilities that
> an AGI needs to have
Prediction implies a level of certainty which I don't believe is plausible to expect in the real world. It's utterly impossible to predict, with much precision, the majority of what goes on in the world. I don't sit around thinking about what might happen; that's idle daydreaming, and it interferes with work.

> What's less clear is whether it's productive to view predictive as
> **the core** cognitive functionality of human-level intelligence, as
> Jeff Hawkins and others have suggested

Yes, that seems a deeply flawed view in my opinion. Prediction with precision is impossible due to chaotic dynamics. Sure, there may be some patterns that help us get by, but often enough those patterns are broken; real life doesn't conform to strict ones. It's in dealing with what's broken, what needs fixing, and what is not as expected that much intelligence is required.

> -- Ben G
>
> On Thu, Nov 27, 2014 at 7:37 AM, Matt Mahoney via AGI <[email protected]>
> wrote:
> > On Wed, Nov 26, 2014 at 6:10 AM, Logan Streondj via AGI <[email protected]>
> > wrote:
> >> prediction and AGI have almost nothing in common.
> >> I don't know why people here are so stuck up on it.
> >
> > You are able to catch a ball because you can predict how it will move.
> > You are able to get to work on time because you predict that when you
> > set your alarm clock that it will wake you up in the morning and you
> > can predict how long it will take to get there after you wake up.

I prefer to call that estimation; I certainly would never call it prediction. Estimating the trajectory of a ball; estimating the amount of time it takes to get to work and how much time it takes to get ready, then subtracting those from the arrival time to get the alarm time. I don't predict my alarm will work; at most I hope that it does. It might not, who knows: there could be a power outage, or it could run out of batteries. I'm happy I don't have to rely on an alarm.
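That working-backwards arithmetic can be sketched in a few lines; the times and durations below are hypothetical estimates chosen for illustration, not anything from the thread:

```python
from datetime import datetime, timedelta

# Hypothetical estimates -- rough guesses, not predictions:
arrival_time = datetime(2014, 11, 27, 9, 0)  # want to be at work by 9:00
commute_estimate = timedelta(minutes=40)     # estimated travel time
prep_estimate = timedelta(minutes=50)        # estimated time to get ready

# Work backwards: subtract the estimated durations from the arrival time.
alarm_time = arrival_time - commute_estimate - prep_estimate

print(alarm_time.strftime("%H:%M"))  # 07:30
```

Nothing in the computation asserts the commute *will* take 40 minutes; the estimates are inputs one revises when reality disagrees.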
Though I know alarms can be "trusty" things, and you could have a certain level of trust that one will function as it should, and could even plan for it to do so, I wouldn't say predict.

> > You
> > are able to understand my words because you can predict a large
> > fraction of them and only need to remember the differences.

That just sounds like a nonsensical statement to me. The fact is I can't predict what you were going to say at all, nor how you would put your words together. I can understand them because they conform to grammatical structures and use words which I have experienced in a variety of contexts; reading them brings up those contexts, and the structures orient them, forming the new concepts which you convey.

> > Just
> > because you aren't consciously aware of your own thought processes
> > doesn't mean you don't think.
> >
> > -- Matt Mahoney, [email protected]

I do a fair amount of meditation and meta-thought analysis, and I must admit that I predict things very rarely, if at all. Those predictions usually only come by way of my "spirit-guide" or "higher-self", and even those are at times incorrect; no one is immune to change.

Most of the time I think about the future, it is planning. Planning is a far cry from predicting: many of my plans fall through, and I have to revise them, because reality changes. Planning is not some kind of mystical experience; there is lots of information available on it, and one can make lists, programs, charts, and calendars. Of course, one then has to act on those plans and adapt to inevitable changes. Like in evolution: "It's not the strongest who survive, nor the most intelligent, but the ones most adaptable to change" (often attributed to Charles Darwin). AGIs must adapt to unexpected environmental changes, literally ones which are impossible to predict.

A lot of lower animals live in a reactionary manner, simply acting on internal feelings and impulses: adaptive state machines of sorts.
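That reactive mode of operation can be sketched as a plain lookup table with no prediction involved; the states and stimuli below are made-up illustrations, not a model of any real animal:

```python
# A minimal sketch of a reactive "adaptive state machine": it holds no
# predictions, just a mapping from (state, stimulus) pairs to next states.
transitions = {
    ("idle", "hungry"): "seek_food",
    ("idle", "threat"): "flee",
    ("seek_food", "food_found"): "eat",
    ("seek_food", "threat"): "flee",
    ("eat", "threat"): "flee",
}

def react(state, stimulus):
    # An unexpected stimulus leaves the state unchanged: the machine copes
    # with the unpredictable by carrying on, not by forecasting it.
    return transitions.get((state, stimulus), state)

state = "idle"
for stimulus in ["hungry", "food_found", "threat"]:
    state = react(state, stimulus)

print(state)  # flee
```

The point of the fallback in `react` is exactly the adaptability argued for above: when the environment does something the table never anticipated, the machine still behaves sensibly, without ever having predicted the event.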
Higher animals have an element of planning, but the foundation is the same. For instance, when my wife says "his diaper is wet", I don't predict anything. She's assuming I'm functioning as a state machine, that I know the ideal state is a dry diaper, and thus that I must do something to get to that ideal state. The only reason she can assume this is that this conversation has happened before, so the desired result is already preprogrammed in me. The first time, I did not predict what needed to happen; I just said "okay" and continued what I was doing. When I idled too long, she said "so get him a dry one, and put it on him". Thus she programmed me, with the beauty of language, grammar, and vocabulary.

--
Logan Streondj

-------------------------------------------
AGI Archives: https://www.listbox.com/member/archive/303/=now
