The problem of syntactic ambiguity is AI-complete if you treat ambiguity as a Boolean problem in which the only possible solutions are precise. This leads me to the problem with the old belief that weighted methods (including Bayesian and other probabilistic methods) were, in themselves, solutions to AGI complexity. It should be obvious that weighted reasoning is not in itself sufficient to distinguish between imperfect solutions just because it uses partial valuations. Discrete collections are necessary for the discovery of reasons (including reasons based on finding correlated instances), which might be used in the discovery of special cases or other major branches of cases. The argument from equivalence is not really relevant here, because it is the value of the collection itself that carries the potential of a possible solution process. In particular, those sub-categorizations of a collection which have some relevant structural relations (such as analogy, similarity, or shared reasons) can point the way to other good solutions, even if each is imperfect in some other way.
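The contrast between weighted scoring and discrete collections can be sketched in a few lines. This is only an illustrative toy (the sentence, the attachment categories, and the weights are hypothetical, not anything from the thread): picking the highest-weighted parse discards the alternatives, whereas bucketing parses into discrete structural categories keeps every candidate available for later correlation against other instances.

```python
# Toy sketch: weighted choice vs. discrete collection of parse candidates.
# All names and weights below are hypothetical, for illustration only.
from collections import defaultdict

# Two candidate parses of "I saw the man with the telescope",
# distinguished by where the PP "with the telescope" attaches.
candidates = [
    {"attachment": "verb", "gloss": "saw-using-telescope", "weight": 0.55},
    {"attachment": "noun", "gloss": "man-has-telescope", "weight": 0.45},
]

# Weighted reasoning alone: take the highest-scoring parse, discard the rest.
best = max(candidates, key=lambda c: c["weight"])

# Discrete collection: keep every candidate, bucketed by its structural
# category, so later evidence (e.g. other "saw ... with ..." sentences)
# can be correlated against each bucket instead of against a lone winner.
buckets = defaultdict(list)
for c in candidates:
    buckets[c["attachment"]].append(c)

print(best["gloss"])    # the single weighted answer
print(sorted(buckets))  # the discrete categories still available
```

The point of the second structure is that the 0.45 candidate is not lost; if later sentences correlate with the "noun" bucket, that category can still be promoted as a special case.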
On Wed, Nov 27, 2013 at 7:36 PM, Jim Bromer <[email protected]> wrote:

> The problem of syntactic ambiguity is AI-complete if you treat ambiguity
> as a Boolean problem in which the only possible solutions are precise. The
> thing is that many solutions to problems will often work well but fail in
> other cases. This is typical of AI methods, but many of the failures of
> AGI occur at very fundamental stages, so that we do not see examples of AGI
> achieving much genuine learning. The idea that AGI has to be an
> oracle capable of all human potential is nonsense. Why should AGI be
> limited only to the human potential for getting the right answers, since
> making mistakes is an essential part of the possession of insight? But
> AI/AGI has to have the ability (the potential) to learn from mistakes. The
> other thing is that AGI has to be able to learn to deal with novelty, not
> that it has to know how to deal perfectly with novel situations every time
> something novel occurs.
>
> On Tue, Nov 26, 2013 at 4:01 PM, Steve Richfield <[email protected]> wrote:
>
>> Hi all,
>>
>> As you doubtless recall, I submitted a patent a few months ago on a
>> faster parsing method. Being of Medicare age, the USPTO is QUICKLY
>> processing my application, and has already OK'd 4 of my claims with minor
>> corrections. In considering other claims, they countered with a 1992
>> article by Hobbs, et al., that was off point. However, the Hobbs article
>> contained some interesting statements that I thought might stimulate
>> discussion here:
>>
>> *The problem of syntactic ambiguity is AI-complete.
>> That is, we will not have systems that reliably parse English sentences
>> correctly until we have encoded much of the real-world knowledge that
>> people bring to bear in their language comprehension.*
>>
>> Hobbs then goes on to use this as justification for constructing his
>> ad hoc parsing method, consisting of several passes, complete with a
>> variety of recognized weaknesses that he accepts as being unavoidable
>> given the difficulty of the problem, which seems to also be the general
>> direction of people here.
>>
>> Another quote from Hobbs:
>>
>> *Q: What is the difference between computer science and artificial
>> intelligence?*
>> *A: In computer science you write programs to do quickly what people do
>> slowly. In artificial intelligence, it is just the opposite.*
>>
>> Hobbs then goes on to argue that his approach is computer science rather
>> than AI.
>>
>> Hobbs wrote several articles that were largely copied one to the next, so
>> you can search via Google on the above quotes to find them.
>>
>> Any thoughts?
>>
>> Steve
>
> --
> Jim Bromer

--
Jim Bromer

-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
