I still believe that a natural language calculator is possible, although I consider the idea a little out there. Obviously it would not be as simple as a mathematical calculator: it would not operate by numerical calculation but on principles of categorical substitution and categorical analogy. This is more closely related to Chomsky's grammatical hierarchy than what I intend to use with my current program next year, but it would not be constrained by predetermined categories and so on. It would therefore be able to deal with novel situations, but it would have to learn how to deal with them, and obviously it is not going to be an oracle of imagined perfection. It is difficult for me to think of examples beforehand, but once I get my program working on AI situations I am sure I will have a lot more to work with.
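To make the idea a bit more concrete, here is a minimal toy sketch of what categorical substitution might look like: a sentence is generalized by replacing words with learned categories, and a novel sentence is handled by analogy when it generalizes to the same pattern as a previously seen one. All names here (CATEGORIES, generalize, matches) are hypothetical illustrations, not part of any existing program, and a real system would learn the category mappings rather than have them predetermined.

```python
# Toy sketch of categorical substitution for a natural language calculator.
# The lexicon below is a stand-in for categories a system would have to learn.
CATEGORIES = {
    "two": "NUM", "three": "NUM", "four": "NUM", "five": "NUM",
    "plus": "OP", "times": "OP",
}

def generalize(sentence):
    """Replace each known word with its category, keeping unknown words as-is."""
    return [CATEGORIES.get(word, word) for word in sentence.split()]

def matches(novel, learned):
    """A novel sentence is handled by analogy if it generalizes to the
    same category pattern as a previously learned sentence."""
    return generalize(novel) == generalize(learned)

learned = "two plus three"
print(matches("four times five", learned))  # same NUM OP NUM pattern -> True
print(matches("four times hat", learned))   # "hat" is uncategorized -> False
```

The point of the sketch is only the mechanism: the system does not need to have seen "four times five" before, it only needs the categories, which is what would let it deal with novel input without predetermined sentence templates.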
On Wed, Nov 27, 2013 at 7:36 PM, Jim Bromer <[email protected]> wrote:

> The problem of syntactic ambiguity is AI-complete if you treat ambiguity
> as a Boolean problem in which the only possible solutions are precise. The
> thing is that many solutions to problems will often work well but fail in
> other cases. This is typical of AI methods, but many of the failures of
> AGI occur at very fundamental stages, so that we do not see examples of AGI
> achieving much genuine learning. The idea that AGI has to be an oracle
> capable of all human potential is nonsense. Why should AGI be limited only
> to the human potential for getting the right answers, since making
> mistakes is an essential part of the possession of insight? But AI / AGI
> has to have the ability (the potential) to learn from mistakes. The other
> thing is that AGI has to be able to learn to deal with novelty, not that
> it has to know how to deal perfectly with novel situations every time
> something novel occurs.
>
>
> On Tue, Nov 26, 2013 at 4:01 PM, Steve Richfield <[email protected]> wrote:
>
>> Hi all,
>>
>> As you doubtless recall, I submitted a patent a few months ago on a
>> faster parsing method. Being of Medicare age, the USPTO is QUICKLY
>> processing my application, and has already OK'd 4 of my claims with minor
>> corrections. In considering other claims, they countered with a 1992
>> article by Hobbs, et al., that was off point. However, the Hobbs article
>> contained some interesting statements that I thought might stimulate
>> discussion here:
>>
>> *The problem of syntactic ambiguity is AI-complete. That is, we will not
>> have systems that reliably parse English sentences correctly until we
>> have encoded much of the real-world knowledge that people bring to bear
>> in their language comprehension.*
>>
>> Hobbs then goes on to use this as justification for constructing his ad
>> hoc parsing method, consisting of several passes, complete with a variety
>> of recognized weaknesses that he accepts as unavoidable given the
>> difficulty of the problem, which also seems to be the general direction
>> of people here.
>>
>> Another quote from Hobbs:
>>
>> *Q: What is the difference between computer science and artificial
>> intelligence?*
>> *A: In computer science you write programs to do quickly what people do
>> slowly. In artificial intelligence, it is just the opposite.*
>>
>> Hobbs then goes on to argue that his approach is computer science rather
>> than AI.
>>
>> Hobbs wrote several articles that were largely copied one to the next,
>> so you can search Google for the above quotes to find them.
>>
>> Any thoughts?
>>
>> Steve
>>
>> *AGI* | Archives <https://www.listbox.com/member/archive/303/=now>
>> <https://www.listbox.com/member/archive/rss/303/24379807-f5817f28> |
>> Modify <https://www.listbox.com/member/?&> Your Subscription
>> <http://www.listbox.com>
>
> --
> Jim Bromer

--
Jim Bromer
