I think the whole idea of a semantic layer is to provide the kind of
mechanism for abstract reasoning that evolution seems to have built
into the human brain. You could argue that those faculties are
acquired during one's lifetime, using only a weighted neural net (the
brain), but it seems reasonable to assume that to some extent they're
genetically coded for. To that extent, they ought to be specifically
coded for in any programmatic reproduction of the brain's abilities.

At some point, hardcoding higher-order functionality becomes a cheat,
but there is a certain amount of architecture a thinking machine
simply won't work without.

On 9/19/08, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> --- On Fri, 9/19/08, Jiri Jelinek <[EMAIL PROTECTED]> wrote:
>
>> Try "What's the color of Dan Brown's black coat?" What's the excuse
>> for a general problem solver to fail in this case? NLP? It
>> then should use a formal language or so. Google uses relatively good
>> search algorithms but decent general problem solving IMO requires
>> very different algorithms/design.
>
> So, what formal language model can solve this problem? First-order logic?
> Uncertain logic (probability and confidence)? Logic augmented with notions
> of specialization, time, cause and effect, etc.?
>
> There seems to be a lot of effort to implement reasoning in knowledge
> representation systems, even though it has little to do with how we actually
> think. We focus on problems like:
>
> All men are mortal. Socrates is a man. Therefore ___?
>
> The assumed solution is to convert it to a formal representation and apply
> the rules of logic:
>
> For all x: man(x) -> mortal(x)
> man(Socrates)
> => mortal(Socrates)
>
> which has 3 steps: convert English to a formal representation (hard AI),
> solve the problem (easy), and convert back to English (hard AI).
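>
> To make those 3 steps concrete, here is a toy Python sketch (everything
> in it is hypothetical, and steps 1 and 3 are faked with two regex
> templates, which is exactly why they are the hard-AI steps):
>
>     import re
>
>     PLURALS = {"man": "men"}              # tiny irregular-plural table
>
>     def plural(noun):
>         return PLURALS.get(noun, noun + "s")
>
>     def parse(sentence):
>         """Step 1 (hard AI, faked here): match two sentence templates."""
>         m = re.match(r"(?:All )?(\w+) are (\w+)\.", sentence)
>         if m:                             # for all x: Y(x) -> Z(x)
>             return ("rule", m.group(1), m.group(2))
>         m = re.match(r"(\w+) is an? (\w+)\.", sentence)
>         if m:                             # Y(x)
>             return ("fact", m.group(1), m.group(2))
>         raise ValueError("no template matches: " + sentence)
>
>     def infer(statements):
>         """Step 2 (easy): one modus ponens pass."""
>         rules = [(y, z) for k, y, z in statements if k == "rule"]
>         facts = [(x, y) for k, x, y in statements if k == "fact"]
>         return [(x, z) for x, y in facts
>                        for y2, z in rules if plural(y) == y2]
>
>     def render(x, z):
>         """Step 3 (hard AI, faked here): back to English."""
>         return "Therefore %s is %s." % (x, z)
>
>     stmts = [parse("All men are mortal."), parse("Socrates is a man.")]
>     for x, z in infer(stmts):
>         print(render(x, z))               # Therefore Socrates is mortal.
>
> All the interesting work hides in parse() and render(); infer() really
> is the easy part.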
>
> Sorry, that is not a solution. Consider how you learned to convert natural
> language to formal logic. You were given lots of examples and induced a
> pattern:
>
> Frogs are green = for all x: frog(x) -> green(x).
> Fish are animals = for all x: fish(x) -> animal(x).
> ...
> Y are Z = for all x: Y(x) -> Z(x).
>
> along with many other patterns. (Of course, this requires learning semantics
> first, so you don't confuse examples like "they are coming").
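>
> As a sketch of that induction (hypothetical Python again): align the
> example sentences word by word, keep the words shared by all of them,
> and turn the words that vary into slots:
>
>     examples = [
>         ("Frogs are green",  "for all x: frog(x) -> green(x)"),
>         ("Fish are animals", "for all x: fish(x) -> animal(x)"),
>     ]
>
>     def induce(pairs):
>         """Shared words become the frame; varying words become slots."""
>         sentences = [english.split() for english, _logic in pairs]
>         frame = [words[0] if len(set(words)) == 1 else "<SLOT>"
>                  for words in zip(*sentences)]
>         return " ".join(frame)
>
>     print(induce(examples))               # -> "<SLOT> are <SLOT>"
>
> The logical forms would be aligned the same way, tying the first slot
> to Y(x) and the second to Z(x).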
>
> But if you can learn these types of patterns, then with no additional
> effort you can learn patterns that directly solve the problem...
>
> Frogs are green. Kermit is a frog. Therefore Kermit is green.
> Fish are animals. A minnow is a fish. Therefore a minnow is an animal.
> ...
> Y are Z. X is a Y. Therefore X is a Z.
> ...
> Men are mortal. Socrates is a man. Therefore Socrates is mortal.
>
> without ever going to a formal representation. People who haven't studied
> logic or its notation can certainly learn to do this type of reasoning.
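>
> Applying such a surface pattern directly might look like this (another
> hypothetical sketch; a real learner would hold many such patterns, this
> hard-codes just the one above):
>
>     import re
>
>     # "Y are Z. X is a Y. Therefore X is a Z." as one surface pattern;
>     # the \1 backreference plays the role of the shared term Y.
>     PATTERN = re.compile(r"(\w+)s are (\w+)\. (\w+) is a \1\.", re.I)
>
>     def conclude(text):
>         m = PATTERN.match(text)
>         if m is None:
>             return None                   # no pattern fires
>         _y, z, x = m.groups()
>         return "Therefore %s is %s." % (x, z)
>
>     print(conclude("Frogs are green. Kermit is a frog."))
>     # Therefore Kermit is green.
>
> (The "(\w+)s" trick only covers regular plurals, so "Men are mortal.
> Socrates is a man." would need its own learned pattern, which rather
> reinforces the point about learning many patterns.)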
>
> So perhaps someone can explain why we need formal knowledge representations
> to reason in AI.
>
> -- Matt Mahoney, [EMAIL PROTECTED]

