On 1/24/07, YKY (Yan King Yin) <[EMAIL PROTECTED]> wrote:

> Suppose I have a set of *deductive* facts/rules in FOPL.  You can actually
> use this data in your AGI to support other forms of inference, such as
> induction and abduction.  In this sense the facts/rules collection does not
> dictate the form of the inference engine we use.

No, you cannot do that without twisting some definitions. You are
right that many people now define "induction" and "abduction" in the
language of FOPL, but what they actually do is omit important
aspects of the process, such as uncertainty. To me that is cheating. I
addressed this issue in
http://nars.wang.googlepages.com/wang.syllogism.ps . In
http://www.springer.com/west/home/computer/artificial?SGWID=4-147-22-173659733-0
I explained in detail (especially in Ch. 9 and 10) why the language of
FOPL is improper for AI.

> I am still reading your book, and I have found numerous good ideas in it.
> I know that you treat deduction, induction, and abduction in a unified way.
> That is a very elegant theory, but it may have problems.  For example, if:
>
> (1) I read a lot of books
> (2) I hate my mom
>
> your system may infer by induction that "reading a lot of books -> hating
> one's mom".  In some instances doing this is meaningful, but in general your
> system may be flooded with many of these speculative statements, taking
> time away from the day-to-day deductive operations.

Yes, a conclusion like that will be derived, if the system has nothing
better to do. Such a conclusion looks stupid only after the system
knows much more about the related concepts.

The system won't be flooded with speculative statements, as long as it
doesn't try to derive every possible conclusion, or treat every
conclusion as equally important.

> I tend to think of induction as something less essential than deduction.

In general, I agree --- that is why inductive/abductive conclusions
carry less confidence than deductive ones in NARS. However, for a
concrete problem, the crucial step may initially be provided by a
hypothesis with low confidence.
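To make the confidence difference concrete, here is a minimal sketch of
deduction and induction truth functions in the NARS style, as I read them
from the published definitions (the evidential-horizon parameter k and the
exact formulas are my rendering, not code from NARS itself):

```python
# Sketch of NARS-style truth functions. Each statement carries a truth
# value <f, c>: frequency f and confidence c, both in [0, 1].
# K is the evidential horizon parameter (conventionally 1).

K = 1.0

def deduction(f1, c1, f2, c2):
    # From "M -> P <f1, c1>" and "S -> M <f2, c2>", derive "S -> P".
    # Confidence stays high when both premises are strong.
    f = f1 * f2
    c = f1 * f2 * c1 * c2
    return f, c

def induction(f1, c1, f2, c2):
    # From "M -> P <f1, c1>" and "M -> S <f2, c2>", derive "S -> P".
    # The evidence weight w = f2*c1*c2 is mapped to confidence by
    # w / (w + K), which stays well below 1 for a single premise pair.
    w = f2 * c1 * c2
    f = f1
    c = w / (w + K)
    return f, c

# With identical strong premises <1.0, 0.9>, deduction keeps a high
# confidence (~0.81), while induction produces only a low-confidence
# hypothesis (~0.45) -- speculative, but available as a crucial first step.
print("deduction:", deduction(1.0, 0.9, 1.0, 0.9))
print("induction:", induction(1.0, 0.9, 1.0, 0.9))
```

This illustrates why an inductive conclusion such as "reading a lot of
books -> hating one's mom" would enter the system only as a weak
hypothesis, rather than on equal footing with deductive results.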

> That's why my top priority is to build an inference engine for deduction.
> Inductive learning will be added later in the form of data mining, which is
> very computation-intensive.

I'm afraid that is not going to work --- many people have tried to
extend FOPL to cover a wider range, and have run into all kinds of
problems. Restarting from scratch is actually easier than maintaining
consistency among many ad hoc patches and hacks.

To me, one of the biggest mistakes of mainstream AI is to treat
"learning" as independent of "working", something that can be added in
later. Treating learning as an add-on versus putting it into the
foundation produces very different systems. In NARS, "learning" and
"reasoning", as well as some other "cognitive facilities", are
different aspects of the same underlying process, and cannot be
handled separately.

Pei


YKY
 This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303
