From: "Mark Waser" <[EMAIL PROTECTED]> Subject: Re: [agi] Rationalism and Scientific Rationalism: Was Logical Satisfiability...Get used to it.
It looks as if you're saying that scientific rationalism must be grounded but that rationalism in general need not be. Is this a correct interpretation?
---------------------------------------------
No, yes, and I'm not sure. I would like to write a message about an artificial rationalism and an artificial empirical rationalism. I am not going to try to describe an AI architecture, but I do want to write in terms that lend themselves to a discussion of how rationalism and empirical rationalism could be designed into an AGI program. However, this is not meant as a definitive statement on the various ways that the words 'rationalism' and 'empiricism', and the concepts behind them, are used. (The phrase 'empirical rationalism' is probably a better term for me to use than 'scientific rationalism'.)

But yes, in general, I feel that scientific rationalism and empirical rationalism have to be more grounded than simple rationalism, especially when we are trying to understand how these concepts can be applied to an advanced AGI program. On the other hand, 'grounding' may be too strong a term. Think of an AGI program that can learn from natural-language text-based IO but does not have any other kind of IO. I would argue that there has to be a distinction between rationalism (which uses some kind of applied logic-based system) and empirical rationalism (which also has some experimental way of grounding ideas and conjectures, along with some kind of conceptual integration). The problem with this example, however, is that the same conceptual functions are used to devise conjectures about the IO data environment as are used to test those conjectures. So there is a real question about the depth of the 'grounding', since the problem is so obviously tricky. It is my belief that while the concept of grounding is important for advanced AGI, it is itself no more solid a premise than the other concepts used in AGI.
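The distinction above can be sketched in toy code. This is only an illustration of the two stances, not an AGI design: all of the class and function names here are hypothetical, and the "grounding" is nothing more than checking a conjecture against an observation stream.

```python
# Toy sketch (hypothetical names throughout): a purely rational agent derives
# conclusions by applying logical rules to premises; an empirically rational
# agent additionally tests conjectures against observed data -- its only
# form of 'grounding'.

class RationalAgent:
    """Derives beliefs purely by applying inference rules to premises."""

    def __init__(self, premises):
        self.beliefs = set(premises)

    def infer(self, rules):
        # Apply modus-ponens-style rules (antecedent, consequent) until
        # no new beliefs are produced (deductive closure).
        changed = True
        while changed:
            changed = False
            for antecedent, consequent in rules:
                if antecedent in self.beliefs and consequent not in self.beliefs:
                    self.beliefs.add(consequent)
                    changed = True
        return self.beliefs


class EmpiricalRationalAgent(RationalAgent):
    """Also tests conjectures against an observation stream.

    Note the circularity Jim points out: the same machinery that frames
    the conjecture is the machinery that judges it against the data.
    """

    def test_conjecture(self, conjecture, observations):
        # A conjecture survives only if no observation contradicts it.
        return all(conjecture(obs) for obs in observations)


# Usage: the rational step accepts whatever follows from the premises;
# the empirical step additionally checks a conjecture against data.
rules = [("it_rains", "ground_wet")]
agent = EmpiricalRationalAgent({"it_rains"})
beliefs = agent.infer(rules)                  # now includes "ground_wet"
all_even = lambda n: n % 2 == 0               # conjecture about observed numbers
survives = agent.test_conjecture(all_even, [2, 4, 6])
```

The point of the sketch is only that the two agents differ by a single method: nothing in `RationalAgent` ever touches data, while `EmpiricalRationalAgent` has one channel through which a conjecture can fail.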
But I do believe that some kind of 'grounding' is absolutely necessary for it.

Jim Bromer
-------------------------------------------
agi Archives: http://www.listbox.com/member/archive/303/=now