On Thu, 08 Feb 2007 10:22:19 -0500, Ben Goertzel <[EMAIL PROTECTED]> wrote:

Well, if the scope of a mind is narrowed enough, then it can be more coherent.

Right, I understand there is a definite trade-off here between scope (or breadth of knowledge) and coherence, due mainly to resource limitations. The best we can hope for is that an AGI might be more coherent than we are, but that is by no means assured.
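Just to make concrete what I mean by probabilistic coherence and why resources bite so quickly, here is a rough Python sketch -- purely my own toy illustration, not anything from a real AGI system; the helper name `coherent`, the event encoding, and the numbers are all made up. It asks whether any probability distribution over possible worlds can honor a set of asserted beliefs (the de Finetti / Dutch-book sense of coherence). The catch is that the number of worlds grows as 2^n in the number of propositions, so checking full coherence over a large knowledge base is hopeless in practice.

# Toy coherence check: is there ANY probability distribution over
# possible worlds that honors every asserted belief?
import itertools
from scipy.optimize import linprog

def coherent(num_props, beliefs):
    """beliefs: list of (event, prob), where event maps a tuple of
    truth values (one per proposition) to True/False."""
    worlds = list(itertools.product([False, True], repeat=num_props))
    n = len(worlds)  # 2**num_props -- this is where resources bite
    # One equality row per asserted probability, plus normalization.
    A_eq = [[1.0 if event(w) else 0.0 for w in worlds]
            for event, _ in beliefs]
    A_eq.append([1.0] * n)
    b_eq = [p for _, p in beliefs] + [1.0]
    res = linprog(c=[0.0] * n, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0.0, 1.0)] * n, method="highs")
    return res.success  # infeasible => the beliefs admit a Dutch book

beliefs = [
    (lambda w: w[0],          0.8),  # P(A)     = 0.8
    (lambda w: w[1],          0.7),  # P(B)     = 0.7
    (lambda w: w[0] and w[1], 0.2),  # P(A & B) = 0.2
]
print(coherent(2, beliefs))  # False: P(A & B) must be >= 0.8 + 0.7 - 1 = 0.5

Even this toy version has to call a linear-programming routine over exponentially many worlds (scipy's linprog is used here only for convenience), which is the trade-off in miniature.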

On a slightly different but closely related subject...

Last night I was out having pizza with some others, pretending to be interested in the conversation while actually thinking about the posts we had exchanged earlier in the day. :) While munching on onions and pepperoni, it occurred to me that the problem of achieving complete or near-complete coherence in AGI is closely related to the epistemological problem of obtaining knowledge, where knowledge is defined as 'justified true belief'. Karl Popper's arguments against the possibility of such knowledge strike me as closely parallel to your arguments against the possibility of complete probabilistic coherence in AGI: every justification itself demands justification, so any such attempt leads to an infinite regress.

So then I wondered how Popper's alternative, non-justificationist epistemology might apply to AGI. Any thoughts on that subject? (I won't presume to educate you about Popper; I recall that you studied Philosophy of Science and so should know all about him.)

-gts