Hi Richard,

> I don't really want to get too sidetracked, but even if Immerman's
> analysis were correct, would this make a difference to the way that Eric
> was using NP-Hard, though?

No, Immerman's perspective on complexity classes doesn't really affect
your objections...

Firstly, the descriptive-complexity characterization of complexity
classes is **still** about what happens as n gets large: the logics in
question define problem classes over finite structures of every size,
so the equivalences are statements about scaling behavior, not about
any particular bounded instance.

So it doesn't affect one of the key objections that both you and Pei
have to using concepts from computational complexity theory to analyze
AGI: namely, that AGI systems don't have to deal with general classes
of problems as problem size tends to infinity; they have to deal with
**particular** problems of bounded size.  For instance, if an AGI is
good at learning **human** language, it may not matter how its
language-learning capability scales across languages in the same
grammar category as human language but with grammars of unbounded
size.  If an AGI is good at solving path-finding problems in real
life, it may not matter how its worst-case path-finding capability
scales for paths between n cities as n tends to infinity...  Etc.  In
fact there are decent qualitative arguments that most of the
algorithms used by human cognition (insofar as it makes sense to say
that human cognition uses "algorithms" at all, which is another issue,
as Pei has noted) are **exponential time** in their scaling as problem
size approaches infinity ... but the point is that they are tuned to
give tractable performance on the problem instances humans actually
encounter in real life.
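To make that concrete, here is a minimal sketch in Python -- purely
illustrative, not anyone's proposed cognitive algorithm; the function
name and the framing are my own.  Exact path-finding over n cities by
brute force takes Theta(n!) time, hopeless as n tends to infinity, yet
instantaneous at the bounded sizes an embodied agent actually meets:

from itertools import permutations
import math

def shortest_tour(dist):
    """Exact shortest round trip over all cities; dist is an n x n matrix.

    Worst case enumerates (n - 1)! tours -- exponential scaling -- but for
    the n <= 9 or so errands a person plans in a day, that is only a few
    thousand candidates, i.e., trivially tractable.
    """
    n = len(dist)
    best = math.inf
    for order in permutations(range(1, n)):  # fix city 0 to kill rotations
        tour = (0,) + order + (0,)
        best = min(best, sum(dist[a][b] for a, b in zip(tour, tour[1:])))
    return best

For n = 8 this enumerates 7! = 5040 tours; the factorial asymptotics
only start to bite at sizes the agent rarely or never encounters.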

Secondly, Immerman's analysis doesn't affect the fact that the
formalization of "language learning" referred to by Eric Baum is only
tenuously related to the actual cognitive phenomenon of human language
learning.

On the other hand, Immerman's analysis does **suggest** (not
demonstrate) that there could be some cognitive meaningfulness to the
classes P and NP.

For instance, if someone were to show that the learning of languages
in the same general category as human natural languages ("natural-like
languages")...

-- can be naturally represented using existential second-order logic

but

-- cannot be naturally represented using first-order logic with
recursion (i.e., with a least-fixed-point operator)

this would be interesting, and would match up naturally with the
hypothesis that "natural-like language" learning is in NP but not in P.
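For concreteness, the correspondences being invoked here are Fagin's
theorem and the Immerman-Vardi theorem (the latter holding over
ordered finite structures); in the usual notation:

\[ \exists\mathrm{SO} = \mathrm{NP}, \qquad \mathrm{FO}(\mathrm{LFP}) = \mathrm{P} \ \ \text{(on ordered structures)}. \]

As a standard illustration of the first equivalence, graph
3-colorability -- an NP-complete property -- is defined by the
existential second-order sentence

\[ \exists R\, \exists G\, \exists B \;\Big( \forall x\, (Rx \lor Gx \lor Bx) \ \land\ \forall x\, \forall y\, \big( E(x,y) \rightarrow \lnot ((Rx \land Ry) \lor (Gx \land Gy) \lor (Bx \land By)) \big) \Big) \]

where R, G, B are the "guessed" color classes.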

On the other hand, this kind of analysis would only be really
cognitively meaningful in the context of an explanation of how this
formalization of language learning is related to actual cognitive
language learning....  I happen to think that such an explanation
**could** be formulated; but no one has really done so, so far.  That
is, no one has given a formalization encompassing the embodied, social
semantics and pragmatics of language learning (as discussed e.g. in
Tomasello's excellent recent book "Constructing a Language"); and in
the absence of such a formalization, formal discussions of "grammar
learning" are not convincingly connected to real cognitive language
learning.

-- Ben
