Similarly, btw, the attractors in a dynamical system capture only a
portion of the dynamics -- like emergent symbolic-dynamics grammars,
they are a layer of abstraction that ignores much of the complexity
present in the individual trajectories...
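
To make that concrete, here is a minimal sketch (my illustration; no
code appears in the thread) of how a symbolic-dynamics coarse-graining
throws away trajectory detail, using the standard logistic map at r = 4
with the usual binary partition at x = 0.5:

# A minimal sketch (illustration only, not from the thread): the logistic
# map x -> r*x*(1-x) at r = 4, symbolized with the standard binary
# partition at x = 0.5. Any *finite* symbol string corresponds to a whole
# interval of initial conditions, so the symbolic level is lossy by
# construction.

def logistic_trajectory(x0, r=4.0, n=20):
    """Iterate the logistic map n times starting from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def symbolize(xs):
    """Coarse-grain a real-valued trajectory into an L/R symbol string."""
    return "".join("L" if x < 0.5 else "R" for x in xs)

# Two nearby initial conditions: the continuous trajectories diverge
# chaotically, but their early symbols agree -- the symbolic abstraction
# cannot tell them apart until the divergence crosses the partition.
print(symbolize(logistic_trajectory(0.3000)))
print(symbolize(logistic_trajectory(0.3001)))

The symbol-level description is exact about the coarse structure of the
dynamics but blind to everything below the partition scale.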

On Fri, Jul 3, 2020 at 6:46 PM Ben Goertzel <b...@goertzel.org> wrote:
> > So you found grammars which adequately summarize a symbolic dynamics for 
> > Cisco networks, and are still happy with the idea that such generalizations 
> > will capture all the important behaviour? You don't think there are some 
> > behaviours of Cisco networks which are only explained at the network level?
> >
> > A pity. I was hoping a new, starker contrast between networks and grammars 
> > would give you new insight. Because Cisco networks are not fundamentally 
> > generated by grammars. We know that. So we would not be surprised to find 
> > aspects which do not obey grammar rules. Or perhaps aspects where they flip 
> > from one grammatical abstraction to another.
>
>
> Of course the grammar rules are only an abstraction over the underlying
> chaotic dynamics... who would ever think otherwise??
>
>
> > But by contrast we don't know this for natural language. So we are 
> > constantly surprised when we find that aspects of natural language grammar 
> > do not obey grammar rules, or flip from one grammatical abstraction to 
> > another.
> >
> > So the parallel offers an insight: natural language structure might also 
> > need to be resolved, fundamentally, at the network level. The network level 
> > might be the best level to model its symbolic dynamics too. Then we could 
> > flip from one grammatical abstraction to another, as necessary.
> >
> > If we gained this insight, it would no longer surprise us that the number 
> > of parameters necessary for an adequate grammar continues to explode. Of 
> > course it will. If natural language structure needs to be fundamentally 
> > modeled as a network, then the network will be the smallest representation 
> > for it. "Grammar" will be a production, which might generate essentially 
> > infinite "parameters". GPTx will continue to just get bigger and bigger and 
> > bigger, or resolve poorly, or both.
>
>
> I feel like you're making a sort of "straw man" argument here
>
> I have never thought nor claimed that formal grammars (of the sort we
> learn in grammar induction experiments, or that appear in linguistics
> textbooks, etc.) summarize the totality of interesting or relevant
> linguistic pattern
>
> Only that they are a highly important, and cognitively and practically
> useful, level of abstraction --- one which is part of (not all of) the
> dynamics of linguistic comprehension and production in human minds
> (and human-like AI minds)
>
> BTW, models like GPT3 have nothing to do with grammar, of course...
> they do model language as a lower-level network... just, IMO, the wrong
> kind of lower-level network (I think we agree that recurrent,
> chaotically-dynamic networks have a major role to play in cognitive
> linguistics, but we seem to disagree on whether formal grammars also
> have a major role; I think they do...)
>
> ben
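
For what it's worth, the grammar-level abstraction at issue can be
caricatured in a few lines -- the following is a toy stand-in (my
sketch, not the actual grammar-induction pipeline used in the Cisco
experiments mentioned above) that induces bigram "production rules"
from a symbol stream and flags transitions the induced grammar never
licensed:

from collections import defaultdict

def induce_bigrams(stream):
    """Record observed symbol-to-symbol transitions as crude rules."""
    rules = defaultdict(set)
    for prev, cur in zip(stream, stream[1:]):
        rules[prev].add(cur)
    return rules

def violations(rules, stream):
    """Transitions in a new stream that the induced rules don't cover."""
    return [(prev, cur) for prev, cur in zip(stream, stream[1:])
            if cur not in rules.get(prev, set())]

rules = induce_bigrams("ababcababcababc")
print(violations(rules, "ababccab"))   # -> [('c', 'c')]

The induced rules compress the stream, but anything the underlying
dynamics produces outside them surfaces as a "violation" rather than
being explained at the rule level.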



-- 
Ben Goertzel, PhD
http://goertzel.org

“The only people for me are the mad ones, the ones who are mad to
live, mad to talk, mad to be saved, desirous of everything at the same
time, the ones who never yawn or say a commonplace thing, but burn,
burn, burn like fabulous yellow roman candles exploding like spiders
across the stars.” -- Jack Kerouac
