Dorian,

Ah, well I'm sympathetic to an approach that looks at this as a problem of
dynamics (though I'm not sure what "memory-energy coupling" could mean).

Colin Hales, who I think has corresponded here, has an electrodynamic field
model which would seem to be more strictly embodied. He says it must be
implemented as electrodynamic fields. Lee Cronin seems to have a similar
argument, but from a chemistry perspective. George Lakoff, embodiment...
basically as neurons, anyway. (The whole embodiment idea became central in
the branches of linguistics which retained a basis in data after Chomsky:
Functional and Cognitive Linguistics, and indeed generally in the "corpus
based" fields most directly connected with machine learning. And quite
rightly. They saw something. Insisting on a basis in a "corpus" is an
embodied form of non-compression. "Corpus" = body. Embodiment is a form of
the non-compressibility argument, as I say. You need the whole corpus, said
Corpus Linguistics. This as opposed to Chomsky, for whom the whole point
became that language could not be compressed; that's why Chomsky still
rejects machine learning. Chomsky insisted any abstraction must be innate,
exactly because it could not be compressed/learned. At one level,
observable compressions contradicted both. So linguistics resolved to
embodied, or innate. But then LLMs ignored linguistics and went and
"learned" over corpora anyway!)

But you're not trapped by the embodiment interpretation of this. Good. Let
alone Chomsky's "unlearnable" innate structure. You see a solution in
dynamics. Good. (Once a dynamical system becomes chaotic, it does become
embodied in a sense, but chaos is not limited to one embodiment.)

So what are the parameters of these dynamics? You say "certain
energy-structured conditions (e.g., coherence, memory capacity, signal
velocity) are necessary".

"Coherence" I'm sympathetic to.

I've been pushing the idea of a dynamical-system solution parameterized by
the "coherence" of oscillations, driven by network symmetries of
prediction. A dynamical system parameterized by similarity of context,
anyway. A dynamical system in the sense that the patterns grow and change.
Actually I think its evolution is probably chaotic on some level,
concurring with Walter Freeman on that, from the neuroscience field.

Checking your link, I see you try memristors.

"emergent, quantized intelligence, analogous to a phase transition" sounds
good. Phase transitions being a big theme of Walter Freeman.

"adaptive behavior arises intrinsically from the physics of the substrate".
Yes.

Personally I think the crucial parameters are the ones already identified
by LLMs: shared context or shared prediction. If you can harness dynamics
which depend on those, it doesn't matter what your substrate is. I got
focused on the potential of (neural activation) oscillations in a network
of neurons representing language sequences. Just because synchronized
oscillations struck me as dynamics to capture those parameters of shared
context/prediction.
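
To make that concrete, here is a minimal sketch of the kind of dynamics I
mean: Kuramoto-style phase oscillators whose pairwise coupling is weighted
by similarity of (hypothetical) context vectors, with the order parameter R
standing in for "coherence". The context vectors, the coupling rule, and
the constants are all my own illustrative assumptions, not anything taken
from your paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50

# Hypothetical "context vectors", one per oscillator: stand-ins for the
# shared prediction contexts of units in a language-sequence network.
ctx = rng.normal(size=(N, 8))
ctx /= np.linalg.norm(ctx, axis=1, keepdims=True)

# Coupling weighted by context similarity (cosine, clipped to >= 0):
# units that share context are pulled toward synchrony.
K = 2.0 * np.clip(ctx @ ctx.T, 0.0, None)

omega = rng.normal(0.0, 0.5, size=N)            # natural frequencies
theta = rng.uniform(0.0, 2.0 * np.pi, size=N)   # initial phases
dt = 0.01

def coherence(phases):
    """Kuramoto order parameter R in [0, 1]: 1 = fully synchronized."""
    return abs(np.exp(1j * phases).mean())

for _ in range(5000):
    # dtheta_i/dt = omega_i + (1/N) * sum_j K_ij * sin(theta_j - theta_i)
    theta += dt * (omega + (K * np.sin(theta[None, :] - theta[:, None])).sum(axis=1) / N)

print(f"coherence R = {coherence(theta):.2f}")
```

The point of the toy: synchrony is not imposed, it falls out of the
coupling, and the coupling here is nothing but shared context.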

From a casual glance at your paper I'm unable to tell if the parameters of
your dynamics are also shared context/prediction. If they are, then it
might be good.

I feel you might be looking at static attractors in some sense though. Some
kind of "bubble memory". Which then has meaning... how? I can't find the
word "meaning" anywhere in your paper. Instead you have "adaptive,
feedback-driven reconfiguration". So it seems you make something of the
novelty of reconfiguration, but this has value only because of "feedback".
So the system will give meaning to these reconfigurations by some kind of
feedback from the environment?

That sounds to me like Gerald Edelman's Neural Darwinism: endless random
reconfigurations, which are selected for meaning by the environment (like
his Nobel Prize-winning immune-system insight).

Of interest to me, I recently came across Eugene Izhikevich's
"polychronization": stable spike sequences appearing spontaneously (from
the dynamics) in networks of neurons. But Izhikevich didn't attribute
meaning to these based on the way the sequences shared contexts either.
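
For anyone unfamiliar, those polychronization results are built on
Izhikevich's "simple model" neuron. A minimal single-neuron sketch (the
polychronous groups themselves only appear in networks with conduction
delays and STDP, which this omits; the parameters are the standard
regular-spiking ones, the constant input current is my own choice):

```python
# Izhikevich's simple spiking-neuron model (2003):
#   v' = 0.04 v^2 + 5 v + 140 - u + I
#   u' = a (b v - u);   if v >= 30 mV: v <- c, u <- u + d
a, b, c, d = 0.02, 0.2, -65.0, 8.0   # regular-spiking cortical parameters
v, u = -65.0, b * -65.0              # resting state
dt, I = 0.5, 10.0                    # time step (ms), constant input current

spike_times = []
for step in range(2000):             # simulate 1000 ms
    if v >= 30.0:                    # spike: record and reset
        spike_times.append(step * dt)
        v, u = c, u + d
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)

print(f"{len(spike_times)} spikes in 1 s of simulated time")
```

Cheap to simulate, which is exactly why Izhikevich could run networks big
enough for stable sequences to emerge from the dynamics.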

It looks to me like you have the insights about dynamical systems, and the
power of reordering/reconfiguration. But (like Edelman and Izhikevich) you
may be missing shared context/prediction as the key parameter, the one
that makes "meaning" internal to the system too, not needing to be imposed
by any external feedback.

-R

On Mon, Sep 1, 2025 at 2:23 AM Dorian Aur <[email protected]> wrote:

>   Electrodynamic Intelligence doesn’t argue that learning is tied to a
> specific biological embodiment or that there’s only one valid substrate.
> What it does argue is that *learning occurs from the physical dynamics of
> signal propagation and memory-energy coupling,*  not from abstract
> optimization rules. This isn't an anti-compression or anti-symbolic stance,
> it is a move toward *physically embedded, self-organizing systems *that
> learn because of how they are built, not because of what we train them to
> do.
> We agree that “can’t be compressed” doesn’t imply “must be embodied in one
> way.” EDI doesn’t claim there's only one valid realization of learning or
> intelligence. Rather, it highlights that certain energy-structured
> conditions <https://zenodo.org/records/16997063>(e.g., coherence, memory
> capacity, signal velocity) are necessary for physically grounded
> intelligence to occur..
> Propagation-driven learning is not about memorizing complexity but about
> allowing local physical dynamics to shape the system's functional structure
> over time. Compression is secondary, what's primary is whether the system
> can self-organize adaptively through physical interaction to generate
> self-learning and active memory consolidation.
>
> --- Dorian Aur
>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ta9b77fda597cc07a-M2ce345fbb62815fcb868ce18
Delivery options: https://agi.topicbox.com/groups/agi/subscription