A lot of biological information isn't even instantiated in neuronal
activity; it's in one's "gut", metaphorically speaking.
It seems to me that a common mistake in idealism is to take consciousness
as the whole of thought. Yet we know (cf. Poincaré) that most thought
is unconscious information processing.
Brent
On 2/21/2026 2:16 AM, Alastair wrote:
Most of this is fascinating, insightful and deep - from what I can
understand of Parts I to IV. (I am wondering: did you have more than
cosmetic help from AI?)
I would also be interested to know your definition of 'information'
(as bitstrings or equivalent? or as their chosen interpretation? or
something else?). Semantic imprecision can be a barrier to adequate
understanding and agreement in these (and many other) kinds of
situation, so good definitions are important.
My own preferred version of physicalism treats thought events as mass
neural events and so can include ideas, concepts, etc., including
thoughts in and about a language, any of which could in theory be
correct or incorrect (the physical laws underpinning those events
operate correctly regardless). It would not appear to fall foul of any
of the criticisms in Part I of the article if these are framed outside
the context of information as ontologically primary; i.e., from this
point of view physicalism is self-consistent, in this version of it at
least, and so contradicts the assertion that ontologically primary
information is the only self-consistent position available.
We may well have already detected electrical signals corresponding to
thoughts and could even one day decode them, if we can, for example,
individualise them to key neurons or assemblies and then bulk-analyse
them across macro-time; but I don't understand the matter sufficiently
to say whether or not this would refute the idea that information is
ontologically primary. That brings us back to the definition of
information used, and perhaps also to that of 'computational structures'.
Alastair
On Sunday, February 15, 2026 at 8:52:30 AM UTC Quentin Anciaux wrote:
Hello everyone,
I’m sharing the continuation of The Sapiens Attractor.
If you’re interested in the deeper structure behind the idea, you
can read it here:
https://allcolor.medium.com/the-sapiens-attractor-maximal-informational-realism-and-the-god-loop-26393e34fa46
Hope you’ll enjoy it.
Best,
Quentin
All those moments will be lost in time, like tears in rain. (Roy
Batty/Rutger Hauer)