Loet – thank you for this quote; I was thinking of almost the same, but
enlarged:

“The concept of information developed in this theory at first seems
disappointing and bizarre—
• disappointing because it has nothing to do with meaning, and
• bizarre because it deals not with a single message [/meaning] but rather
with the statistical character of a whole ensemble of messages [/meanings],
• bizarre also because in these statistical terms the two words information
and uncertainty find themselves to be [rather paradoxical] partners.

“I think, however, that these should be only temporary reactions; [**as
opposed to] . . .
I think, however, that this analysis has so penetratingly cleared the air
that one is now,
perhaps for the first time,
• ready for a real theory of meaning.”
(Weaver, 1949, p. 27)
**Temporary reactions that have lasted nearly 70 years.

To expand . . . I first faulted Shannon’s regular use of “transmitting
information” in his 1948 paper for inciting the above issues. But I dug
deeper and saw that Shannon probably labored against his own Bell Labs
“cultural legacy” (1920s), sustained by Nyquist (“intelligence” in
telegraphic transmissions) and Hartley (“Transmission of Information”),
which likely drove Shannon’s use of terms.
And then we have Weaver pushing even harder against that 1920s legacy –
“. . . ready for a real theory of meaning” – anticipating a 21st-century
demand/hurdle.

So, are we “ready for a real theory of meaning”?

As we pursue this session, it is worth seeing that we also labor against a
cultural legacy. Without debating further the notion of Shannon Signal
Entropy – we can agree the concept works so well, and on so many levels
(channel, coding, compression, correction, etc.) that it is hard to track
all of its successes. Continued gains 70 years on sustain their own legacy,
along with the lingering ghosts of Nyquist and Hartley. But that sharp
success, paired with a glaring “meaningful void,” also creates a disturbing
opposition. In attempting a Unified Theory of Information (UTI), Shannon
got us halfway there – the truly universal QUANTITATIVE aspect he mapped is
indisputable. We now need only
complete the second half of that journey by mapping truly QUALITATIVE
aspects . . . (sure, no problem!)
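
(To keep that QUANTITATIVE half concrete, a minimal Python sketch – my own
toy numbers, nothing from the 1948 paper itself – of one Shannon measure
setting both a compression bound and a channel capacity:)

from math import log2

def H2(p):
    """Binary entropy: H2(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Source side: a source emitting '1' with probability 0.1 cannot be
# losslessly compressed below H2(0.1) ~ 0.469 bits/symbol.
print("compression bound:", H2(0.1))

# Channel side: a binary symmetric channel with crossover probability
# 0.1 carries at most C = 1 - H2(0.1) ~ 0.531 bits per use.
print("channel capacity :", 1 - H2(0.1))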

So, are we “ready for a real theory of meaning”?

On the matter of “readiness,” Stanley points to (I paraphrase) a need for
precise terms if we are to exorcise ourselves of our ghosts and set ground
for a meaning-full resolution. Easily agreed . . .
But Stanley – when I look to define *information entropy* I see: “tells us
the quantity of information in a . . . .” We both sense what is meant here,
but using “information” in any implied more-primitive role, when trying to
define “information,” keeps already muddied waters “circling.” The point of
a priori analysis is to find *truly* more-primitive terms. Thus, I prefer
Signal Entropy to informational entropy. I give much attention to my use of
terms in this work, but I am sure I fall short of my own ambitions – so I
ask that you point out my weaknesses as I seek a more-precise/primitive
framing.
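
(In that spirit, a small sketch of my own, showing that Signal Entropy can
be defined from truly more-primitive terms – symbols and their relative
frequencies – with no appeal to “information” anywhere:)

from collections import Counter
from math import log2

def signal_entropy(signal):
    """Entropy of a signal stream, computed from symbol frequencies
    alone. The primitives are symbols and their relative frequencies;
    the word 'information' never has to enter the definition."""
    n = len(signal)
    return -sum((c / n) * log2(c / n) for c in Counter(signal).values())

print(signal_entropy("AABABBAAAB"))  # depends only on symbol statistics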

Bob – your view of “constrained and un(lack)constrained” echoes the
“information and uncertainty” paradox (“• bizarre also . . .”) noted by
Weaver; but “extended to multiple dimensions . . . of truly complex events
and structures,” past what Shannon anticipated. Again, easily agreed; you
also speak of confusion in our notions of entropy – yes!
I then looked at your FISPAP.pdf: “. . . going so far as to apprehend
proto-meaning.”

But your paper seems to emphasize QUANTITATIVE aspects, and I have a hard
time seeing QUALITATIVE informational aspects. Naming “conditional entropy”
(entropic sub-qualities, two components?) UNDER a Shannon “umbrella” to
assert proto-meaning seems slightly off (okay, *some* qualities are
considered). But there is a deeper issue here where quantitative signal
entropy is inextricably entangled (confused?) with any qualitative meaning
– one is impossible without the other, arguing for a dual material aspect
(no?). In the video, I explain how I view this entropic dual aspect via
delta S and delta O – seeking to minimize innate confusion. Also, the video
names various classes (types) of entropy to expand our notions of “entropy”
and “meaning.” I looked for similar structures in FISPAP.pdf, but did not
see any – did I miss them? Or perhaps I am fundamentally wrong in my
dual-aspect (quant-qual) thinking?
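
(To be sure we argue from the same object: a generic sketch of conditional
entropy via the chain rule – standard textbook form, not the notation of
FISPAP.pdf, so correct me if your paper means something richer:)

from math import log2

def H(dist):
    """Entropy of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# A toy joint distribution p(x, y) over two binary signals.
joint = {('x0', 'y0'): 0.4, ('x0', 'y1'): 0.1,
         ('x1', 'y0'): 0.1, ('x1', 'y1'): 0.4}

# Marginal p(y).
pY = {}
for (x, y), p in joint.items():
    pY[y] = pY.get(y, 0.0) + p

# Chain rule: H(X|Y) = H(X,Y) - H(Y) -- what stays uncertain about X
# once Y is seen. Note it is still a purely statistical statement;
# no meaning enters anywhere.
print("H(X|Y) =", H(joint) - H(pY))  # ~0.722 bits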

Also, earlier you ask us to
> Consider the "snow" pattern on a TV without a signal. Its Shannon
> measure is much higher than when a picture appears onscreen, yet we
> know that the snow pattern carries no information.
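
(First, a quick numerical check of the quoted claim – assuming, and this is
my own reading, that “snow” means pixels drawn uniformly at random:)

import random
from collections import Counter
from math import log2

def bits_per_pixel(pixels):
    """Per-pixel Shannon measure, from pixel-value frequencies."""
    n = len(pixels)
    return -sum((c / n) * log2(c / n) for c in Counter(pixels).values())

random.seed(0)
snow = [random.randrange(256) for _ in range(100_000)]  # uniform noise
picture = [128] * 90_000 + [random.randrange(256) for _ in range(10_000)]

print("snow:   ", bits_per_pixel(snow))     # ~8 bits/pixel, near maximal
print("picture:", bits_per_pixel(picture))  # far lower: structure = constraint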

I am tempted to accept this on its face, but in the sense of *being ready*
for something else . . . surely “snow” tells me something. At the least it
tells me that a TV is switched <ON> – this is information, no? Or, if I am
a “READY cryptographer,” that snow tells me of a deeply encrypted signal
that requires further action to reveal a meaning. If I am a “not-quite
READY” Penzias and Wilson, that snow tells me about pigeon droppings on my
antenna, instead of marking vast cosmic background radiation. Or, if I am
an “accidentally READY” Roentgen, the odd glowing snow that I happen to see
out of the corner of my eye reveals X-rays. “Happy scientific accidents”
are meaning-full events, no? But in a purely Shannon- or
thermodynamics-based view, snow is just snow, and always just snow – where
does “proto” come from, if not from effective-or-ineffective
interpretation? If I
misinterpret what your paper suggests, I would appreciate your corrections.

My thanks to you both for sharing your thoughts!

Marcus