Entropy

Regarding:
> So I see it that you confirm to Shannon's interpretation of entropy as
actually being information <
Well, in essence we may agree, but I would call this an unfortunate choice
of words. “Information,” I think, has come to mean so many things to so
many people that it is *nearly* a useless term. Even though I use this term
myself, I try to minimize its use. I would say that I agree with Shannon’s
view of signal entropy as a *type* of information – and then extend that
concept, using type theory, to include “meaningful” roles. Only when taken
as a whole does “information” exist, within my framing.

S: It has been shocking to me that many info-tech persons use the word
'information' when what they mean is Shannon's 'information-carrying
capacity', or the word 'entropy' when they mean Shannon's 'informational
entropy', which refers to variety.
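
To make that distinction concrete, a small toy example (Python, my own
illustration, not anything from Shannon or the papers cited here): the
quantity H = -sum(p * log2 p) depends only on the spread of probabilities,
i.e., on the variety of the symbol set, and not on what the symbols mean.

    from math import log2

    def shannon_entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability symbols.
        return -sum(p * log2(p) for p in probs if p > 0)

    # Two symbol sets with very different "meanings" but purely statistical entropies
    # (the distributions are invented for the example):
    dna_bases = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}   # uniform: maximal variety
    biased_coin = {"heads": 0.9, "tails": 0.1}                 # skewed: little variety

    print(shannon_entropy(dna_bases.values()))    # 2.0 bits
    print(shannon_entropy(biased_coin.values()))  # ~0.47 bits

Relabel the symbols however you like and the numbers do not change, which is
exactly the conflation noted above: the measure captures variety or carrying
capacity, not meaning.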

STAN


On Wed, Jun 22, 2016 at 5:41 AM, Marcus Abundis <55m...@gmail.com> wrote:

> In an online exchange, Annette raises a few points and questions that I
> summarize below.
> ===
> > Please give me your basic definition of entropy <
> My short answer is that I define entropy as "material variation" of any
> type, as clarified in paper #2 and detailed starting on page 5 (actually
> named on page 6). This definition is admittedly generic/vague, for a few
> reasons:
>   a) many "types of meaning" (or entropy, if you wish) must be framed and
> then joined. I name three minimum types of meaning/entropy in paper #2.
> This multitude requires that a generic term first be named if a UTI is to
> be developed (point 1 in the introductory text).
>   b) noise is itself informational in a Darwinian role as “demise.” I
> believe this departs from most informational notions, where noise is seen
> as the opposite of information. This view also accommodates an inverse,
> where one eventually “makes sense” of nominally chaotic events.
> As such, I name an existential ground Generic Entropy, and the “tendency
> to symmetric dispersal,” within that ground, “material entropy.” And so,
> “material variation” (of any type) is meant to capture the entirety of
> those entropic roles. Lastly, I find the notion of “pure symmetry” a useful
> scientific fiction, but still a fiction in the context of true empirical
> models (point 8 in the introductory text).
>
> > So the different entropies you are using in your video point to different
> options to organize elements in a way that they generate recognizable (and
> therefore to a degree similar) information out of those elements? <
> Here, *options* and *recognizable* are the key terms. As you note, I am
> using a different (novel?) notion of “entropy,” beyond even the novel way
> in which Shannon did, and thus (hopefully) extend Shannon's view. The most
> reductive aspect here is “the element” (i.e., a “fulcrum,” a “load,” a
> “bit,” etc. [re paper #4]). Then, inter-RELATED *element sets,* depending
> on their order (innate function or dysfunction), convey a specific
> role (signal or noise). It is this RELATING of singular elements (there are
> many *options*) that conveys specific meaning/functioning/logic/order. This
> meaningful relating can equally convey *recognizable* “types of order” or
> “types of disorder” (e.g., many “types of screws” exist, each with unique
> functional advantages and disadvantages, or uses and mis-uses – a machine
> screw works poorly in wood). Finally, a “recognizably deformed screw” (a
> use-less *option*) must also be accounted for within this continuum. This
> notion of related data echoes the idea already noted in the exchange with
> Antonio.
>
> > So I see it that you confirm to Shannon's interpretation of entropy as
> actually being information <
> Well, in essence we may agree, but I would call this an unfortunate choice
> of words. “Information,” I think, has come to mean so many things to so
> many people that it is *nearly* a useless term. Even though I use this term
> myself, I try to minimize its use. I would say that I agree with Shannon’s
> view of signal entropy as a *type* of information – and then extend that
> concept, using type theory, to include “meaningful” roles. Only when taken
> as a whole does “information” exist, within my framing.
>
> Also, the notion of stability (as necessary for meaning) that you emphasize
> I find helpful but also limiting. I tend to think of *everything* as pro tem
> except for perhaps the Standard Model and the Periodic Table (addressed in
> paper #2; Lee Smolin may disagree?). Within type theory the central
> question becomes “At what point/level(s) does material variation
> (“entropy”) break down or fail, and how and why does it fail?” For me,
> this is a more useful way of viewing things – relying on stability is too much
> like The Denial of Death (Ernest Becker). This requires us to look beyond
> thermodynamics for answers. I believe thermodynamics is historically
> stressed as it is the closest we have to a “hard science” that we might
> ascribe to “information” (outside of Shannon, of course). We could easily
> spend hours discussing this, I think . . . but we are essentially on the
> same page.
>
> Lastly, thanks for sending me Shu-Kun Lin’s paper (The Nature of the
> Chemical Process. 1. Symmetry Evolution – Revised Information Theory,
> Similarity Principle and Ugly Symmetry). It looks to have some useful
> “pokes,” and I will hungrily dig into it when I have a chance.
>
> Thanks for your questions.
>
> Marcus
>
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
