Prof. Jerry, List

On Thu, Apr 9, 2015 at 8:12 PM, Jerry LR Chandler
<[email protected]> wrote:
> In my view, the open question is the physical generality of this
> re-arranging of concepts.
>
> How does one use such a conceptual re-arrangement of symbols to the
> predictions of chemical structures and living structures?


My view is that in most cases, we do not use explicit types to fix the
interpretation of the mathematical representation.

Logics are abstracted from such mathematical representations - this
causes the logics to refer to processes in more general semantic
domains. The motivation can be seen as the study of abstract parts of
the original chemical processes. To predict chemical structures, we
need to supplement the logic further.

In situation logics, types are used to fix interpretations, and here
the associations with reality are more rigid.

In formalisms relating to safety-critical systems, the associations
are even more rigid.


In contrast to this entropy, which has concrete processes associated
with it, the classical Shannon information entropy of coding (which
measures the degree of equality of proportions) is more abstract.

Shannon's entropy, in relation to coding theory, admits a class of
generalizations that can be formulated as optimization problems. These
are well suited for semiotics because the problem is very concrete and
the mathematical part reads: entropy is the minimum value of the
problem "minimize a 'mean' codeword length subject to constraints
(such as Kraft's inequality)" - the meaning of the constraints is not
straightforward, though.
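As a minimal sketch of that optimization reading (the dyadic distribution below is my own illustrative choice): for such a distribution, the mean length of an optimal prefix code attains the entropy exactly, and the codeword lengths satisfy Kraft's inequality.

```python
import heapq
import math

def entropy(probs):
    """Shannon entropy in bits: H(P) = -sum p_i log2 p_i."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of an optimal (Huffman) prefix code."""
    # Heap items: (probability, tie-breaker, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every merge adds one bit to each
            lengths[s] += 1        # codeword in the merged subtrees
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

P = [0.5, 0.25, 0.125, 0.125]      # dyadic, so the bound is tight
H = entropy(P)
lens = huffman_lengths(P)
L = sum(p * l for p, l in zip(P, lens))   # mean codeword length
kraft = sum(2 ** -l for l in lens)        # Kraft's inequality: <= 1
print(H, L, kraft)                        # 1.75 1.75 1.0
```

Here the constraint (Kraft's inequality) is what ties the abstract minimum back to realizable codes, which is perhaps where its meaning becomes less straightforward.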

A more complete approach would be to see Shannon's entropy as a
process along with all the encoding and decoding - I do not understand
why people interested in semiotics should be concerned with plain
formulas alone.

The directed divergence interpretation relates to another class of
generalizations. Here we look at the directed divergence of a
probability distribution P from the uniform one U - the entropies are
monotone decreasing functions of a measure of the directed divergence
of P from U.
For example, the directed divergence of P from a distribution Q is
\sum_{i=1}^{n} p_i \log(\frac{p_i}{q_i}).
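The monotone relation can be checked numerically: with logarithms to base 2, H(P) = log2(n) - D(P||U), so entropy falls exactly as the divergence from the uniform distribution grows (the distribution below is an illustrative choice of mine).

```python
import math

def entropy(P):
    """H(P) = -sum p_i log2 p_i."""
    return -sum(p * math.log2(p) for p in P if p > 0)

def directed_divergence(P, Q):
    """D(P||Q) = sum p_i log2(p_i / q_i)."""
    return sum(p * math.log2(p / q) for p, q in zip(P, Q) if p > 0)

n = 4
U = [1 / n] * n                    # uniform distribution
P = [0.5, 0.25, 0.125, 0.125]      # illustrative distribution
# Identity: H(P) = log2(n) - D(P || U)
print(entropy(P), math.log2(n) - directed_divergence(P, U))
```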

Arguably, many of these fit into the more common axiomatic way of
defining entropies, with the associated **intuitions** that we have
been discussing.

A sum of measures from the first two classes (an entropy plus a
directed divergence) is also seen as a measure of inaccuracy.
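One standard such sum (often attributed to Kerridge) is -sum p_i log q_i, which decomposes exactly as H(P) + D(P||Q); a small numerical check, with distributions invented for illustration:

```python
import math

def entropy(P):
    """H(P) = -sum p_i log2 p_i."""
    return -sum(p * math.log2(p) for p in P if p > 0)

def directed_divergence(P, Q):
    """D(P||Q) = sum p_i log2(p_i / q_i)."""
    return sum(p * math.log2(p / q) for p, q in zip(P, Q) if p > 0)

def inaccuracy(P, Q):
    """Inaccuracy of asserting Q when P holds: -sum p_i log2 q_i."""
    return -sum(p * math.log2(q) for p, q in zip(P, Q) if p > 0)

P = [0.5, 0.25, 0.25]   # illustrative distributions
Q = [0.25, 0.25, 0.5]
# Decomposition: inaccuracy = entropy + directed divergence
print(inaccuracy(P, Q), entropy(P) + directed_divergence(P, Q))
```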


In contrast to all of the above, the idea of entropy as a measure of
roughness is not properly grounded. In rough set theory, we are
concerned with approximating concepts in the attribute-value model.
Each object has associated attributes and values for those; e.g. the
value "High" for the attribute "Temperature" may be associated with
fever. In my approach to rough sets, I avoid frequentism and
concentrate on semantics and the comparison of semantics. So I do not
use the frequentism-based ideas of entropy that lead to an
accumulation of errors. Instead I use maps to represent my semantic
ideas of entropy in the rough context.
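For readers unfamiliar with the attribute-value model, a toy sketch of lower and upper approximations follows (the objects and the "fever" concept are invented for illustration; this shows only the standard approximation machinery, not my semantic maps):

```python
# Toy attribute-value table: each object carries attribute values.
objects = {
    "o1": {"Temperature": "High"},
    "o2": {"Temperature": "High"},
    "o3": {"Temperature": "Normal"},
    "o4": {"Temperature": "High"},
}

def indiscernibility_classes(objs, attrs):
    """Group objects that are indiscernible on the chosen attributes."""
    classes = {}
    for name, vals in objs.items():
        key = tuple(vals[a] for a in attrs)
        classes.setdefault(key, set()).add(name)
    return list(classes.values())

def approximations(target, classes):
    """Lower approx: classes wholly inside target; upper: classes meeting it."""
    lower = set.union(set(), *(c for c in classes if c <= target))
    upper = set.union(set(), *(c for c in classes if c & target))
    return lower, upper

fever = {"o1", "o2"}  # an invented concept to approximate
classes = indiscernibility_classes(objects, ["Temperature"])
lower, upper = approximations(fever, classes)
print(lower, upper)
```

Here "fever" cannot be captured exactly by the "Temperature" attribute: the lower approximation is empty while the upper approximation is {o1, o2, o4}, and it is this gap, not any frequency count, that expresses roughness.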



> The nature of abstraction is one of the essential topics of CSP writings.
> In your terminology, what is the mathematical or logical origin of your
> assertion that there exists a "scope of variation"?

It is about what is best understood, or easily observable, or
sometimes about the computable.
Sometimes logics are developed relative to paradigms, like "wanting a
3-valued logic for a phenomenon as it seems to be appropriate", and
sometimes it is about formalizing particular concepts to the best
admissible level.

In the thermodynamic entropy case, we have already seen two abstractions.






Regards

A. Mani



Prof(Miss) A. Mani
CU, ASL, AMS, ISRS, CLC, CMS
HomePage: http://www.logicamani.in
Blog: http://logicamani.blogspot.in/
http://about.me/logicamani
sip:[email protected]
-----------------------------
PEIRCE-L subscribers: Click on "Reply List" or "Reply All" to REPLY ON PEIRCE-L 
to this message. PEIRCE-L posts should go to [email protected] . To 
UNSUBSCRIBE, send a message not to PEIRCE-L but to [email protected] with the 
line "UNSubscribe PEIRCE-L" in the BODY of the message. More at 
http://www.cspeirce.com/peirce-l/peirce-l.htm .



