On Sat, Apr 4, 2015 at 6:06 PM, Jon Awbrey <jawb...@att.net> wrote:
> From a mathematical point of view, an "entropy" or "uncertainty" measure is
> simply a measure on distributions that achieves its maximum when the
> distribution is uniform. It is thus a measure of dispersion or uniformity.
>
> Measures like these can be applied to distributions that arise in any given
> domain of phenomena, in which case they have various specialized meanings
> and implications.
>
> When it comes to applications in communication and inquiry, the information
> of a sign or message is measured by its power to reduce uncertainty.
>
> The following essay may be useful to some listers:
>
> http://intersci.ss.uci.edu/wiki/index.php/Semiotic_Information
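To make the quoted point concrete, a minimal sketch in Python of Shannon entropy, showing that it is maximized by the uniform distribution (the function name and example distributions here are my own illustration, not from the essay):

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))  # 2.0 bits -- the maximum for 4 outcomes
print(shannon_entropy(skewed))   # strictly less than 2.0
```

Any departure from uniformity lowers the value, which is why such measures read as measures of dispersion or uniformity.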


Adding to the discussion
:


"Entropy" has been extended to neighbourhood systems and granulations
with the intent of capturing roughness and information uncertainty in
rough set theory; there are extensions to fuzzy sets as well. These
measures essentially contribute specific perspectives on the ontology
of information semantics relative to the systems in question.
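One simple granulation-based entropy found in the rough set literature (this is one variant among several; the function name and the toy universe below are my own illustration) weighs each block of a partition by its relative size:

```python
import math

def granulation_entropy(universe, partition):
    """Entropy of a granulation: E(P) = -sum (|B|/|U|) * log2(|B|/|U|)
    over the blocks B of a partition P of the universe U."""
    n = len(universe)
    return -sum((len(b) / n) * math.log2(len(b) / n) for b in partition)

U = {1, 2, 3, 4, 5, 6}
coarse = [{1, 2, 3}, {4, 5, 6}]          # coarser granulation
fine = [{1}, {2}, {3}, {4}, {5}, {6}]    # finest granulation

print(granulation_entropy(U, coarse))  # 1.0
print(granulation_entropy(U, fine))    # log2(6), about 2.585
```

Finer granulations score higher, so the measure tracks how sharply the granulation discriminates objects; this is where the frequentist reading of the block proportions enters.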

These measures implicitly assume a frequentist position, but the
probabilistic connections are not strong enough to sustain it. When
fuzzy granulations are used, the interpretation (by analogy with
probabilistic idealisation) breaks down further.





Regards

A. Mani



Prof(Miss) A. Mani
CU, ASL, AMS, ISRS, CLC, CMS
HomePage: http://www.logicamani.in
Blog: http://logicamani.blogspot.in/
http://about.me/logicamani
sip:girlprofes...@ekiga.net
-----------------------------
PEIRCE-L subscribers: Click on "Reply List" or "Reply All" to REPLY ON PEIRCE-L 
to this message. PEIRCE-L posts should go to peirce-L@list.iupui.edu . To 
UNSUBSCRIBE, send a message not to PEIRCE-L but to l...@list.iupui.edu with the 
line "UNSubscribe PEIRCE-L" in the BODY of the message. More at 
http://www.cspeirce.com/peirce-l/peirce-l.htm .
