Loet remarks:

"... we need a kind of calculus of redundancy."

I agree whole-heartedly.

What for Shannon was the key to error-correction is thus implicitly
normative. But of course assessment of normativity (accurate/inaccurate,
useful/unuseful, significant/insignificant) must necessarily involve an
"outside" perspective, i.e. more than merely the statistics of sign medium
characteristics. Redundancy is also implicit in concepts like
communication, shared understanding, iconism, and Fano's "mutual
information." But notice too that redundancy is precisely non-information
in a strictly statistical understanding of that concept; a redundant
message is not itself "news" — and yet it can reduce the uncertainty of
what is "message" and what is "noise." It is my intuition that by
developing a formalization (e.g. a "calculus") using the complementary
notions of redundancy and constraint we will ultimately be able to
formulate a route from Shannon to the higher-order conceptions of
information, in which referential and normative features can be precisely
formulated.
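
For concreteness, here is a minimal sketch (in Python, purely illustrative
and confined to the Shannon level) of the kind of quantity such a calculus
would presumably start from: redundancy as the gap between a distribution's
entropy and its maximum.

    import math

    def entropy_bits(p):
        # Shannon entropy H of a probability distribution, in bits
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    p = [0.5, 0.25, 0.125, 0.125]  # an arbitrary example distribution
    H = entropy_bits(p)            # observed uncertainty: 1.75 bits
    H_max = math.log2(len(p))      # maximum uncertainty over 4 outcomes: 2.0 bits
    R = 1 - H / H_max              # redundancy: 0.125, the "non-news" share

What such statistics leave out, of course, is precisely the referential and
normative side that a fuller calculus would have to reach.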

There is an open door, though it still seems pretty dark on the other side.
So one must risk stumbling in order to explore that space.

Happy 2017, Terry

On Sat, Jan 7, 2017 at 9:02 AM, John Collier <colli...@ukzn.ac.za> wrote:

> Dear List,
>
>
>
> I agree with Terry that we should not be bound by our own partial
> theories. We need an integrated view of information that shows its
> relations in all of its various forms. There is a family resemblance in the
> ways it is used, and some sort of taxonomy can be constructed. I recommend
> that of Luciano Floridi. His approach is not unified (unlike my own,
> reported on this list), but compatible with it, and is a place to start,
> though it needs expansion and perhaps modification. There may be some
> unifying concept of information, but its application to all the various
> ways it has been used will not be obvious, and a sufficiently general
> formulation may well seem trivial, especially to those interested in the
> vital communicative and meaningful aspects of information. I also agree
> with Loet that pessimism, however justified, is not the real problem. To
> some extent it is a matter of maturity, which takes both time and
> development, not to mention giving up cherished juvenile enthusiasms.
>
>
>
> I might add that constructivism, with its positivist underpinnings, tends
> to lead to nominalism and relativism about whatever is out there. I believe
> that this is a major hindrance to a unified understanding. I understand
> that it appeared in reaction to an overzealous and simplistic realism about
> science and other areas, but I think it threw the baby out with the
> bathwater.
>
>
>
> I have been really ill, hence my lack of communication. I am pleased to see
> this discussion, which is necessary for the field to develop maturity. I
> thought I should add my bit, and wish everyone a Happy New Year, with all
> its possibilities.
>
>
>
> Warmest regards to everyone,
>
> John
>
>
>
> *From:* Fis [mailto:fis-boun...@listas.unizar.es] *On Behalf Of *Loet
> Leydesdorff
> *Sent:* December 31, 2016 12:16 AM
> *To:* 'Terrence W. DEACON' <dea...@berkeley.edu>; 'Dai Griffiths' <
> dai.griffith...@gmail.com>; 'Foundations of Information Science
> Information Science' <fis@listas.unizar.es>
>
> *Subject:* Re: [Fis] What is information? and What is life?
>
>
>
> We agree that such a theory is a ways off, though some are far more
> pessimistic about its possibility than me. I believe that we would do best
> to focus on the hole that needs filling in rather than assuming that it is
> an unfillable given.
>
>
>
> Dear Terrence and colleagues,
>
>
>
> It is not a matter of pessimism. We have the example of “General Systems
> Theory” of the 1930s (von Bertalanffy and others). Only gradually did one
> realize the biological metaphor driving it. In my opinion, we have become
> reflexively skeptical about claims of “generality” because we know the
> statements are framed within paradigms. Translations are needed in this
> fractional manifold.
>
>
>
> I agree that we are moving in a fruitful direction. Your books “Incomplete
> Nature” and “The Symbolic Species” have been important. The failing options
> cannot be observed, but have to be constructed culturally, that is, in
> discourse. It seems to me that we need a kind of calculus of redundancy.
> Perspectives which are reflexively aware of this need and do not assume an
> unproblematic “given” or “natural” are perhaps to be privileged
> nonetheless. The unobservable options have first to be specified, and we
> need theory (hypotheses) for this. Perhaps, this epistemological privilege
> can be used as a vantage point.
>
>
>
> There is an interesting relation to Husserl’s *Crisis of the European
> Sciences* (1935): The failing (or forgotten) dimension is grounded in
> “intersubjective intentionality.” Nowadays, we would call this “discourse”.
> How are discourses structured and how can they be translated for the
> purpose of offering this “foundation”?
>
>
>
> Happy New Year,
>
> Loet
>
>
>
> My modest suggestion is only that in the absence of a unifying theory we
> should not privilege one partial theory over others and that in the absence
> of a global general theory we need to find terminology that clearly
> identifies the level at which the concept is being used. Lacking this, we
> end up debating incompatible definitions, and defending our favored one
> that either excludes or includes issues of reference and significance or
> else assumes or denies the relevance of human interpreters. With different
> participants interested in different levels and applications of the
> information concept—from physics, to computation, to neuroscience, to
> biosemiotics, to language, to art, etc.—failure to mark this diversity will
> inevitably lead us in circles.
>
>
>
> I urge humility with precision and an eye toward synthesis.
>
>
>
> Happy new year to all.
>
>
>
> — Terry
>
>
>
> On Thu, Dec 29, 2016 at 12:30 PM, Dai Griffiths <dai.griffith...@gmail.com>
> wrote:
>
> Thanks Stan,
>
> Yes, it's a powerful and useful process.
>
> My problem is that in this list, and in other places where such matters are
> discussed, we don't seem to be able to agree on the big picture, and the
> higher up the generalisations we go, the less we agree.
>
> I'd like to keep open the possibility that we might be yoking ideas
> together which it may be more useful to keep apart. We are dealing with
> messy concepts in messy configurations, which may not always map neatly
> onto a generalisation model.
>
> Dai
>
>
>
> On 22/12/16 16:45, Stanley N Salthe wrote:
>
> Dai --
>
> {phenomenon 1}
>
> {phenomenon 2}   -->  {phenomena 1 & 2} --->  {phenomena 1, 2, 3}
>
> {phenomenon 3}
>
> The process from left to right is generalization.
>
> ‘Information’ IS a generalization.
>
> Generalities form the substance of philosophy. Info happens to be a case
> of generalization which can be mathematized, which in turn allows
> it to be generalized even more.
>
> So, what’s the problem?
>
> STAN
>
>
>
> On Wed, Dec 21, 2016 at 7:44 AM, Dai Griffiths <dai.griffith...@gmail.com>
> wrote:
>
> >  Information is not “something out there” which “exists” otherwise than
> > as our construct.
>
> I agree with this. And I wonder to what extent our problems in discussing
> information come from our desire to shoe-horn many different phenomena into
> the same construct. It would be possible to disaggregate the construct. It
> would be possible to discuss the topics which we address on this list without
> using the word 'information'. We could discuss redundancy, variety,
> constraint, meaning, structural coupling, coordination, expectation,
> language, etc.
>
> In what ways would our explanations be weakened?
>
> In what ways might we gain in clarity?
>
> If we were to go down this road, we would face the danger that our
> discussions might become (even more) remote from everyday human experience.
> But many scientific discussions are remote from everyday human experience.
>
> Dai
>
> On 20/12/16 08:26, Loet Leydesdorff wrote:
>
> Dear colleagues,
>
>
>
> A distribution contains uncertainty that can be measured in terms of bits
> of information.
>
> Alternatively: the expected information content *H* of a probability
> distribution is H = -Σ_i p_i log2(p_i).
>
> *H* is further defined as probabilistic entropy using Gibbs's formulation
> of the entropy S = -k_B Σ_i p_i ln(p_i).
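>
> A minimal numerical sketch (Python; the three-outcome distribution is
> arbitrary and purely illustrative):
>
>     import math
>     p = [0.5, 0.25, 0.25]                     # a probability distribution
>     H = -sum(pi * math.log2(pi) for pi in p)  # expected information content
>     print(H)                                  # 1.5 bits of uncertainty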
>
>
>
> This definition of information is an operational definition. In my
> opinion, we do not need an essentialistic definition by answering the
> question of “what is information?” As the discussion on this list
> demonstrates, one does not easily agree on an essential answer; one can
> answer the question “how is information defined?” Information is not
> “something out there” which “exists” otherwise than as our construct.
>
>
>
> Using essentialistic definitions, the discussion tends not to move
> forward. Take, for example, Stuart Kauffman’s and Bob Logan’s (2007)
> definition of information “as natural selection assembling the very
> constraints on the release of energy that then constitutes work and the
> propagation of organization.” I asked several times what this means and how
> one can measure this information. Hitherto, I only obtained the answer that
> colleagues who disagree with me will be cited. :-) Another answer was that
> “counting” may lead to populism. :-)
>
>
>
> Best,
>
> Loet
>
>
> ------------------------------
>
> Loet Leydesdorff
>
> Professor, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
>
> l...@leydesdorff.net ; http://www.leydesdorff.net/
> Associate Faculty, SPRU <http://www.sussex.ac.uk/spru/>, University of
> Sussex;
>
> Guest Professor, Zhejiang Univ. <http://www.zju.edu.cn/english/>,
> Hangzhou; Visiting Professor, ISTIC
> <http://www.istic.ac.cn/Eng/brief_en.html>, Beijing;
>
> Visiting Professor, Birkbeck <http://www.bbk.ac.uk/>, University of
> London;
>
> http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
>
>
>
> *From:* Dick Stoute [mailto:dick.sto...@gmail.com <dick.sto...@gmail.com>]
>
> *Sent:* Monday, December 19, 2016 12:48 PM
> *To:* l...@leydesdorff.net
> *Cc:* James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
> *Subject:* Re: [Fis] What is information? and What is life?
>
>
>
> List,
>
>
>
> Please allow me to respond to Loet about the definition of information
> stated below.
>
>
>
> 1. the definition of information as uncertainty is counter-intuitive
> ("bizarre"); (p. 27)
>
>
>
> I agree.  I struggled with this definition for a long time before
> realising that Shannon was really discussing "amount of information" or the
> number of bits needed to convey a message.  He was looking for a formula
> that would provide an accurate estimate of the number of bits needed to
> convey a message and realised that the amount of information (number of
> bits) needed to convey a message was dependent on the "amount" of
> uncertainty that had to be eliminated and so he equated these.
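>
> A toy case of that equating (Python; the eight-message alphabet is an
> arbitrary assumption, not Shannon's own example):
>
>     import math
>     n_messages = 8                       # eight equally likely alternatives
>     bits_needed = math.log2(n_messages)  # 3.0 bits to single one out
>     # the same 3.0 bits measure the uncertainty eliminated on receipt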
>
>
>
> It makes sense to do this, but we must distinguish between "amount of
> information" and "information".  For example, we can measure amount of
> water in liters, but this does not tell us what water is and likewise the
> measure we use for "amount of information" does not tell us what
> information is. We can, for example, equate the amount of water needed to
> fill a container with the volume of the container, but we should not think
> that water is therefore identical to an empty volume.  Similarly we should
> not think that information is identical to uncertainty.
>
>
>
> By equating the number of bits needed to convey a message with the "amount
> of uncertainty" that has to be eliminated, Shannon, in effect, equated
> opposites so that he could get an estimate of the number of bits needed to
> eliminate the uncertainty.  We should not therefore consider that this
> equation establishes what information is.
>
>
>
> Dick
>
>
>
>
>
> On 18 December 2016 at 15:05, Loet Leydesdorff <l...@leydesdorff.net>
> wrote:
>
> Dear James and colleagues,
>
>
>
> Weaver (1949) made two major remarks about his coauthor (Shannon)'s
> contribution:
>
>
>
> 1. the definition of information as uncertainty is counter-intuitive
> ("bizarre"); (p. 27)
>
> 2. "In particular, information must not be confused with meaning." (p. 8)
>
>
>
> The definition of information as relevant for a system of reference
> confuses information with "meaningful information" and thus sacrifices the
> surplus value of Shannon's counter-intuitive definition.
>
>
>
> information observer
>
>
>
> that integrates interactive processes such as
>
>
>
> physical interactions such as photons stimulating the retina of the eye,
> human-machine interactions (this is the level that Shannon lives on),
> biological interaction such as body temperature relative to touching an ice
> or heat source, social interaction such as this forum started by Pedro,
> economic interaction such as the stock market, ... [Lerner, page 1].
>
>
>
> We are in need of a theory of meaning. Otherwise, one cannot measure
> meaningful information. In a previous series of communications we discussed
> redundancy from this perspective.
>
>
>
> Lerner introduces mathematical expectation E[Sap] (difference between a
> priory entropy [sic] and a posteriori entropy), which is distinguished from
> the notion of relative information Iap (Lerner, page 7).
>
>
>
> I = Σ_i q_i log2(q_i/p_i) expresses in bits of information the information
> generated when the a priori distribution (p) is turned into the a posteriori
> one (q). This follows within the Shannon framework without needing an
> observer. I use this equation, for example, in my 1995 book *The Challenge
> of Scientometrics* (Chapters 8 and 9), with a reference to Theil (1972). The
> relative information is defined as *H*/*H*(max).
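>
> As a sketch (Python; the two distributions are arbitrary), assuming the
> Theil-style form I = Σ q log2(q/p) referred to above:
>
>     import math
>     p_prior = [0.25, 0.25, 0.25, 0.25]  # a priori distribution
>     q_post = [0.70, 0.10, 0.10, 0.10]   # a posteriori distribution
>     I = sum(q * math.log2(q / p) for q, p in zip(q_post, p_prior) if q > 0)
>     print(I)                            # ~0.64 bits generated by the revision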
>
>
>
> I agree that the intuitive notion of information is derived from the Latin
> “in-formare” (Varela, 1979). But most of us no longer use “force” and
> “mass” in the intuitive (Aristotelian) sense. :-) The proliferation of the
> meanings of information when it is confused with “meaningful information”
> is indicative of an “index sui et falsi”, in my opinion. The repetitive
> discussion hampers the progress of this list. It is “like asking whether a
> glass is half empty or half full” (Hayles, 1990, p. 59).
>
>
>
> This act of forming an information process results in the
> construction of an observer that is the owner [holder] of information.
>
>
>
> The system of reference is then no longer the message, but the observer
> who provides meaning to the information (uncertainty). I agree that this is
> a selection process, but the variation first has to be specified
> independently before it can be selected.
>
>
>
> And Lerner introduces the threshold between objective and subjective
> observers (page 27). This leads to a consideration of selection and
> cooperation that includes entanglement.
>
>
>
> I don’t see a direct relation between information and entanglement. An
> observer can be entangled.
>
>
>
> Best,
>
> Loet
>
>
>
> PS. Pedro: Let me assume that this is my second posting in the week which
> ends tonight. L.
>
>
>
>
> _______________________________________________
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>
>
>
>
>
> --
>
>
> 4 Austin Dr. Prior Park St. James, Barbados BB23004
> Tel:   246-421-8855
> Cell:  246-243-5938
>
>
>
> _______________________________________________
>
> Fis mailing list
>
> Fis@listas.unizar.es
>
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>
> --
>
> -----------------------------------------
>
>
>
> Professor David (Dai) Griffiths
>
> Professor of Education
>
> School of Education and Psychology
>
> The University of Bolton
>
> Deane Road
>
> Bolton, BL3 5AB
>
>
>
> Office: T3 02
>
> http://www.bolton.ac.uk/IEC
>
>
>
> SKYPE: daigriffiths
>
> UK Mobile +44 (0)7491151559
>
> Spanish Mobile: + 34 687955912
>
> Work: + 44 (0)7826917705
>
> (Please don't leave voicemail)
>
> email:
>
>    d.e.griffi...@bolton.ac.uk
>
>    dai.griffith...@gmail.com
>
> _______________________________________________ Fis mailing list
> Fis@listas.unizar.es http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>
> _______________________________________________
>
> Fis mailing list
>
> Fis@listas.unizar.es
>
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>
> --
>
> -----------------------------------------
>
>
>
> Professor David (Dai) Griffiths
>
> Professor of Education
>
> School of Education and Psychology
>
> The University of Bolton
>
> Deane Road
>
> Bolton, BL3 5AB
>
>
>
> Office: T3 02
>
> http://www.bolton.ac.uk/IEC
>
>
>
> SKYPE: daigriffiths
>
> UK Mobile +44 (0)7491151559
>
> Spanish Mobile: + 34 687955912
>
> Work: + 44 (0)7826917705
>
> (Please don't leave voicemail)
>
> email:
>
>    d.e.griffi...@bolton.ac.uk
>
>    dai.griffith...@gmail.com
>
>
> _______________________________________________
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>
>
>
>
>
> --
>
> Professor Terrence W. Deacon
> University of California, Berkeley
>



-- 
Professor Terrence W. Deacon
University of California, Berkeley
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
