Thanks Stan,

Yes, it's a powerful and useful process.

My problem is that in this list, and in other places where such matters are discussed, we don't seem to be able to agree on the big picture, and the higher up the generalisations we go, the less we agree.

I'd like to keep open the possibility that we might be yoking ideas together which it may be more useful to keep apart. We are dealing with messy concepts in messy configurations, which may not always map neatly onto a generalisation model.

Dai


On 22/12/16 16:45, Stanley N Salthe wrote:

Dai --

{phenomenon 1}
{phenomenon 2}   -->  {phenomena 1 & 2}  -->  {phenomena 1, 2, 3}
{phenomenon 3}

The process from left to right is generalization.

‘Information’ IS a generalization.

Generalities form the substance of philosophy. Info happens to be a case of generalization which can be mathematized, which in turn allows it to be generalized even more.

So, what’s the problem?

STAN


On Wed, Dec 21, 2016 at 7:44 AM, Dai Griffiths <dai.griffith...@gmail.com> wrote:

    >  Information is not “something out there” which “exists”
    otherwise than as our construct.

    I agree with this. And I wonder to what extent our problems in
    discussing information come from our desire to shoe-horn many
    different phenomena into the same construct. It would be possible
    to disaggregate the construct, and to discuss the topics which we
    address on this list without using the word 'information'. We
    could discuss redundancy, variety, constraint, meaning,
    structural coupling, coordination, expectation, language, etc.

    In what ways would our explanations be weakened?

    In what ways might we gain in clarity?

    If we were to go down this road, we would face the danger that our
    discussions might become (even more) remote from everyday human
    experience. But many scientific discussions are remote from
    everyday human experience.

    Dai

    On 20/12/16 08:26, Loet Leydesdorff wrote:

    Dear colleagues,

    A distribution contains uncertainty that can be measured in terms
    of bits of information.

    Alternatively: the expected information content H of a
    probability distribution is H = -Σ_i p_i log2(p_i).

    H is further defined as probabilistic entropy using Gibbs's
    formulation of the entropy, S = -k_B Σ_i p_i ln(p_i).

    This definition of information is an operational definition. In
    my opinion, we do not need an essentialistic definition by
    answering the question of “what is information?” As the
    discussion on this list demonstrates, one does not easily agree
    on an essential answer; one can, however, answer the question “how is
    information defined?” Information is not “something out there”
    which “exists” otherwise than as our construct.
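
    As an illustration of the operational character of this
    definition, H can be computed directly from any distribution. A
    minimal sketch in Python (the function name shannon_entropy is
    mine, for illustration):

        import math

        def shannon_entropy(probs):
            # Expected information content of a probability
            # distribution: H = -sum(p_i * log2(p_i)), in bits.
            return -sum(p * math.log2(p) for p in probs if p > 0)

        print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin
        print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: less uncertain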

    Using essentialistic definitions, the discussion tends not to
    move forward. Consider, for example, Stuart Kauffman’s and Bob
    Logan’s (2007) definition of information “as natural selection
    assembling the very constraints on the release of energy that
    then constitutes work and the propagation of organization.” I
    asked several times what this means and how one can measure this
    information. Hitherto, I have only obtained the answer that
    colleagues who disagree with me will be cited. :-) Another answer
    was that “counting” may lead to populism. :-)

    Best,

    Loet

    ------------------------------------------------------------------------

    Loet Leydesdorff

    Professor, University of Amsterdam
    Amsterdam School of Communication Research (ASCoR)

    l...@leydesdorff.net; http://www.leydesdorff.net/

    Associate Faculty, SPRU, University of Sussex;

    Guest Professor, Zhejiang Univ., Hangzhou; Visiting Professor,
    ISTIC, Beijing;

    Visiting Professor, Birkbeck, University of London;

    http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en

    From: Dick Stoute [mailto:dick.sto...@gmail.com]
    Sent: Monday, December 19, 2016 12:48 PM
    To: l...@leydesdorff.net
    Cc: James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
    Subject: Re: [Fis] What is information? and What is life?

    List,

    Please allow me to respond to Loet about the definition of
    information stated below.

    1. the definition of information as uncertainty is
    counter-intuitive ("bizarre"); (p. 27)

    I agree. I struggled with this definition for a long time before
    realising that Shannon was really discussing the "amount of
    information", or the number of bits needed to convey a message.
    He was looking for a formula that would provide an accurate
    estimate of the number of bits needed to convey a message, and
    realised that this number depends on the "amount" of uncertainty
    that has to be eliminated, and so he equated these.
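
    To make the equation concrete with a worked example: a message
    drawn from eight equally likely alternatives needs log2(8) = 3
    bits, because three yes/no answers suffice to single it out. A
    minimal Python sketch (the helper name bits_needed is
    hypothetical, for illustration):

        import math

        def bits_needed(n_alternatives):
            # Bits required to identify one of n equally likely
            # messages, i.e., to eliminate the uncertainty of the
            # choice among them.
            return math.log2(n_alternatives)

        print(bits_needed(2))  # 1.0 bit  (a coin flip)
        print(bits_needed(8))  # 3.0 bits (one message out of eight)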

    It makes sense to do this, but we must distinguish between the
    "amount of information" and "information". For example, we can
    measure the amount of water in liters, but this does not tell us
    what water is; likewise, the measure we use for the "amount of
    information" does not tell us what information is. We can, for
    example, equate the amount of water needed to fill a container
    with the volume of the container, but we should not think that
    water is therefore identical to an empty volume. Similarly, we
    should not think that information is identical to uncertainty.

    By equating the number of bits needed to convey a message with
    the "amount of uncertainty" that has to be eliminated Shannon, in
    effect, equated opposites so that he could get an estimate of the
    number of bits needed to eliminate the uncertainty.  We should
    not therefore consider that this equation establishes what
    information is.

    Dick

    On 18 December 2016 at 15:05, Loet Leydesdorff
    <l...@leydesdorff.net> wrote:

    Dear James and colleagues,

    Weaver (1949) made two major remarks about his coauthor
    Shannon's contribution:

    1. the definition of information as uncertainty is
    counter-intuitive ("bizarre"); (p. 27)

    2. "In particular, information must not be confused with
    meaning." (p. 8)

    The definition of information as relevant for a system of
    reference confuses information with "meaningful information" and
    thus sacrifices the surplus value of Shannon's counter-intuitive
    definition.

    “... an information observer that integrates interactive
    processes, such as physical interactions (photons stimulating
    the retina of the eye), human-machine interactions (this is the
    level that Shannon lives on), biological interactions (body
    temperature relative to touching ice or a heat source), social
    interactions (such as this forum started by Pedro), economic
    interactions (such as the stock market), ...” [Lerner, page 1].

    We are in need of a theory of meaning. Otherwise, one cannot
    measure meaningful information. In a previous series of
    communications we discussed redundancy from this perspective.

    Lerner introduces the mathematical expectation E[Sap] (the
    difference between the a priori entropy and the a posteriori
    entropy), which is distinguished from the notion of relative
    information Iap (Lerner, page 7).

    I = Σ_i q_i log2(q_i/p_i) expresses in bits of information the
    information generated when the a priori distribution (p) is
    turned into the a posteriori one (q). This follows within the
    Shannon framework without needing an observer. I use this
    equation, for example, in my 1995 book /The Challenge of
    Scientometrics/ (Chapters 8 and 9), with a reference to Theil
    (1972). The relative information is defined as H/H(max).
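
    A minimal Python sketch of these two quantities (the function
    names are mine, for illustration):

        import math

        def expected_information(q, p):
            # Theil's expected information, in bits: the information
            # generated when the a priori distribution p is turned
            # into the a posteriori distribution q.
            # (Assumes p_i > 0 wherever q_i > 0.)
            return sum(qi * math.log2(qi / pi)
                       for qi, pi in zip(q, p) if qi > 0)

        def relative_information(probs):
            # H / H(max), where H(max) = log2(n) for n categories.
            h = -sum(p * math.log2(p) for p in probs if p > 0)
            return h / math.log2(len(probs))

        prior = [0.25, 0.25, 0.25, 0.25]
        posterior = [0.7, 0.1, 0.1, 0.1]
        print(expected_information(posterior, prior))  # ~0.64 bits
        print(relative_information(posterior))         # ~0.68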

    I agree that the intuitive notion of information is derived from
    the Latin “in-formare” (Varela, 1979). But most of us no longer
    use “force” and “mass” in the intuitive (Aristotelian) sense. :-)
    The proliferation of the meanings of information, if confused
    with “meaningful information”, is indicative of an “index sui et
    falsi”, in my opinion. The repetitive discussion holds back the
    progress of this list. It is “like asking whether a glass is
    half empty or half full” (Hayles, 1990, p. 59).

    This act of forming an information process results in the
    construction of an observer that is the owner [holder] of
    information.

    The system of reference is then no longer the message, but the
    observer who provides meaning to the information (uncertainty). I
    agree that this is a selection process, but the variation first
    has to be specified independently (before it can be selected).

    And Lerner introduces the threshold between objective and
    subjective observers (page 27). This leads to a consideration of
    selection and cooperation that includes entanglement.

    I don’t see a direct relation between information and
    entanglement. An observer can be entangled.

    Best,

    Loet

    PS. Pedro: Let me assume that this is my second posting in the
    week which ends tonight. :-(





--

    4 Austin Dr. Prior Park St. James, Barbados BB23004
    Tel: 246-421-8855
    Cell: 246-243-5938




--
-----------------------------------------

Professor David (Dai) Griffiths
Professor of Education
School of Education and Psychology
The University of Bolton
Deane Road
Bolton, BL3 5AB

Office: T3 02
http://www.bolton.ac.uk/IEC

SKYPE: daigriffiths
UK Mobile +44 (0)7491151559
Spanish Mobile: + 34 687955912
Work: + 44 (0)7826917705
(Please don't leave voicemail)
email:
   d.e.griffi...@bolton.ac.uk
   dai.griffith...@gmail.com
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
