[Fis] What is information? and What is life?

2016-12-26 Thread Christophe
Dear Loet,
You nicely illustrate the problem as a “hole” at the center of the various 
perspectives. All these current and future perspectives are indeed needed, but 
it is true that “a general theory of information” remains terribly 
challenging, precisely because of the sometimes orthogonal perspectives of the 
different theories, as you say.
Now, perhaps the “hole” can be used as an image leading us far back in time, 
when our universe was only about matter and energy. The evolution of our 
universe could then be used as a reference frame for the history of information.
Such a time-guided background can be used for all the various perspectives and 
also highlights pitfalls like the mysterious natures of life and the human mind.
This brings us to take life as a starting point for the existence of meaningful 
information (as said, information should not be separated from meaning. Weaver 
rightly recommended not to confuse meaning with information; it is not about 
separating them).
So we could begin by positioning our investigations between life and the human mind 
to address the natures of information and meaning, which are realities at that 
level and can be modeled there in quite simple terms.
Then, being careful with the human mind, we could move on to human management of 
information and consider human achievements and current work: the measurement 
of quantity (channel capacity, Shannon) and the formalizations (physical, 
referential, normative, syntactic, semantic, pragmatic, constraint-satisfaction 
oriented, your communication/sharing of meaning or information, ...).
This does not really fill the “hole”, but it brings in evolution as a thread 
which leads us to start with the simplest task.
Wishing you and all FISers the best for the year's end and for the coming 2017.
Christophe


From: Fis on behalf of Loet Leydesdorff
Sent: Monday, 26 December 2016 14:01
To: 'Terrence W. DEACON'; 'Francesco Rizzo'; 'fis'
Subject: Re: [Fis] What is information? and What is life?


In this respect Loet comments:



"In my opinion, the status of Shannon’s mathematical theory of information is 
different  from special theories of information (e.g., biological ones) since 
the formal theory enables us to translate between these latter theories."



We are essentially in agreement, and yet I would invert any perspective that 
prioritizes the approach pioneered by Shannon.



Dear Terrence and colleagues,



The inversion is fine with me as an exploration. But I don’t think that this 
can be done on programmatic grounds because of the assumed possibility of “a 
general theory of information”. I don’t think that such a theory exists or is 
even possible without assumptions that beg the question.



In other words, we have a “hole” in the center. Each perspective can claim its 
“generality” or fundamental character. For example, many of us entertain a 
biological a priori; others (including you?) reason on the basis of physics. 
The various (special) theories, however, are not merely juxtaposed, but can be 
considered as different (sometimes orthogonal) perspectives. Translations are 
possible at the bottom by unpacking in normal language, or sometimes more 
formally (and more advanced; more productive?) using Shannon’s information theory and 
formalizations derived from it.



I admit my own communication-theoretical a priori. I am interested in the 
communication of knowledge as different from the communication of information. 
Discursive knowledge specifies and codifies meaning. The communication/sharing 
of meaning provides an in-between layer, which also has to be distinguished 
from the communication of information. Meaning is not relational but 
positional; it cannot be communicated, but it can be shared. I am currently 
working (with coauthors) on a full paper on the subject. The following is the 
provisional abstract:

As against a monadic reduction of knowledge and meaning to signal processing 
among neurons, we distinguish among information and meaning processing, and the 
possible codification of specific meanings as discursive knowledge. Whereas the 
Shannon-type information is coupled to the second law of thermodynamics, 
redundancy—that is, the complement of information to the maximum entropy—can be 
extended by further distinctions and the specification of expectations when new 
options are made feasible. With the opposite sign, the dynamics of knowledge 
production thus infuses the historical (e.g., institutional) dynamics with a 
cultural evolution. Meaning is provided from the perspective of hindsight as 
feedback on the entropy flow. The circling among dynamics in feedback and 
feedforward loops can be evaluated by the sign of mutual information. When 
mutual redundancy prevails, the resulting sign is negative, indicating that more 
options are made available and innovation can be expected to flourish. The 
relation of this cultural evolution to the computation of anticipation …
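
(A minimal numerical sketch of “redundancy as the complement of information to the maximum entropy”, using standard textbook formulas rather than anything from the forthcoming paper; the example distribution is invented for illustration.)

import math

def entropy_bits(p):
    # Shannon entropy H = -sum(p_i * log2 p_i), in bits
    return -sum(x * math.log2(x) for x in p if x > 0)

p = [0.5, 0.25, 0.125, 0.125]      # an illustrative distribution over four options
H = entropy_bits(p)                 # observed information content
H_max = math.log2(len(p))           # maximum entropy: all four options equally likely
redundancy = H_max - H              # the complement of information to the maximum entropy
print(H, H_max, redundancy)         # 1.75 2.0 0.25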

Re: [Fis] What is information? and What is life?

2016-12-18 Thread Loet Leydesdorff
Dear James and colleagues, 

 

Weaver (1949) made two major remarks about his coauthor (Shannon)'s
contribution:

 

1. the definition of information as uncertainty is counter-intuitive
("bizarre"); (p. 27)

2. "In particular, information must not be confused with meaning." (p. 8) 

 

The definition of information as relevant for a system of reference confuses
information with "meaningful information" and thus sacrifices the surplus
value of Shannon's counter-intuitive definition.

 

information observer

 

that integrates interactive processes such as 

 

physical interactions such as photons stimulating the retina of the eye,
human-machine interactions (this is the level that Shannon lives on),
biological interactions such as body temperature relative to touching ice or a heat
source, social interactions such as this forum started by Pedro, economic
interactions such as the stock market, ... [Lerner, page 1].

 

We are in need of a theory of meaning. Otherwise, one cannot measure
meaningful information. In a previous series of communications we discussed
redundancy from this perspective.

 

Lerner introduces the mathematical expectation E[Sap] (the difference between a
priory entropy [sic] and a posteriori entropy), which is distinguished from
the notion of relative information Iap (Lerner, page 7).

 

I = Σ q(i) log2[q(i)/p(i)] expresses in bits of information the information generated when the a
priori distribution p is turned into the a posteriori one q. This follows
within the Shannon framework without needing an observer. I use this
equation, for example, in my 1995 book The Challenge of Scientometrics
(Chapters 8 and 9), with a reference to Theil (1972). The relative
information is defined as H/H(max).
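
(A small sketch of this measure as I read Theil's formulation; the distributions are invented for illustration.)

import math

def information_gain_bits(prior, posterior):
    # I = sum(q_i * log2(q_i / p_i)): bits generated when the a priori
    # distribution (p) is turned into the a posteriori one (q)
    return sum(q * math.log2(q / p) for p, q in zip(prior, posterior) if q > 0)

prior = [0.25, 0.25, 0.25, 0.25]        # a priori: four equally likely options
posterior = [0.7, 0.1, 0.1, 0.1]        # a posteriori: observation favours the first option
print(information_gain_bits(prior, posterior))           # ~0.64 bits

H = -sum(q * math.log2(q) for q in posterior if q > 0)    # entropy of the a posteriori distribution
print(H / math.log2(len(posterior)))                      # relative information H/H(max), ~0.68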

 

I agree that the intuitive notion of information is derived from the Latin
"in-formare" (Varela, 1979). But most of us no longer use "force" and
"mass" in the intuitive (Aristotelian) sense. :-) The proliferation of the
meanings of information, when it is confused with "meaningful information", is
indicative of an "index sui et falsi", in my opinion. The repetitive
discussion hampers progress on this list. It is "like asking whether a
glass is half empty or half full" (Hayles, 1990, p. 59). 

 

This act of forming an information process results in the
construction of an observer that is the owner [holder] of information.

 

The system of reference is then no longer the message, but the observer who
provides meaning to the information (uncertainty). I agree that this is a
selection process, but the variation first has to be specified independently
(before it can be selected).

 

And Lerner introduces the threshold between objective and subjective
observers (page 27). This leads to a consideration of selection and
cooperation that includes entanglement.

 

I don't see a direct relation between information and entanglement. An
observer can be entangled.

 

Best, 

Loet

 

PS. Pedro: Let me assume that this is my second posting in the week which
ends tonight. L.

 



Re: [Fis] What is information? and What is life?

2016-12-19 Thread Dick Stoute
List,

Please allow me to respond to Loet about the definition of information
stated below.

1. the definition of information as uncertainty is counter-intuitive
("bizarre"); (p. 27)



I agree.  I struggled with this definition for a long time before realising
that Shannon was really discussing "amount of information" or the number of
bits needed to convey a message.  He was looking for a formula that would
provide an accurate estimate of the number of bits needed to convey a
message and realised that the amount of information (number of bits) needed
to convey a message was dependent on the "amount" of uncertainty that had
to be eliminated, and so he equated these.


It makes sense to do this, but we must distinguish between "amount of
information" and "information".  For example, we can measure amount of
water in liters, but this does not tell us what water is and likewise the
measure we use for "amount of information" does not tell us what
information is. We can, for example, equate the amount of water needed to
fill a container with the volume of the container, but we should not think
that water is therefore identical to an empty volume.  Similarly we should
not think that information is identical to uncertainty.


By equating the number of bits needed to convey a message with the "amount
of uncertainty" that has to be eliminated Shannon, in effect, equated
opposites so that he could get an estimate of the number of bits needed to
eliminate the uncertainty.  We should not therefore consider that this
equation establishes what information is.
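
(A toy illustration of the equating Dick describes; the numbers are mine, not from the post.)

import math

# One message out of 8 equally likely possibilities: the receiver's prior
# uncertainty is log2(8) = 3 bits, and 3 binary digits suffice to single out
# the message, so the bits transmitted equal the uncertainty eliminated.
n_messages = 8
uncertainty_bits = math.log2(n_messages)
digits_needed = math.ceil(math.log2(n_messages))
print(uncertainty_bits, digits_needed)   # 3.0 3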


Dick




-- 

4 Austin Dr. Prior Park St. James, Barbados BB23004
Tel:   246-421-8

Re: [Fis] What is information? and What is life?

2016-12-19 Thread Karl Javorszky
What is Information?



Once more, Occam and the numbers give a simple, short and concise
explanation. (There is more text and a formal definition of information in
my book “Natural Orders” ISBN: 9783990571378.)



The root of the term “information” is in the concept of order. The idea of
order can be assumed axiomatic for people who are interested in the
definition of information. For those who need a deictic definition of the
term “order”: take n (n > 3) objects. One can use teddy-bears, shoes,
pieces of paper, numbers, whatever. We sort the objects. The sequence that
arrives after we have ordered the objects is the deictic definition of the
term “order”. This may sound at variance with the use of the term in
mathematics; to reconcile the two concepts, we point out that the
traditional definition in mathematics refers to order as a potential,
realisable property of the collection, while here we speak of order as a
realised instance of the general faculty of the objects to be in order. The
distinction is always clear from the context. A collection that shows a
sequence of its elements is an ordered collection.



From the order to the information:

Whichever order exists, it has alternatives and a background. The
alternatives are those variants of the order which are not realised; the
background is that state of the world about which we cannot say anything
definite.

To repeat:

N distinguishable objects have n! possible permutations. We take one
specific permutation. In this permutation, a1 is on place p1. The
alternatives are those permutations where a1 is not on place p1. The
background consists of those permutations where a2, a3, etc. can be on places px,
py, pz, etc.
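
(A small sketch of this partition for n = 4, on my reading of “alternatives” and “background”; the object names are placeholders.)

from itertools import permutations

objects = ("a1", "a2", "a3", "a4")       # n = 4, so 4! = 24 possible orders
realized = objects                        # the one order that is actually the case: a1 on place p1

others = [p for p in permutations(objects) if p != realized]
alternatives = [p for p in others if p[0] != "a1"]   # a1 is not on place p1
background = [p for p in others if p[0] == "a1"]     # a1 stays on p1, the remaining objects vary
print(len(others), len(alternatives), len(background))   # 23 18 5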



Novelty:

We also construct logical sentences which state something that is not
the case. These are false logical statements. Traditionally, one does not
use logical sentences that are false. This is not a rule given by a Supreme
Logician from Heaven, but a convention of convenience. It would have been
inconceivable to write up all those results of a false multiplication table
that are not correct, besides the conceptual aversion against doing so.

Now we have computers. These can register all that is not the case, as
long as we restrict ourselves to using rather few logical words while we
utter a logical sentence. The babbling of an infant can be of high
scientific value, in case one wants to learn how the acquisition of
language progresses during infancy. We now do not care whether the
sentence is logically true or not, as long as it is grammatically correct.
We simply write up all possible sentences that a child can express. Among
these, there are some in which the child states something correctly (e.g.
recognises and names a toy), some where the child names an alternative
(e.g. calls a doll a ball), and some which are the background, neither
surely true nor surely false (e.g. the child calls something a cluxtli and
we do not know what the child has been looking at in that moment). We
investigate all three aspects of the order: the actual sequence, its
excluded alternatives and the background to these.



Resume:

Traditionally, tertium non datur, therefore that what is not the case is
defined simply as .not. .t. = .false. Now we have a more complicated logic,
and .unknown. is also permitted.

After the unknown has softened up the trivial and lazy definition of “what
is not true is false and we do not speak false”, there is a need for a word
to describe that which is not true, but is a background to that which is
true. Against this background, one may recognise the shadow of what is the
case: that which is definitely not the case.



Information:

Information is that which we do know AND do not know about the state of the
world. (Footnote to “The world is everything that is the case”: “The
alternatives to that which is the case are described by sentences about that
which cannot be the case, and the background to that which is the case is
described by sentences that state that which can be the case”. And to “About
that which is not the case, one should keep one’s silence”, the following:
“unless and until one has found a way not to think with one’s own brain”.)

Traditionally, that which is not the case has been seen as one solid logical
entity, defined, well, by the fact that it is not the case, as a contrast
to that which reasonable people can speak reasonably about. Computers allow
us to slice thinly between layers of what is not the case. That which oozes
out is information.

The relation among that which is the case, that which it implicates and
that which is not affected by that which is the case needs some new words to
allow precision in thinking. One of the words that are available is
“information”. Scientists have always supposed that there is something
hidden, yet obviously at work, behind symbols, bits and logical statements.
Forcing this oyster open shows its inner life. The interdependence between
what can be the case …

Re: [Fis] What is information? and What is life?

2016-12-19 Thread Bob Logan
Dear Dick - I loved your analysis. You are right on the money. It also explains 
why Shannon dominated the field of information. He had a mathematical formula 
and there is nothing more appealing to a scientist than a mathematical formula. 
But you are right: his formula only tells us how many bits are needed to 
represent some information; it tells us nothing about its meaning or its 
significance. As Marshall McLuhan said about Shannon information, it is figure 
without ground. A figure only acquires meaning when one understands the ground 
in which it operates. So Shannon’s contribution to engineering is excellent, but 
it tells us nothing about the nature of information or its impact, as you wisely pointed out. 
Thanks for your insight.

I would like to refer to your insight the next time I write about info and want 
to attribute you correctly. Can you tell me a bit about yourself, like where you 
do your research? Thanks - Bob Logan


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications



Re: [Fis] What is information? and What is life?

2016-12-20 Thread Loet Leydesdorff
Dear colleagues, 

 

A distribution contains uncertainty that can be measured in terms of bits of 
information.

Alternatively: the expected information content H of a probability distribution 
is H = -Σ p(i) log2 p(i).

H is further defined as probabilistic entropy using Gibbs’ formulation of the 
entropy S = -k_B Σ p(i) ln p(i).

 

This definition of information is an operational definition. In my opinion, we 
do not need an essentialistic definition by answering the question of “what is 
information?” As the discussion on this list demonstrates, one does not easily 
agree on an essential answer; one can answer the question “how is information 
defined?” Information is not “something out there” which “exists” otherwise 
than as our construct.
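
(A minimal sketch of this operational definition, using the standard Shannon formula; the distributions are illustrative only.)

import math

def shannon_entropy_bits(p):
    # expected information content H = -sum(p_i * log2 p_i) of a distribution, in bits
    return -sum(x * math.log2(x) for x in p if x > 0)

print(shannon_entropy_bits([0.5, 0.5]))    # a fair coin: 1.0 bit of uncertainty
print(shannon_entropy_bits([0.9, 0.1]))    # a biased coin: ~0.47 bits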

 

Using essentialistic definitions, the discussion tends not to move forward. Take, for 
example, Stuart Kauffman’s and Bob Logan’s (2007) definition of information “as 
natural selection assembling the very constraints on the release of energy that 
then constitutes work and the propagation of organization.” I asked several 
times what this means and how one can measure this information. Hitherto, I 
only obtained the answer that colleagues who disagree with me will be cited. :-) 
Another answer was that “counting” may lead to populism. :-) 

 

Best,

Loet

 

--

Loet Leydesdorff 
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net ; http://www.leydesdorff.net/ 
Associate Faculty, SPRU (http://www.sussex.ac.uk/spru/), University of Sussex; 
Guest Professor, Zhejiang Univ. (http://www.zju.edu.cn/english/), Hangzhou; 
Visiting Professor, ISTIC (http://www.istic.ac.cn/Eng/brief_en.html), Beijing;
Visiting Professor, Birkbeck (http://www.bbk.ac.uk/), University of London; 
http://scholar.google.com/citations?user=ych9gNYJ&hl=en

 


Re: [Fis] What is information? and What is life?

2016-12-20 Thread Bob Logan
Loet - thanks for the mention of our (Kauffman, Logan et al.) definition of 
information, which is a qualitative description of information. As 
to whether one can measure information with our description, my response is no, 
but I am not sure that one can measure information at all. What units would one 
use to measure information? E = mc² contains a lot of information, but the 
amount of information depends on context. A McLuhan one-liner such as 'the 
medium is the message' also contains a lot of information even though it is 
only 5 words or 26 characters long. 

Hopefully I have provided some information, but how much is impossible to measure.

Bob




__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.researchgate.net/profile/Robert_Logan5/publications
https://www.physics.utoronto.ca/people/homepages/logan/













Re: [Fis] What is information? and What is life?

2016-12-20 Thread Mark Johnson

Re: [Fis] What is information? and What is life?

2016-12-22 Thread Dai Griffiths
>  Information is not “something out there” which “exists” otherwise 
than as our construct.


I agree with this. And I wonder to what extent our problems in 
discussing information come from our desire to shoe-horn many different 
phenomena into the same construct. It would be possible to disaggregate 
the construct. It would be possible to discuss the topics which we address on 
this list without using the word 'information'. We could discuss 
redundancy, variety, constraint, meaning, structural coupling, 
coordination, expectation, language, etc.


In what ways would our explanations be weakened?

In what ways might we gain in clarity?

If we were to go down this road, we would face the danger that our 
discussions might become (even more) remote from everyday human 
experience. But many scientific discussions are remote from everyday 
human experience.


Dai


Re: [Fis] What is information? and What is life?

2016-12-22 Thread Terrence W. DEACON

Re: [Fis] What is information? and What is life?

2016-12-22 Thread Stanley N Salthe
Dai --

{phenomenon 1}

{phenomenon 2}   -->  {Phenomena 1 & 2} ---> {phenomena 1,2,3}

{phenomenon 3}

The process from left to right is generalization.

‘Information’ IS a generalization.

Generalities form the substance of philosophy. Info happens to be a case

 of generalization which can be mathematized, which in turn allows

 it to be generalized even more.

So, what’s the problem?

STAN


Re: [Fis] What is information? and What is life?

2016-12-23 Thread Loet Leydesdorff
Dear Terrence and colleagues, 

 

I agree that we should not be fundamentalistic about “information”. For 
example, one can also use “uncertainty” as an alternative word to Shannon-type 
“information”. One can also make distinctions other than 
semantic/syntactic/pragmatic, such as biological information, etc.

 

Nevertheless, what makes this list a common platform, in my opinion, is our 
interest in the differences and similarities in the background of these 
different notions of information. In my opinion, the status of Shannon’s 
mathematical theory of information is different from special theories of 
information (e.g., biological ones) since the formal theory enables us to 
translate between these latter theories. The translations are heuristically 
important: they enable us to import metaphors from other backgrounds (e.g., 
auto-catalysis).

 

For example, one of us wrote to me explaining why I was completely wrong, and 
made the argument with reference to the Kullback-Leibler divergence between two 
probability distributions. Since we probably will not have “a general theory” 
of information, the apparatus in which information is formally and 
operationally defined—Bar-Hillel once called it “information calculus”—can 
carry this interdisciplinary function with precision and rigor. Otherwise, we 
can only be respectful of each other’s research traditions. :-)

 

I wish you all a splendid 2017,

Loet   

 

--

Loet Leydesdorff 
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net ; http://www.leydesdorff.net/ 
Associate Faculty, SPRU (http://www.sussex.ac.uk/spru/), University of Sussex; 
Guest Professor, Zhejiang Univ. (http://www.zju.edu.cn/english/), Hangzhou; 
Visiting Professor, ISTIC (http://www.istic.ac.cn/Eng/brief_en.html), Beijing;
Visiting Professor, Birkbeck (http://www.bbk.ac.uk/), University of London; 
http://scholar.google.com/citations?user=ych9gNYJ&hl=en

 

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Terrence W. DEACON
Sent: Thursday, December 22, 2016 5:33 AM
To: fis
Subject: Re: [Fis] What is information? and What is life?

 

Against information fundamentalism

 

Rather than fighting over THE definition of information, I suggest that we 
stand back from the polemics for a moment and recognize that the term is being 
used in often quite incompatible ways in different domains, and that there may 
be value in paying attention to the advantages and costs of each. If we ignore 
these differences, fail to explore the links and dependencies between them, 
and remain indifferent to the different use values gained or sacrificed by each, 
then I believe we end up undermining the very enterprise we claim to be 
promoting.

 

We currently lack broadly accepted terms to unambiguously distinguish these 
divergent uses and, even worse, we lack a theoretical framework for 
understanding their relationships to one another.

So provisionally I would argue that we at least need to distinguish three 
hierarchically related uses of the concept:

 

1. Physical information: Information as intrinsically measurable medium 
properties with respect to their capacity to support 2 or 3 irrespective of any 
specific instantiation of 2 or 3.

 

2. Referential information: information as a non-intrinsic relation to 
something other than medium properties (1) that a given medium can provide 
(i.e. reference or content) irrespective of any specific instantiation of 3.

 

3. Normative information: Information as the use value provided by a given 
referential relation (2) with respect to an end-directed dynamic that is 
susceptible to contextual factors that are not directly accessible (i.e. 
functional value or significance).

 

Unfortunately, because of the history of using the same term in an unmodified 
way in each relevant domain irrespective of the others, there are often 
pointless arguments of a purely definitional nature.

 

In linguistic theory an analogous three-part hierarchic partitioning of theory 
IS widely accepted. 

 

1. syntax

2. semantics

3. pragmatics

 

Thus by analogy some have proposed the distinction between

 

1. syntactic information (aka Shannon)

2. semantic information (aka meaning)

3. pragmatic information (aka useful information)

 

This has also often been applied to the philosophy of information (e.g. see the 
Stanford Encyclopedia of Philosophy entry for ‘information’). Unfortunately, the 
language-centric framing of this distinction can be somewhat misleading. The 
metaphoric extension of the terms ‘syntax’ and ‘semantics’ to apply to iconic 
(e.g. pictorial) or indexical (e.g. correlational) forms of communication 
exerts a subtle procrustean influence that obscures their naturalistic and 
nondigital features …

Re: [Fis] What is information? and What is life?

2016-12-24 Thread Francesco Rizzo
Dear All,
I have written the same things several times, which is why I agree with you,
especially with the most recent contributors. And given that I am an outsider
to your disciplines, but not a stranger to the harmony of knowledge or the
knowledge of harmony, this is a fine thing. Best wishes for a merry Christmas
and for the new year.
Francesco

2016-12-24 7:45 GMT+01:00 Loet Leydesdorff :

> Dear Terrence and colleagues,
>
>
>
> I agree that we should not be fundamentalistic about “information”. For
> example, one can also use “uncertainty” as an alternative word to
> Shannon-type “information”. One can also make distinctions other than
> semantic/syntactic/pragmatic, such as biological information, etc.
>
>
>
> Nevertheless, what makes this list a common platform, in my opinion, is
> our interest in the differences and similarities in the background of these
> different notions of information. In my opinion, the status of Shannon’s
> mathematical theory of information is different from special theories of
> information (e.g., biological ones) since the formal theory enables us to
> translate between these latter theories. The translations are heuristically
> important: they enable us to import metaphors from other backgrounds (e.g.,
> auto-catalysis).
>
>
>
> For example, one of us wrote to me explaining why I was completely wrong,
> and made the argument with reference to the Kullback-Leibler divergence between
> two probability distributions. Since we probably will not have “a general
> theory” of information, the apparatus in which information is formally and
> operationally defined—Bar-Hillel once called it “information calculus”—can
> carry this interdisciplinary function with precision and rigor. Otherwise,
> we can only be respectful of each other’s research traditions.
>
>
>
> I wish you all a splendid 2017,
>
> Loet
>
>
> --
>
> Loet Leydesdorff
>
> Professor, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
>
> l...@leydesdorff.net ; http://www.leydesdorff.net/
> Associate Faculty, SPRU, <http://www.sussex.ac.uk/spru/>University of
> Sussex;
>
> Guest Professor Zhejiang Univ. <http://www.zju.edu.cn/english/>,
> Hangzhou; Visiting Professor, ISTIC,
> <http://www.istic.ac.cn/Eng/brief_en.html>Beijing;
>
> Visiting Professor, Birkbeck <http://www.bbk.ac.uk/>, University of
> London;
>
> http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
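
The Kullback-Leibler divergence mentioned in Loet's message above can be made 
concrete with a minimal sketch (Python; the two distributions p and q are 
invented for illustration and are not taken from the discussion):

    import math

    def kl_divergence(p, q):
        """Kullback-Leibler divergence D(p || q) in bits between two
        probability distributions over the same outcomes (q must be
        non-zero wherever p is non-zero)."""
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    # Illustrative distributions over four outcomes.
    p = [0.5, 0.25, 0.125, 0.125]
    q = [0.25, 0.25, 0.25, 0.25]

    print(kl_divergence(p, q))  # 0.25 bits generated in moving from q to p
    print(kl_divergence(p, p))  # 0.0 bits: identical distributions

In this sense the divergence measures, in bits, the information generated when 
an a priori distribution (q) is replaced by an a posteriori one (p).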

Re: [Fis] What is information? and What is life?

2016-12-26 Thread Loet Leydesdorff
In this respect Loet comments:

 

"In my opinion, the status of Shannon’s mathematical theory of information is 
different  from special theories of information (e.g., biological ones) since 
the formal theory enables us to translate between these latter theories."

 

We are essentially in agreement, and yet I would invert any perspective that 
prioritizes the approach pioneered by Shannon. 

 

Dear Terrence and colleagues, 

 

The inversion is fine with me as an exploration. But I don’t think that this 
can be done on programmatic grounds because of the assumed possibility of “a 
general theory of information”. I don’t think that such a theory exists or is 
even possible without assumptions that beg the question. 

 

In other words, we have a “hole” in the center. Each perspective can claim its 
“generality” or fundamental character. For example, many of us entertain a 
biological a priori; others (including you?) reason on the basis of physics. 
The various (special) theories, however, are not juxtaposed; but can be 
considered as other (sometimes orthogonal) perspectives. Translations are 
possible at the bottom by unpacking in normal language or sometimes more 
formally (and advanced; productive?) using Shannon’s information theory and 
formalizations derived from it.

 

I admit my own communication-theoretical a priori. I am interested in the 
communication of knowledge as different from the communication of information. 
Discursive knowledge specifies and codifies meaning. The communication/sharing 
of meaning provides an in-between layer, which has also to be distinguished 
from the communication of information. Meaning is not relational but 
positional; it cannot be communicated, but it can be shared. I am currently 
working (with coauthors) on a full paper on the subject. The following is the 
provisional abstract: 

As against a monadic reduction of knowledge and meaning to signal processing 
among neurons, we distinguish among information processing, meaning processing, 
and the possible codification of specific meanings as discursive knowledge. 
Whereas Shannon-type information is coupled to the second law of thermodynamics, 
redundancy—that is, the complement of information to the maximum entropy—can be 
extended by further distinctions and the specification of expectations when new 
options are made feasible. With the opposite sign, the dynamics of knowledge 
production thus infuses the historical (e.g., institutional) dynamics with a 
cultural evolution. Meaning is provided from the perspective of hindsight as 
feedback on the entropy flow. The circling among dynamics in feedback and 
feedforward loops can be evaluated by the sign of mutual information. When 
mutual redundancy prevails, the resulting sign is negative, indicating that more 
options are made available and innovation can be expected to flourish. The 
relation of this cultural evolution to the computation of anticipatory systems 
can be specified, but the resulting puzzles are a subject for future research.
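
The abstract relies on two standard quantities: the mutual information between 
two variables and redundancy as the complement of the information to the maximum 
entropy. The sketch below computes only these textbook ingredients with invented 
numbers; it does not implement the signed "mutual redundancy" measure of the 
paper itself:

    import math

    def entropy(p):
        """Shannon entropy in bits of a probability distribution."""
        return -sum(x * math.log2(x) for x in p if x > 0)

    def mutual_information(joint):
        """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution (2-D list)."""
        px = [sum(row) for row in joint]
        py = [sum(col) for col in zip(*joint)]
        hxy = entropy([cell for row in joint for cell in row])
        return entropy(px) + entropy(py) - hxy

    def redundancy(p):
        """The complement of the information to the maximum entropy: Hmax - H."""
        return math.log2(len(p)) - entropy(p)

    joint = [[0.4, 0.1],
             [0.1, 0.4]]                      # illustrative joint distribution
    print(mutual_information(joint))          # ~0.28 bits shared by X and Y
    print(redundancy([0.7, 0.1, 0.1, 0.1]))   # ~0.64 bits of unused capacity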

Best,

Loet

 

  _  

Loet Leydesdorff 

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

  l...@leydesdorff.net ;  
 http://www.leydesdorff.net/ 
Associate Faculty,   SPRU, University of Sussex; 

Guest Professor   Zhejiang Univ., Hangzhou; 
Visiting Professor,   ISTIC, Beijing;

Visiting Professor,   Birkbeck, University of London; 

  
http://scholar.google.com/citations?user=ych9gNYJ&hl=en

 

 



Re: [Fis] What is information? and What is life?

2016-12-29 Thread Dai Griffiths

Thanks Stan,

Yes, it's a powerful and useful process.

My problem is that in this list, and in other places where such matters 
are discussed, we don't seem to be able to agree on the big picture, and 
the higher up the generalisations we go, the less we agree.


I'd like to keep open the possibility that we might be yoking ideas 
together which it may be more useful to keep apart. We are dealing with 
messy concepts in messy configurations, which may not always map neatly 
onto a generalisation model.


Dai


On 22/12/16 16:45, Stanley N Salthe wrote:


Dai --

{phenomenon 1}
{phenomenon 2}   -->  {phenomena 1 & 2}  -->  {phenomena 1, 2, 3}
{phenomenon 3}

The process from left to right is generalization.

‘Information’ IS a generalization.

Generalities form the substance of philosophy. Info happens to be a case
of generalization which can be mathematized, which in turn allows
it to be generalized even more.

So, what’s the problem?

STAN


On Wed, Dec 21, 2016 at 7:44 AM, Dai Griffiths <dai.griffith...@gmail.com> wrote:


>  Information is not “something out there” which “exists”
otherwise than as our construct.

I agree with this. And I wonder to what extent our problems in
discussing information come from our desire to shoe-horn many
different phenomena into the same construct. It would be possible
to disaggregate the construct. It would be possible to discuss the
topics which we address on this list without using the word
'information'. We could discuss redundancy, variety, constraint,
meaning, structural coupling, coordination, expectation, language,
etc.

In what ways would our explanations be weakened?

In what ways might we gain in clarity?

If we were to go down this road, we would face the danger that our
discussions might become (even more) remote from everyday human
experience. But many scientific discussions are remote from
everyday human experience.

Dai

On 20/12/16 08:26, Loet Leydesdorff wrote:


Dear colleagues,

A distribution contains uncertainty that can be measured in terms
of bits of information.

Alternatively: the expected information content H of a probability
distribution is H = -Σ p(i) log2 p(i).

H is further defined as probabilistic entropy using Gibbs’ formulation
of the entropy, S = -k_B Σ p(i) ln p(i).

This definition of information is an operational definition. In
my opinion, we do not need an essentialistic definition by
answering the question of “what is information?” As the
discussion on this list demonstrates, one does not easily agree
on an essential answer; one can answer the question “how is
information defined?” Information is not “something out there”
which “exists” otherwise than as our construct.
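
As a purely illustrative aside (the distribution below is invented for the 
example), the operational definition above can be computed directly:

    import math

    def shannon_entropy(p):
        """Expected information content H of a probability distribution, in bits."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    p = [0.5, 0.25, 0.125, 0.125]        # an invented four-outcome distribution
    print(shannon_entropy(p))            # 1.75 bits
    print(shannon_entropy([0.25] * 4))   # 2.0 bits: uniform = maximum entropy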

Using essentialistic definitions, the discussion tends not to
move forward. For example, Stuart Kauffman’s and Bob Logan’s
(2007) definition of information “as natural selection assembling
the very constraints on the release of energy that then
constitutes work and the propagation of organization.” I asked
several times what this means and how one can measure this
information. Hitherto, I only obtained the answer that colleagues
who disagree with me will be cited. Another answer was that
“counting” may lead to populism.

Best,

Loet



Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net ; http://www.leydesdorff.net/
Associate Faculty, SPRU, <http://www.sussex.ac.uk/spru/> University of Sussex;

Guest Professor Zhejiang Univ. <http://www.zju.edu.cn/english/>, Hangzhou;
Visiting Professor, ISTIC, <http://www.istic.ac.cn/Eng/brief_en.html> Beijing;

Visiting Professor, Birkbeck <http://www.bbk.ac.uk/>, University of London;

http://scholar.google.com/citations?user=ych9gNYJ&hl=en

*From:*Dick Stoute [mailto:dick.sto...@gmail.com]
*Sent:* Monday, December 19, 2016 12:48 PM
*To:* l...@leydesdorff.net <mailto:l...@leydesdorff.net>
    *Cc:* James Peters; u...@umces.edu <mailto:u...@umces.edu>; Alex
Hankey; FIS Webinar
*Subject:* Re: [Fis] What is information? and What is life?

List,

Please allow me to respond to Loet about the definition of
information stated below.

1. the definition of information as uncertainty is
counter-intuitive ("bizarre"); (p. 27)

I agree.  I struggled with this definition for a long time before
realising that




Re: [Fis] What is information? and What is life?

2016-12-31 Thread Loet Leydesdorff
ion by answering the question of “what is 
information?” As the discussion on this list demonstrates, one does not easily 
agree on an essential answer; one can answer the question “how is information 
defined?” Information is not “something out there” which “exists” otherwise 
than as our construct.

 

Using essentialistic definitions, the discussion tends not to move forward. For 
example, Stuart Kauffman’s and Bob Logan’s (2007) definition of information “as 
natural selection assembling the very constraints on the release of energy that 
then constitutes work and the propagation of organization.” I asked several 
times what this means and how one can measure this information. Hitherto, I 
only obtained the answer that colleagues who disagree with me will be cited. 
Another answer was that “counting” may lead to populism.

 

Best,

Loet

 


  _  


Loet Leydesdorff 

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net ; http://www.leydesdorff.net/ 
Associate Faculty,  <http://www.sussex.ac.uk/spru/> SPRU, University of Sussex; 

Guest Professor  <http://www.zju.edu.cn/english/> Zhejiang Univ., Hangzhou; 
Visiting Professor,  <http://www.istic.ac.cn/Eng/brief_en.html> ISTIC, Beijing;

Visiting Professor,  <http://www.bbk.ac.uk/> Birkbeck, University of London; 

http://scholar.google.com/citations?user=ych9gNYJ&hl=en

 

From: Dick Stoute [mailto:dick.sto...@gmail.com] 
Sent: Monday, December 19, 2016 12:48 PM
To: l...@leydesdorff.net
Cc: James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
Subject: Re: [Fis] What is information? and What is life?

 

List,

 

Please allow me to respond to Loet about the definition of information stated 
below.  

 

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)

 

I agree.  I struggled with this definition for a long time before realising 
that Shannon was really discussing "amount of information" or the number of 
bits needed to convey a message.  He was looking for a formula that would 
provide an accurate estimate of the number of bits needed to convey a message 
and realised that the amount of information (number of bits) needed to convey a 
message was dependent on the "amount" of uncertainty that had to be eliminated 
and so he equated these.  

 

It makes sense to do this, but we must distinguish between "amount of 
information" and "information".  For example, we can measure an amount of water in 
liters, but this does not tell us what water is, and likewise the measure we use 
for "amount of information" does not tell us what information is. We can, for 
example, equate the amount of water needed to fill a container with the volume 
of the container, but we should not think that water is therefore identical to 
an empty volume.  Similarly, we should not think that information is identical 
to uncertainty.

 

By equating the number of bits needed to convey a message with the "amount of 
uncertainty" that has to be eliminated, Shannon, in effect, equated opposites so 
that he could get an estimate of the number of bits needed to eliminate the 
uncertainty.  We should not therefore consider that this equation establishes 
what information is. 
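
A minimal sketch of the equation Dick describes, for the simplest case of 
equally likely messages only (numbers invented for illustration); it counts an 
amount of information and says nothing about what information "is":

    import math

    def bits_needed(n_equally_likely_messages):
        """Bits required to single out one of n equally likely messages,
        i.e. the uncertainty that conveying the message eliminates."""
        return math.log2(n_equally_likely_messages)

    print(bits_needed(8))  # 3.0 bits: more prior uncertainty, more bits needed
    print(bits_needed(2))  # 1.0 bit:  less prior uncertainty, fewer bits needed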

 

Dick

 

 

On 18 December 2016 at 15:05, Loet Leydesdorff  wrote:

Dear James and colleagues, 

 

Weaver (1949) made two major remarks about his coauthor (Shannon)'s 
contribution:

 

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)

2. "In particular, information must not be confused with meaning." (p. 8) 

 

The definition of information as relevant for a system of reference confuses 
information with "meaningful information" and thus sacrifices the surplus value 
of Shannon's counter-intuitive definition.

 

information observer

 

that integrates interactive processes such as 

 

physical interactions such as photons stimulating the retina of the eye, 
human-machine interactions (this is the level that Shannon lives on), 
biological interactions such as body temperature relative to touching ice or a heat 
source, social interactions such as this forum started by Pedro, economic 
interactions such as the stock market, ... [Lerner, page 1].

 

We are in need of a theory of meaning. Otherwise, one cannot measure meaningful 
information. In a previous series of communications we discussed redundancy 
from this perspective.

 

Lerner introduces the mathematical expectation E[Sap] (the difference between the a 
priori entropy and the a posteriori entropy), which is distinguished from the 
notion of relative information Iap (Lerner, page 7).

 

This measure expresses in bits of information the information generated when the a priori 
distribution is turned into the a posteriori one. This follows within the 
Shannon framework without needing an observer. I use this


Re: [Fis] What is information? and What is life?

2017-01-10 Thread John Collier
Dear List,

I agree with Terry that we should not be bound by our own partial theories. We 
need an integrated view of information that shows its relations in all of its 
various forms. There is a family resemblance in the ways it is used, and some 
sort of taxonomy can be constructed. I recommend that of Luciano Floridi. His 
approach is not unified (unlike my own, reported on this list), but compatible 
with it, and is a place to start, though it needs expansion and perhaps 
modification. There may be some unifying concept of information, but its 
application to all the various ways it has been used will not be obvious, and a 
sufficiently general formulation may well seem trivial, especially to those 
interested in the vital communicative and meaningful aspects of information. I 
also agree with Loet that pessimism, however justified, is not the real 
problem. To some extent it is a matter of maturity, which takes both time and 
development, not to mention giving up cherished juvenile enthusiasms.

I might add that constructivism, with its positivist underpinnings, tends to 
lead to nominalism and relativism about whatever is out there. I believe that 
this is a major hindrance to a unified understanding. I understand that it 
appeared in reaction to an overzealous and simplistic realism about science and 
other areas, but I think it throws the baby out with the bathwater.

I have been really ill, hence my lack of communication. I am pleased to see this 
discussion, which is necessary for the field to develop maturity. I thought I 
should add my bit, and wish everyone a Happy New Year, with all its 
possibilities.

Warmest regards to everyone,
John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Loet Leydesdorff
Sent: December 31, 2016 12:16 AM
To: 'Terrence W. DEACON' ; 'Dai Griffiths' 
; 'Foundations of Information Science Information 
Science' 
Subject: Re: [Fis] What is information? and What is life?

We agree that such a theory is a ways off, though some of you are far more 
pessimistic about its possibility than me. I believe that we would do best to 
focus on the hole that needs filling in rather than assuming that it is an 
unfillable given.

Dear Terrence and colleagues,

It is not a matter of pessimism. We have the example of “General Systems 
Theory” of the 1930s (von Bertalanffy and others). Only gradually did one 
realize the biological metaphor driving it. In my opinion, we have become 
reflexively skeptical about claims of “generality” because we know the 
statements are framed within paradigms. Translations are needed in this 
fractional manifold.

I agree that we are moving in a fruitful direction. Your books “Incomplete 
Nature” and “The Symbolic Species” have been important. The failing options 
cannot be observed, but have to be constructed culturally, that is, in 
discourse. It seems to me that we need a kind of calculus of redundancy. 
Perspectives which are reflexively aware of this need and do not assume an 
unproblematic “given” or “natural” are perhaps to be privileged nonetheless. 
The unobservable options have first to be specified, and we need theory 
(hypotheses) for this. Perhaps this epistemological privilege can be used as a 
vantage point.

There is an interesting relation to Husserl’s Crisis of the European Sciences 
(1935): the failing (or forgotten) dimension is grounded in “intersubjective 
intentionality.” Nowadays, we would call this “discourse”. How are discourses 
structured, and how can they be translated for the purpose of offering this 
“foundation”?

Happy New Year,
Loet

My modest suggestion is only that in the absence of a unifying theory we should 
not privilege one partial theory over others and that in the absence of a 
global general theory we need to find terminology that clearly identifies the 
level at which the concept is being used. Lacking this, we end up debating 
incompatible definitions, and defending our favored one that either excludes or 
includes issues of reference and significance or else assumes or denies the 
relevance of human interpreters. With different participants interested in 
different levels and applications of the information concept—from physics, to 
computation, to neuroscience, to biosemiotics, to language, to art, 
etc.—failure to mark this diversity will inevitably lead us in circles.

I urge humility with precision and an eye toward synthesis.

Happy new year to all.

— Terry

On Thu, Dec 29, 2016 at 12:30 PM, Dai Griffiths <dai.griffith...@gmail.com> wrote:

Thanks Stan,

Yes, it's a powerful and useful process.
My problem is that in this list, and in other places where such matters are 
discussed, we don't seem to be able to agree on the big picture, and the higher 
up the generalisations we go, the less we agree.

I'd like to keep open the possibility that we might be yoking ideas together 
which it may be more useful to keep apart. We are dealing 

Re: [Fis] What is information? and What is life?

2017-01-10 Thread Terrence W. DEACON
Loet remarks:

"... we need a kind of calculus of redundancy."

I agree whole-heartedly.

What for Shannon was the key to error-correction is thus implicitly
normative. But of course assessment of normativity (accurate/inaccurate,
useful/unuseful, significant/insignificant) must necessarily involve an
"outside" perspective, i.e. more than merely the statistics of sign medium
characteristics. Redundancy is also implicit in concepts like
communication, shared understanding, iconism, and Fano's "mutual
information." But notice too that redundancy is precisely non-information
in a strictly statistical understanding of that concept; a redundant
message is not itself "news" — and yet it can reduce the uncertainty of
what is "message" and what is "noise." It is my intuition that by
developing a formalization (e.g. a "calculus") using the complementary
notions of redundancy and constraint, we will ultimately be able to
formulate a route from Shannon to the higher-order conceptions of
information, in which referential and normative features can be precisely
formulated.
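
A toy sketch of the error-correction point, assuming a binary symmetric channel 
with an invented flip probability: the repeated bits add no statistical "news", 
yet they are what lets the receiver sort message from noise:

    import random
    from collections import Counter

    def transmit(bit, flip_prob=0.2):
        """Send one bit through a channel that flips it with probability flip_prob."""
        return bit ^ 1 if random.random() < flip_prob else bit

    def send_with_redundancy(bit, repetitions=5):
        """Repeat the same bit and let the receiver decode by majority vote."""
        received = [transmit(bit) for _ in range(repetitions)]
        return Counter(received).most_common(1)[0][0]

    random.seed(1)
    trials = 10_000
    plain_errors = sum(transmit(1) != 1 for _ in range(trials))
    coded_errors = sum(send_with_redundancy(1) != 1 for _ in range(trials))
    print(plain_errors, coded_errors)  # roughly 2000 vs roughly 600 errors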

There is an open door, though it still seems pretty dark on the other side.
So one must risk stumbling in order to explore that space.

Happy 2017, Terry


Re: [Fis] What is information? and What is life?

2017-01-11 Thread Christophe
Dear Terry,
Are you really sure that trying to link Shannon to higher-order conceptions 
of information like meaning is a realistic ambition?
I compare that to linking the width of a street to the individual motivations 
of the persons that will walk in the street.
As we know, Shannon's theory measures the capacity of a communication channel. It is not 
about the possible meanings of the information that may transit through the 
channel.
Information goes through a communication channel because agents want to 
communicate, to exchange meaningful information (the 'outside perspective' as 
you say). And meanings do not exist by themselves. Meaningful information is 
generated by agents that have reasons for doing so. Animals manage meanings in 
order to stay alive (as individuals and as species). Human motivations/constraints 
are more complex, but they are the sources of our meaning generation.
We agree that information is not to be confused with meaning. However, from a 
pragmatic standpoint the two cannot be separated. But this does not imply, I 
feel, that Shannon is to be linked to the meaning of information.
For me the core of the subject is meaning generation. Why and how is 
meaningful information generated? (https://philpapers.org/rec/MENCOI)

All the best to all for 2017.
Christophe



Re: [Fis] What is information? and What is life?; towards a calculus of redundancy

2017-01-10 Thread Loet Leydesdorff
Toward a Calculus of Redundancy: 
The feedback arrow of expectations in knowledge-based systems

Loet Leydesdorff, Mark W. Johnson, Inga Ivanova 

(Submitted on 10 Jan 2017; https://arxiv.org/abs/1701.02455 )

 

Whereas the generation of Shannon-type information is coupled to the second law 
of thermodynamics, redundancy--that is, the complement of information to the 
maximum entropy--can be increased by further distinctions: new options can 
discursively be generated. The dynamics of discursive knowledge production thus 
infuse the historical dynamics with a cultural evolution based on expectations 
(as different from observations). We distinguish among (i) the communication of 
information, (ii) the sharing of meaning, and (iii) discursive knowledge. 
Meaning is provided from the perspective of hindsight as feedback on the 
entropy flow and thus generates redundancy. Specific meanings can selectively 
be codified as discursive knowledge; knowledge-based reconstructions enable us 
to specify expectations about future states which can be invoked in the 
present. The cycling among the dynamics of information, meaning, and knowledge 
in feedback and feedforward loops can be evaluated empirically: When mutual 
redundancy prevails over mutual information, the sign of the resulting 
information is negative indicating reduction of uncertainty because of new 
options available for realization; innovation can then be expected to flourish. 
When historical realizations prevail, innovation may be locked-in because of 
insufficient options for further development. 

 

* Comments are very welcome at this stage

  _  

Loet Leydesdorff 

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net ; http://www.leydesdorff.net/ 
Associate Faculty,  <http://www.sussex.ac.uk/spru/> SPRU, University of Sussex; 

Guest Professor  <http://www.zju.edu.cn/english/> Zhejiang Univ., Hangzhou; 
Visiting Professor,  <http://www.istic.ac.cn/Eng/brief_en.html> ISTIC, Beijing;

Visiting Professor,  <http://www.bbk.ac.uk/> Birkbeck, University of London; 

http://scholar.google.com/citations?user=ych9gNYJ&hl=en

 
