Re: [Fis] Data - Reflection - Information

2017-10-27 Thread Sungchul Ji
Hi FISers,


Reading the recent posts on "information" and related issues by Terry, Joseph, 
Pedro, Mark,  Krassimir, Loet, and others suggested to me the following 
possible definition of information (see Table 1) that may be consistent with 
those proposed by Terry, Shannon, Volkenstein,  Saussure, and Peirce (as I 
understand him), to varying degrees.

Table 1.  A unified definition of information based on the Mechanism of 
Irreversible Triadic Relation (MITR):

“Information is something that is transferred from A (e.g., the sender) to C 
(e.g., the receiver) mediated by B (e.g., sound signal) in such a manner that A 
and C become coupled, correlated, or coordinated.”

       f          g
  A ------> B ------> C
  |                   ^
  |                   |
  |___________________|
            h




       Terry         Shannon        Volkenstein  Peirce                        Saussure
A      Object        Sender         -            Object                        Object
B      Sign          Message       -            Sign                          Sign
C      Interpretant  Receiver       -            Interpretant                  -
f      Intrinsic     Coding         Amount       Natural process               Differentiation (?)
g      Referential   Decoding       Meaning      Mental process                Arbitrariness
h      Normative     Communication  Value        Correspondence/Communication  -





I have the feeling that the number of columns in Table 1 can be extended to 
the right significantly, as we apply the MITR-based definition of information 
to various fields of inquiry in the natural and human sciences.


Any suggestions, comments or corrections would be welcome.


Sung







From: Terrence W. DEACON <dea...@berkeley.edu>
Sent: Sunday, October 8, 2017 8:30 PM
To: Sungchul Ji
Cc: KrassimirMarkov; foundationofinformationscience; 钟义信
Subject: Re: [Fis] Data - Reflection - Information

Against "meaning"

I think that there is a danger of allowing our anthropocentrism to bias the 
discussion. I worry that the term 'meaning' carries too much of a linguistic 
bias.
By this I mean that it is too attractive to use language as our archetypal 
model when we talk about information.
Language is rather the special case, the most unusual communicative adaptation 
to ever have evolved, and one that grows out of and depends on 
informational/semiotic capacities shared with other species and with biology in 
general.
So I am happy to see efforts to bring in topics like music or natural signs 
like thunderstorms and would also want to cast the net well beyond humans to 
include animal calls, scent trails, and molecular signaling by hormones. And it 
is why I am more attracted to Peirce and worried about the use of Saussurean 
concepts.
Words and sentences can indeed provide meanings (as in Frege's Sinn - "sense" - 
"intension") and may also provide reference (Frege's Bedeutung - "reference" - 
"extension"), but I think that it is important to recognize that not all signs 
fit this model.

Moreover, a sneeze is often interpreted as evidence about someone's state of 
health, and a clap of thunder may indicate an approaching storm.
These can also be interpreted differently by my dog, but it is still 
information about something, even though I would not say that they mean 
something to that interpreter. Both of these phenomena can be said to provide 
reference to something other than that sound itself, but when we use such 
phrases as "it means you have a cold" or "that means that a storm is 
approaching" we are using the term "means" somewhat metaphorically (most often 
in place of the more accurate term "indicates").

And it is even more of a stretch to use this term with respect to pictures or 
diagrams.
So no one would say that a specific feature like the ears in a caricatured face 
means something.
Though if the drawing is employed in a political cartoon, e.g. with exaggerated 
ears, and the whole cartoon is assigned a meaning, then perhaps the exaggeration 
of this feature may become meaningful. And yet we would probably agree that 
every line of the drawing provides information contributing to that meaning.

So basically, I am advocating an effort to broaden our discussions and 
recognize that the term information applies in diverse ways to many different 
contexts. And because of this it is important to indicate the framing, whether 
physical, formal, biological, phenomenological, linguistic, etc.
For this reason, as I have suggested before, I would love to have a 
conversation in which we try to agree about which different uses of the 
information concept are appropriate for which contexts. The classic 
syntax-semantics-pragmatics distinction introduced by Charles Morris […]

Re: [Fis] Data - Reflection - Information

2017-10-15 Thread Mark Johnson
Dear Loet,

I mean to be analytical too. The Pythonesque nature of my questioning leads 
naturally to recursion: What is the meaning of meaning? There's a logic in the 
naturally to recursion: What is the meaning of meaning? There's a logic in the 
recursion - Peirce, Spencer-Brown, Leibniz, Lou Kauffman... and you have 
probed this. 

Were you or I to be part of a recursive symmetry, how would we know? Where 
would the scientia be? How would we express our knowledge? In a journal? Why 
not in a symphony? (the musicologists miss the point about music: Schoenberg 
commented once on the musical graphs of Heinrich Schenker: "where are my 
favourite tunes? Ah! There.. In those tiny notes!")

I agree that operationalisation is important. But it can (and does) happen in 
ways other than those expressed in the content of discourse.  If this topic of 
"information" is of any value, it is because it should open our senses to that. 

Best wishes,

Mark

-Original Message-
From: "Loet Leydesdorff" <l...@leydesdorff.net>
Sent: ‎15/‎10/‎2017 07:17
To: "Mark Johnson" <johnsonm...@gmail.com>; "Terrence W. DEACON" 
<dea...@berkeley.edu>; "Sungchul Ji" <s...@pharmacy.rutgers.edu>
Cc: "foundationofinformationscience" <fis@listas.unizar.es>
Subject: Re[2]: [Fis] Data - Reflection - Information

Dear Mark:


Do we want to defend a definition of meaning which is tied to scientific 
practice as we know it? Would that be too narrow? Ours may not be the only way 
of doing science... 
I meant my remarks analytically. You provide them with a normative turn as 
defensive against alternative ways of doing science.


A non-discursive science might be possible - a science based around shared 
musical experience, or meditation, for example. Or even Hesse's 
"Glasperlenspiel"... Higher level coordination need not necessarily occur in 
language. Our communication technologies may one day give us new 
post-linguistic ways of coordinating ourselves. 
Why should one wish to consider this as science? One can make music together 
without doing science. Musicology, however, is discursive reasoning about these 
practices.


Codification is important in our science as we know it. But it should also be 
said that our science is blind to many things. Its reductionism prevents 
effective interdisciplinary inquiry, it struggles to reconcile practices, 
bodies, and egos, and its recent obsession with journal publication has 
produced the conditions of Babel which has fed the pathology in our 
institutions. There's less meaning in the academy than there was 50 years ago.
This is a question with a Monty Python flavor: What is the meaning of science? 
What is the meaning of life?


The implication is that our distinguishing between information and meaning in 
science may be an epiphenomenon of something deeper.
One can always ask for "something deeper". The answers, however, tend to become 
religious. I am interested in operationalization and design.


Best,
Loet




Best wishes,

Mark




From: Loet Leydesdorff
Sent: ‎14/‎10/‎2017 16:06
To: Terrence W. DEACON; Sungchul Ji
Cc: foundationofinformationscience
Subject: Re: [Fis] Data - Reflection - Information


Dear Terry and colleagues, 


"Language is rather the special case, the most unusual communicative adaptation 
to ever have evolved, and one that grows out of and depends on 
informational/semiotic capacities shared with other species and with biology in 
general."
Let me try to argue in favor of "meaning", "language", and "discursive 
knowledge", precisely because they provide the "differentia specifica" of 
mankind. "Meaning" can be provided by non-humans such as animals or networks, 
but distinguishing between the information content and the meaning of a message 
requires a discourse. The discourse enables us to codify the meaning of the 
information at the supra-individual level. Discursive knowledge is based on 
further codification of this intersubjective meaning. All categories used, for 
example, in this discussion are codified in scholarly discourses. The 
discourse(s) provide(s) the top of the hierarchy that controls, given the 
cybernetic principle that construction is bottom-up and control top-down.


Husserl uses "intentionality" and "intersubjective intentionality" instead of 
"meaning". Perhaps, this has advantages; but I am not so sure that the 
difference is more than semantic. In Cartesian Meditations (1929) he argues 
that this intersubjective intentionality provides us with the basis of an 
empirical philosophy of science. The sciences do not begin with observations, 
but with the specification of expectations in discourses. A predator also 
observes his prey, but in scholarly discourses, systematic observations serve 
to update codified (that is, theoretical) expectations.


Best,
Loet


Re: [Fis] Data - Reflection - Information

2017-10-15 Thread Loet Leydesdorff

Dear Mark:

Do we want to defend a definition of meaning which is tied to 
scientific practice as we know it? Would that be too narrow? Ours may 
not be the only way of doing science...
I meant my remarks analytically. You provide them with a normative turn 
as defensive against alternative ways of doing science.


A non-discursive science might be possible - a science based around 
shared musical experience, or meditation, for example. Or even Hesse's 
"Glasperlenspiel"... Higher level coordination need not necessarily 
occur in language. Our communication technologies may one day give us 
new post-linguistic ways of coordinating ourselves.
Why should one wish to consider this as science? One can make music 
together without doing science. Musicology, however, is discursive 
reasoning about these practices.


Codification is important in our science as we know it. But it should 
also be said that our science is blind to many things. Its reductionism 
prevents effective interdisciplinary inquiry, it struggles to reconcile 
practices, bodies, and egos, and its recent obsession with journal 
publication has produced the conditions of Babel which has fed the 
pathology in our institutions. There's less meaning in the academy than 
there was 50 years ago.
This is a question with a Monty Python flavor: What is the meaning of 
science? What is the meaning of life?


The implication is that our distinguishing between information and 
meaning in science may be an epiphenomenon of something deeper.
One can always ask for "something deeper". The answers, however, tend to 
become religious. I am interested in operationalization and design.


Best,
Loet




Best wishes,

Mark


From: Loet Leydesdorff <l...@leydesdorff.net>
Sent: 14/10/2017 16:06
To: Terrence W. DEACON <dea...@berkeley.edu>; Sungchul Ji <s...@pharmacy.rutgers.edu>
Cc: foundationofinformationscience <fis@listas.unizar.es>
Subject: Re: [Fis] Data - Reflection - Information

Dear Terry and colleagues,

"Language is rather the special case, the most unusual communicative 
adaptation to ever have evolved, and one that grows out of and depends 
on informational/semiotic capacities shared with other species and with 
biology in general."
Let me try to argue in favor of "meaning", "language", and "discursive 
knowledge", precisely because they provide the "differentia specifica" 
of mankind. "Meaning" can be provided by non-humans such as animals or 
networks, but distinguishing between the information content and the 
meaning of a message requires a discourse. The discourse enables us to 
codify the meaning of the information at the supra-individual level. 
Discursive knowledge is based on further codification of this 
intersubjective meaning. All categories used, for example, in this 
discussion are codified in scholarly discourses. The discourse(s) 
provide(s) the top of the hierarchy that controls, given the cybernetic 
principle that construction is bottom-up and control top-down.


Husserl uses "intentionality" and "intersubjective intentionality" 
instead of "meaning". Perhaps, this has advantages; but I am not so 
sure that the difference is more than semantic. In Cartesian 
Meditations (1929) he argues that this intersubjective intentionality 
provides us with the basis of an empirical philosophy of science. The 
sciences do not begin with observations, but with the specification of 
expectations in discourses. A predator also observes his prey, but in 
scholarly discourses, systematic observations serve to update 
codified (that is, theoretical) expectations.


Best,
Loet



Re: [Fis] Data - Reflection - Information

2017-10-14 Thread Mark Johnson
Dear Loet, 

When you say "distinguishing between the information content and the meaning of 
a message requires a discourse" this is, I think, a position regarding what 
scientific discourse does. There are, of course, competing descriptions of what 
scientific discourse does. Does your "meaning" refer to the meaning of 
scientific discovery? Do we want to defend a definition of meaning which is 
tied to scientific practice as we know it? Would that be too narrow? Ours may 
not be the only way of doing science... 

A non-discursive science might be possible - a science based around shared 
musical experience, or meditation, for example. Or even Hesse's 
"Glasperlenspiel"... Higher level coordination need not necessarily occur in 
language. Our communication technologies may one day give us new 
post-linguistic ways of coordinating ourselves. 

Codification is important in our science as we know it. But it should also be 
said that our science is blind to many things. Its reductionism prevents 
effective interdisciplinary inquiry, it struggles to reconcile practices, 
bodies, and egos, and its recent obsession with journal publication has 
produced the conditions of Babel which has fed the pathology in our 
institutions. There's less meaning in the academy than there was 50 years ago.

The business of sense and reference which Terry refers to (and which provided a 
foundation for Husserl) is indeed problematic. Some forms of communication have 
only sense and yet there is coordination, emotion and meaning.  Peirce saw 
something different in the underlying symmetry of communication. This is in 
Bateson too (symmetrical/asymmetrical schizmogenesis).

It may be that symmetrical principles underpin quantum mechanical 
phenomena like entanglement; they certainly pervade biology. Medieval logicians 
may have seen this: Duns Scotus's ideas on "synchronic contingency", for 
example, mirror what quantum physicists are describing.

The implication is that our distinguishing between information and meaning in 
science may be an epiphenomenon of something deeper.

Best wishes,

Mark



-Original Message-
From: "Loet Leydesdorff" <l...@leydesdorff.net>
Sent: ‎14/‎10/‎2017 16:06
To: "Terrence W. DEACON" <dea...@berkeley.edu>; "Sungchul Ji" 
<s...@pharmacy.rutgers.edu>
Cc: "foundationofinformationscience" <fis@listas.unizar.es>
Subject: Re: [Fis] Data - Reflection - Information

Dear Terry and colleagues, 


"Language is rather the special case, the most unusual communicative adaptation 
to ever have evolved, and one that grows out of and depends on 
informational/semiotic capacities shared with other species and with biology in 
general."
Let me try to argue in favor of "meaning", "language", and "discursive 
knowledge", precisely because they provide the "differentia specifica" of 
mankind. "Meaning" can be provided by non-humans such as animals or networks, 
but distinguishing between the information content and the meaning of a message 
requires a discourse. The discourse enables us to codify the meaning of the 
information at the supra-individual level. Discursive knowledge is based on 
further codification of this intersubjective meaning. All categories used, for 
example, in this discussion are codified in scholarly discourses. The 
discourse(s) provide(s) the top of the hierarchy that controls, given the 
cybernetic principle that construction is bottom-up and control top-down.


Husserl uses "intentionality" and "intersubjective intentionality" instead of 
"meaning". Perhaps, this has advantages; but I am not so sure that the 
difference is more than semantic. In Cartesian Meditations (1929) he argues 
that this intersubjective intentionality provides us with the basis of an 
empirical philosophy of science. The sciences do not begin with observations, 
but with the specification of expectations in discourses. A predator also 
observes his prey, but in scholarly discourses, systematic observations serve 
to update codified (that is, theoretical) expectations.


Best,
Loet


Re: [Fis] Data - Reflection - Information

2017-10-14 Thread Loet Leydesdorff

Dear Terry and colleagues,

"Language is rather the special case, the most unusual communicative 
adaptation to ever have evolved, and one that grows out of and depends 
on informational/semiotic capacities shared with other species and with 
biology in general."
Let me try to argue in favor of "meaning", "language", and "discursive 
knowledge", precisely because they provide the "differentia specifica" 
of mankind. "Meaning" can be provided by non-humans such as animals or 
networks, but distinguishing between the information content and the 
meaning of a message requires a discourse. The discourse enables us to 
codify the meaning of the information at the supra-individual level. 
Discursive knowledge is based on further codification of this 
intersubjective meaning. All categories used, for example, in this 
discussion are codified in scholarly discourses. The discourse(s) 
provide(s) the top of the hierarchy that controls, given the cybernetic 
principle that construction is bottom-up and control top-down.


Husserl uses "intentionality" and "intersubjective intentionality" 
instead of "meaning". Perhaps, this has advantages; but I am not so sure 
that the difference is more than semantic. In Cartesian Meditations 
(1929) he argues that this intersubjective intentionality provides us 
with the basis of an empirical philosophy of science. The sciences do 
not begin with observations, but with the specification of expectations 
in discourses. A predator also observes his prey, but in scholarly 
discourses, systematic observations serve the update of codified (that 
is, theoretical) expectations.


Best,
Loet



Re: [Fis] Data - Reflection - Information

2017-10-13 Thread Robert E. Ulanowicz
Dear Mark,

Thank you for your interest in my FIS paper!


I didn't intend by it to imply that Shannon-class measures were the
ultimate tool for information science, only to argue against prematurely
rejecting that thrust entirely -- as so many do. By looking at Bayesian
forms of the Shannon measure we can address information per se (and even a
form of proto-meaning) and achieve a measure of what is missing. This
latter advantage opens up another dimension to science. (The apophatic had
been implicitly addressed by thermodynamic entropy, which has hardly ever
been recognized as an apophasis. That's why entropy remains so confusing
to so many!)

The Achilles heel of Shannon-like measures lies in the underlying
assumption of distinct categories with which to describe the
distributions. The boundaries between categories are often "fuzzy", and,
as you point out, they change with time and growth.

I have been told that mutual information(s) has been defined over fuzzy
sets, but I confess I haven't investigated the advantages of this
extension. As for changing numbers of categories, I note that mutual
information remains well-defined even when the numbers of categories in
the sets being compared are not the same. So I would encourage your
exploration with musical forms.
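
A minimal sketch of this last point in Python, assuming nothing beyond the
standard library; the joint probability table below is invented purely for
illustration. It shows mutual information remaining well defined when the
two margins being compared have different numbers of categories, three
against four here:

import math

def mutual_information(joint):
    # I(X;Y) in bits from a joint probability table (rows: X, columns: Y).
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# X has 3 categories, Y has 4; the computation goes through unchanged.
joint = [[0.10, 0.05, 0.05, 0.05],
         [0.05, 0.20, 0.05, 0.05],
         [0.05, 0.05, 0.20, 0.10]]
print(round(mutual_information(joint), 3))    # about 0.208 bits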

As to Ashby's metaphor of a homeostat as a machine, my personal preference
is to restrict mechanical analogs for living systems to only those that
are unavoidable. I feel the language of mechanics and mechanisms is
*vastly* overused in biology and draws our attention away from the true
nature of biotic systems.

Thank you for your challenging and astute questions!

Cheers,
Bob

> Dear Bob,
>
> In your Shannon Exonerata paper you have an example of three strings,
> their entropies and their mutual information. I very much admire this
> paper and particularly the critique  of Shannon and the emphasis on the
> apophatic, but some things puzzle me. If these are strings of a living
> thing, then we can assume that these strings grow over time. If sequences
> A,B and C are related, then the growth of one is dependent on the growth
> of the other. This process occurs in time. During the growth of the
> strings, even the determination of what is and is not surprising changes
> with the distinction between what is seen to be the same and what isn't.
>
>  I have begun to think that it's the relative entropy between growing
> things (whether biological measurements, lines of musical counterpoint,
> learning) that matters. Particularly as mutual information is a variety
> of relative entropy. There are dynamics in the interactions. A change in
> entropy for one string with no change in entropy in the others (melody
> and accompaniment) is distinct from everything changing at the same time
> (that's "death and transfiguration"!).
>
> Shannon's formula isn't good at measuring change in entropy. It's less
> good with changes in distinctions which occur at critical moments ("aha! A
> discovery!" Or "this is no longer surprising") The best that we might do,
> I've thought, is segment your strings over time and examine relative
> entropies. I've done this with music. Does anyone have any other
> techniques?
>
> On the apophatic, I can imagine a study of the dynamics of Ashby's
> homeostat where each unit produced one of your strings. The machine comes
> to its solution when the entropies of the dials are each 0 (redundancy 1).
> As the machine approaches its equilibrium, the constraint of each dial on
> every other can be explored by the relative entropies between the dials.
> If we wanted the machine to keep on searching and not settle, it's
> conceivable that you might add more dials into the mechanism as its
> relative entropy started to approach 0. What would this do? It would
> maintain a counterpoint in the relative entropies within the ensemble.
> Would adding the dial increase the apophasis? Or the entropy? Or the
> relative entropy?
>
> Best wishes,
>
> Mark




Re: [Fis] Data - Reflection - Information

2017-10-09 Thread Mark Johnson
Dear Bob,

In your Shannon Exonerata paper you have an example of three strings, their 
entropies and their mutual information. I very much admire this paper and 
particularly the critique  of Shannon and the emphasis on the apophatic, but 
some things puzzle me. If these are strings of a living thing, then we can 
assume that these strings grow over time. If sequences A,B and C are related, 
then the growth of one is dependent on the growth of the other. This process 
occurs in time. During the growth of the strings, even the determination of 
what is and is not surprising changes with the distinction between what is seen 
to be the same and what isn't.

 I have begun to think that it's the relative entropy between growing things 
(whether biological measurements, lines of musical counterpoint, learning) that 
matters. Particularly as mutual information is a variety of relative entropy. 
There are dynamics in the interactions. A change in entropy for one string with 
no change in entropy in the others (melody and accompaniment) is distinct from 
everything changing at the same time (that's "death and transfiguration"!). 

Shannon's formula isn't good at measuring change in entropy. It's less good 
with changes in distinctions which occur at critical moments ("aha! A 
discovery!" or "this is no longer surprising"). The best that we might do, I've 
thought, is segment your strings over time and examine relative entropies. I've 
done this with music. Does anyone have any other techniques?
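
One minimal way to operationalise that segmenting, sketched in Python; the
string, the window length of 8, the hop of 4, and the add-one smoothing are
all illustrative assumptions of the sketch, not the procedure actually used
for the musical data:

import math
from collections import Counter

def distribution(segment, alphabet):
    # Relative frequencies over a fixed alphabet, with add-one smoothing
    # so that the divergence below stays finite.
    counts = Counter(segment)
    total = len(segment) + len(alphabet)
    return [(counts[s] + 1) / total for s in alphabet]

def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    # D(p || q) in bits.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

string = "AABABBACCACCBBDD"          # an illustrative 'growing' string
alphabet = sorted(set(string))
windows = [string[i:i + 8] for i in range(0, len(string) - 7, 4)]
dists = [distribution(w, alphabet) for w in windows]
for earlier, later in zip(dists, dists[1:]):
    # Entropy of each new window, and its divergence from the previous one.
    print(round(entropy(later), 3), round(relative_entropy(later, earlier), 3))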

On the apophatic, I can imagine a study of the dynamics of Ashby's homeostat 
where each unit produced one of your strings. The machine comes to its solution 
when the entropies of the dials are each 0 (redundancy 1). As the machine 
approaches its equilibrium, the constraint of each dial on every other can be 
explored by the relative entropies between the dials. If we wanted the machine 
to keep on searching and not settle, it's conceivable that you might add more 
dials into the mechanism as its relative entropy started to approach 0. What 
would this do? It would maintain a counterpoint in the relative entropies 
within the ensemble. Would adding the dial increase the apophasis? Or the 
entropy? Or the relative entropy? 

Best wishes,

Mark 

-Original Message-
From: "Robert E. Ulanowicz" <u...@umces.edu>
Sent: ‎09/‎10/‎2017 15:20
To: "Mark Johnson" <johnsonm...@gmail.com>
Cc: "foundationofinformationscience" <fis@listas.unizar.es>
Subject: Re: [Fis] Data - Reflection - Information


> A perspectival shift can help of the kind that Gregory Bateson once talked
> about. When we look at a hand, do we see five fingers or four spaces?
> Discourses are a bit like fingers, aren't they?

Mark,

The absence of the absent was a major theme of Bateson's, and he
criticized physics for virtually neglecting the missing.

Fortunately, IT is predicated on the missing, and quantitative information
is a double negative (the reverse of what is missing). This makes IT a
prime tool for broadening our scope on reality.
<https://people.clas.ufl.edu/ulan/files/FISPAP.pdf> &
<https://people.clas.ufl.edu/ulan/files/Reckon.pdf>
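
A small numerical illustration of that double negative, sketched in Python
with invented prior and posterior distributions: the information gained by
an observation is the drop in Shannon entropy, i.e. the reverse of what is
missing.

import math

def entropy(p):
    # Shannon entropy in bits: the uncertainty that is still 'missing'.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

prior = [0.25, 0.25, 0.25, 0.25]       # four equally likely possibilities
posterior = [0.70, 0.10, 0.10, 0.10]   # after an observation
print(entropy(prior))                                 # 2.0 bits missing beforehand
print(round(entropy(posterior), 3))                   # about 1.357 bits still missing
print(round(entropy(prior) - entropy(posterior), 3))  # about 0.643 bits gained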

Bob



Re: [Fis] Data - Reflection - Information

2017-10-09 Thread Robert E. Ulanowicz

> A perspectival shift can help of the kind that Gregory Bateson once talked
> about. When we look at a hand, do we see five fingers or four spaces?
> Discourses are a bit like fingers, aren't they?

Mark,

The absence of the absent was a major theme of Bateson's, and he
criticized physics for virtually neglecting the missing.

Fortunately, IT is predicated on the missing, and quantitative information
is a double negative (the reverse of what is missing). This makes IT a
prime tool for broadening our scope on reality.
<https://people.clas.ufl.edu/ulan/files/FISPAP.pdf> &
<https://people.clas.ufl.edu/ulan/files/Reckon.pdf>


Bob



Re: [Fis] Data - Reflection - Information

2017-10-09 Thread Karl Javorszky
Dear Krassimir,

thank you for undertaking this project of an "Anthology of Contemporary
Ideas on Information".

Let me take up your kind encouragement (not published in fis) and offer the
following contribution (through fis, in order to be able to request the
colleagues to offer their comments: thanks):

(let me hope that the word processor of the fis server will deal correctly
with the formatting used)

The Algorithm of Information Content



The approach we propose needs some introduction, as it makes use of
combinations of techniques which have not been used together.



1) Material we work with

We begin by creating elements of a set.



The set we construct to demonstrate how to establish information content
consists of realisations of the logical sentence *a+b=c. *That is, if we
use *d *distinguishing categories of elements, we shall have a set that
contains the elements *{(1,1), (1,2), (2,2), (1,3), (2,3), (3,3), (1,4),
(2,4), …., (d,d)}. *These elements we refer to as *(a,b), a <=b.*



The number *n* of the elements of the set is of course dependent on *d*:
*n = f(d) = d(d+1)/2*.



While the *principle* of information management is valid over a wide range
of values of *d*, it can be shown (OEIS A242615) that, as a matter of
numerical fact, the *efficiency* of information management is highest
when using *d=16*, which yields *n=136*. Nature also appears to use the
mathematically optimal method of information transmission.
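
A minimal Python sketch of this construction, using only the formulas stated
above; it checks that n = d(d+1)/2 and that d=16 yields n=136:

from itertools import combinations_with_replacement

def build_set(d):
    # All elements (a, b) with 1 <= a <= b <= d, the realisations of a+b=c.
    return list(combinations_with_replacement(range(1, d + 1), 2))

for d in (4, 16):
    elements = build_set(d)
    assert len(elements) == d * (d + 1) // 2    # n = d(d+1)/2
    print(d, len(elements))                     # d=16 yields n=136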



2) What we look into the material 1: properties of the elements

We use the set of elements so created as a kind of Rorschach card,
reading aspects into them.



Next to the traditional aspects *{a, b, c=a+b}*, we also use some additional
aspects of *a+b=c*, namely *{u=b-a, k=2b-a, t=3b-2a, q=2a-b, s=(d+1)-(a+b),
w=3a-2b}*; that is, altogether *9 aspects of a+b=c*.



Users are of course free and invited to introduce additional or different
aspects with which to categorise logical sentences. The *number* of aspects
need not be higher than 8 if they are used in combination (we refer here again
to the facts discussed in OEIS A242615), and as to the *kinds* of aspects:
one is always open to improvements.



3) What we look into the material 2: properties of the set

 We impose sequential orders on the elements of the set, using combinations
of aspects.



We generate *sequencing aspects* by always using 2 of the 9 *primary
aspects*, creating sequential orders within the set such that each of
the primary aspects is once the *first* and once the *second* ordering
aspect. That is, we sequence the set on the criteria *{ab, ac, ak, au, …,
as, aw, ba, bc, bu, …, bw, ca, cb, …, cw, ka, kb, …, …, wt, ws}*. This
brings forth 72 sequential enumerations of the elements of the set. Of
these, about 20 are actually different. (The inexactitude regarding the
number of identical sequential enumerations has to do with the *sequence*
of the primary aspects and will be of fundamental importance in the course
of the applications of the model.)



The 72 sequences into which the elements of the set have been brought – some
of which are different in name only – are called the *catalogued sequences*.
These are by no means random, but are as closely related to each other as
aspects of *a+b=c* can be. Each of the catalogued sequences is equally
legitimate and each is an implicated corollary of *a+b=c*, now having been
made explicit (=realised).
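
A Python sketch of sections 2 and 3 under one possible reading: a sequencing
aspect sorts the set lexicographically on two primary aspect values, and
elements tied on both values keep their prior order. That tie-breaking rule
is an assumption of the sketch, and it is exactly where the count of
"actually different" sequences becomes delicate:

from itertools import combinations_with_replacement, permutations

d = 16
elements = list(combinations_with_replacement(range(1, d + 1), 2))  # (a, b), a <= b

# The nine primary aspects of a+b=c, as listed above.
aspects = {
    'a': lambda a, b: a,
    'b': lambda a, b: b,
    'c': lambda a, b: a + b,
    'u': lambda a, b: b - a,
    'k': lambda a, b: 2 * b - a,
    't': lambda a, b: 3 * b - 2 * a,
    'q': lambda a, b: 2 * a - b,
    's': lambda a, b: (d + 1) - (a + b),
    'w': lambda a, b: 3 * a - 2 * b,
}

# 9 * 8 = 72 sequencing aspects: sort on a first, then a second primary aspect.
orders = {}
for x, y in permutations(aspects, 2):
    orders[x + y] = tuple(sorted(
        elements,
        key=lambda e, x=x, y=y: (aspects[x](*e), aspects[y](*e))))

print(len(orders))                  # 72 catalogued sequences
print(len(set(orders.values())))    # how many of them are actually different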



4) What we observe within the material 1: logical conflicts

We will not ignore conflicts between place and inhabitant, inhabitant and
place.



It is obvious that 2 different catalogued orders unveil logical conflicts.
If in order *αβ* element *e *is to be found on place *p1* and in order *γδ*
element *e *is to be found on place *p2*, there is apparently a conflict.



The same conflict can also be stated by using the formulation: If in order
*αβ* on place *p *element *e1* is to be found and in order *γδ *on place *p
*element *e2* is to be found, there is apparently a conflict.



We observe potentially or actually conflicting assignments of a sequential
number *{1..n}* to one and the same element of the set, depending on
which of the catalogued orders we deem to be actually the case. As we
decline to entertain an epistemological attitude of human decisions
creating and ending logical conflicts, we look into methods of solving
these potential and realised conflicts. As the two orders *αβ* and *γδ*, if
they are different, create dislocations of the elements relative to their
own conceptions of which is the correct place for an individual element to
be in, we speak of *a consolidation of dislocations* that we aim at.



5) What we look into the material 3: series of place changes

We transform linear arrangement *αβ* into linear arrangement *γδ*.



We have observed that if we once order the set sequentially according to
sorting order *αβ* (say, e.g. on arguments: *a,b*), and then 

Re: [Fis] Data - Reflection - Information

2017-10-09 Thread Mark Johnson
Which "information paradigm" is not a discourse framed by the education system? 
The value of the discussion about information - circular though it appears to 
be - is that we float between discourses. This is a strength. But it is also 
the reason why we might feel we're not getting anywhere!

A perspectival shift can help of the kind that Gregory Bateson once talked 
about. When we look at a hand, do we see five fingers or four spaces? 
Discourses are a bit like fingers, aren't they?

Mark



-Original Message-
From: "Terrence W. DEACON" <dea...@berkeley.edu>
Sent: ‎09/‎10/‎2017 01:31
To: "Sungchul Ji" <s...@pharmacy.rutgers.edu>
Cc: "foundationofinformationscience" <fis@listas.unizar.es>
Subject: Re: [Fis] Data - Reflection - Information

Against "meaning"


I think that there is a danger of allowing our anthropocentrism to bias the 
discussion. I worry that the term 'meaning' carries too much of a linguistic 
bias.
By this I mean that it is too attractive to use language as our archetypal 
model when we talk about information.
Language is rather the special case, the most unusual communicative adaptation 
to ever have evolved, and one that grows out of and depends on 
informational/semiotic capacities shared with other species and with biology in 
general.
So I am happy to see efforts to bring in topics like music or natural signs 
like thunderstorms and would also want to cast the net well beyond humans to 
include animal calls, scent trails, and molecular signaling by hormones. And it 
is why I am more attracted to Peirce and worried about the use of Saussurean 
concepts.
Words and sentences can indeed provide meanings (as in Frege's Sinn - "sense" - 
"intension") and may also provide reference (Frege's Bedeutung - "reference" - 
"extension"), but I think that it is important to recognize that not all signs 
fit this model.

Moreover, a sneeze is often interpreted as evidence about someone's state of 
health, and a clap of thunder may indicate an approaching storm.
These can also be interpreted differently by my dog, but it is still 
information about something, even though I would not say that they mean 
something to that interpreter. Both of these phenomena can be said to provide 
reference to something other than that sound itself, but when we use such 
phrases as "it means you have a cold" or "that means that a storm is 
approaching" we are using the term "means" somewhat metaphorically (most often 
in place of the more accurate term "indicates").


And it is even more of a stretch to use this term with respect to pictures or 
diagrams. 
So no one would say that a specific feature like the ears in a caricatured face 
means something.
Though if the drawing is employed in a political cartoon, e.g. with exaggerated 
ears, and the whole cartoon is assigned a meaning, then perhaps the exaggeration 
of this feature may become meaningful. And yet we would probably agree that 
every line of the drawing provides information contributing to that meaning.


So basically, I am advocating an effort to broaden our discussions and 
recognize that the term information applies in diverse ways to many different 
contexts. And because of this it is important to indicate the framing, whether 
physical, formal, biological, phenomenological, linguistic, etc.
For this reason, as I have suggested before, I would love to have a 
conversation in which we try to agree about which different uses of the 
information concept are appropriate for which contexts. The classic 
syntax-semantics-pragmatics distinction introduced by Charles Morris has often 
been cited in this respect, though it too is in my opinion too limited to the 
linguistic paradigm, and may be misleading when applied more broadly. I have 
suggested a parallel, less linguistic (and nested in Stan's subsumption sense) 
way of making the division: i.e. into intrinsic, referential, and normative 
analyses/properties of information. 


Thus you can analyze intrinsic properties of an informing medium [e.g. Shannon 
etc etc] irrespective of these other properties, but can't make sense of 
referential properties [e.g. what something is about, conveys] without 
considering intrinsic sign vehicle properties, and can't deal with normative 
properties [e.g. use value, contribution to function, significance, accuracy, 
truth] without also considering referential properties [e.g. what it is about].


In this respect, I am also in agreement with those who have pointed out that 
whenever we consider referential and normative properties we must also 
recognize that these are not intrinsic and are interpretation-relative. 
Nevertheless, these are legitimate and not merely subjective or nonscientific 
properties […]

Re: [Fis] Data - Reflection - Information

2017-10-08 Thread Terrence W. DEACON
Against "meaning"

I think that there is a danger of allowing our anthropocentrism to bias the
discussion. I worry that the term 'meaning' carries too much of a
linguistic bias.
By this I mean that it is too attractive to use language as our archetypal
model when we talk about information.
Language is rather the special case, the most unusual communicative
adaptation to ever have evolved, and one that grows out of and depends on
informational/semiotic capacities shared with other species and with biology
in general.
So I am happy to see efforts to bring in topics like music or natural signs
like thunderstorms and would also want to cast the net well beyond humans
to include animal calls, scent trails, and molecular signaling by hormones.
And it is why I am more attracted to Peirce and worried about the use of
Saussurean concepts.
Words and sentences can indeed provide meanings (as in Frege's Sinn -
"sense" - "intension") and may also provide reference (Frege's Bedeutung -
"reference" - "extension"), but I think that it is important to recognize
that not all signs fit this model.

Moreover, a sneeze is often interpreted as evidence about someone's state of
health, and a clap of thunder may indicate an approaching storm.
These can also be interpreted differently by my dog, but it is still
information about something, even though I would not say that they mean
something to that interpreter. Both of these phenomena can be said to
provide reference to something other than that sound itself, but when we
use such phrases as "it means you have a cold" or "that means that a storm
is approaching" we are using the term "means" somewhat metaphorically (most
often in place of the more accurate term "indicates").

And it is even more of a stretch to use this term with respect to pictures
or diagrams.
So no one would say that a specific feature like the ears in a caricatured
face means something.
Though if the drawing is employed in a political cartoon, e.g. with
exaggerated ears, and the whole cartoon is assigned a meaning, then perhaps
the exaggeration of this feature may become meaningful. And yet we would
probably agree that every line of the drawing provides information
contributing to that meaning.

So basically, I am advocating an effort to broaden our discussions and
recognize that the term information applies in diverse ways to many
different contexts. And because of this it is important to indicate the
framing, whether physical, formal, biological, phenomenological,
linguistic, etc.
For this reason, as I have suggested before, I would love to have a
conversation in which we try to agree about which different uses of the
information concept are appropriate for which contexts. The classic
syntax-semantics-pragmatics distinction introduced by Charles Morris has
often been cited in this respect, though it too is in my opinion too
limited to the linguistic paradigm, and may be misleading when applied more
broadly. I have suggested a parallel, less linguistic (and nested in Stan's
subsumption sense) way of making the division: i.e. into intrinsic,
referential, and normative analyses/properties of information.

Thus you can analyze intrinsic properties of an informing medium [e.g.
Shannon etc etc] irrespective of these other properties, but can't make
sense of referential properties [e.g. what something is about, conveys]
without considering intrinsic sign vehicle properties, and can't deal with
normative properties [e.g. use value, contribution to function,
significance, accuracy, truth] without also considering referential
properties [e.g. what it is about].

In this respect, I am also in agreement with those who have pointed out
that whenever we consider referential and normative properties we must also
recognize that these are not intrinsic and are interpretation-relative.
Nevertheless, these are legitimate and not merely subjective or
nonscientific properties, just not physically intrinsic. I am sympathetic
with those among us who want to restrict analysis to intrinsic properties
alone, and who defend the unimpeachable value that we have derived from the
formal foundations that Shannon's original analysis initiated, but this
should not be used to deny the legitimacy of attempting to develop a more
general theory of information that also attempts to discover formal
principles underlying these higher level properties implicit in the
concept.

I take this to be the intent behind Pedro's list. And I think it would be
worth asking for each of his points: Which information paradigm within this
hierarchy does it assume?

— Terry


Re: [Fis] Data - Reflection - Information

2017-10-08 Thread Sungchul Ji
Hi FISers,


Recent discussions on information on this list remind me of one of the main 
principles of signs advanced by Ferdinand de Saussure (1857-1913) -- the 
arbitrariness of linguistic signs.  In contrast, Peirce (1839-1914), a 
chemist-turned-logician-philosopher, seems to have succeeded in capturing the 
universal features of all signs, however fleeting, both linguistic and 
otherwise.


The power and utility of the Peircean definition of signs can be illustrated by 
applying his triadic definition of signs to the term 'information', viewed as 
a sign (having an arbitrary meaning, according to Saussure).  My impression is 
that all the varied definitions of information discussed on this list (which 
supports Saussure's principle of the arbitrariness of signs) can be 
organized using the ITR (Irreducible Triadic Relation) diagram embodying the 
Peircean principle of semiotics.  This is done in Figure 1 below, using the 
definition of 'information' that Professor Zhong recently provided as an 
example.  As you can see, the ITR template has 6 place-holders, 3 nodes and 3 
arrows, which can be populated by more than one set of concepts or terms, as 
long as the terms or concepts are consistent with one another and obey 
well-established laws of physics and logic.


                  f                     g
   Object -----------> Sign -----------> Interpretant
   (Object Information)   (Data)   (Perceived Information)
      |                                         ^
      |                                         |
      |_________________________________________|
                          h

 f = natural process (or information production)
g = mental process or computing (or information interpretation)
h = correspondence (or information flow)


Object = Something referred to by a sign

Sign = Something that stands to someone for something other than itself in
some context. Also called ‘representamen’
Interpretant = The effect a sign has on the mind (or state) of the interpreter
(human or non-human)

Figure 1.  A suggested definition of ‘information’ based on the triadic 
definition of the sign proposed by Peirce (1839-1914).  The symbol A ---> B 
reads as 'A determines B', 'A leads to B', 'A is presupposed by B', 'B 
supervenes on A' (http://www.iep.utm.edu/superven), etc.



With all the best.


Sung



From: Fis <fis-boun...@listas.unizar.es> on behalf of 钟义信 <z...@bupt.edu.cn>
Sent: Sunday, October 8, 2017 4:07:53 AM
To: KrassimirMarkov; foundationofinformationscience
Subject: Re: [Fis] Data - Reflection - Information

Dear Krassimir,

The formulas you proposed in your summary are good. May I mention that the 
following formulas will be more precise:

Object Info = External info = Syntactic info = Data
Perceived info = Internal info = Syntactic info + Semantic info + Pragmatic info

In other words, data is also a kind of information - called syntactic 
information, the information without meaning and utility associated. And 
therefore we have a uniform concept of information.

So, the discussions we had last week were very much helpful!

Thank you!

--

Prof. Y. X. Zhong (钟义信)

Center for Intelligence Science Research

University of Posts & Telecommunications

Beijing 100876, China




- Reply message -
From: Krassimir Markov <mar...@foibg.com>
To: foundationofinformationscience <fis@listas.unizar.es>
Date: 2017-10-08 02:06:15
Subject: [Fis] Data - Reflection - Information


Dear FIS Colleagues,

It is time for my second post this week.

Many thanks to Christophe Menant (for the profound question) and to all
colleagues (for the very nice and useful comments)!

**

Christophe Menant had written:
“However, I'm not sure that “meaning” is enough to separate information
from data. A basic flow of bits can be considered as meaningless data.
But the same flow can give a meaningful sentence once correctly
demodulated.
I would say that:
1) The meaning of a signal does not exist per se. It is agent dependent.
- A signal can be meaningful information created by an agent (human
voice, ant pheromone).
- A signal can be meaningless (thunderstorm noise).
- A meaning can be generated by an agent receiving the signal
(interpretation/meaning generation).
2) A given signal can generate different meanings when received by
different agents (a thunderstorm noise generates different meanings for
someone walking on the beach or for a person in a house).
3) The domain of efficiency of the meaning should be taken into account
(human beings, ant-hill). […]

Re: [Fis] Data - Reflection - Information

2017-10-08 Thread 钟义信
Dear Krassimir,

The formulas you proposed in your summary are good. May I mention that the
following formulas will be more precise:

Object Info = External info = Syntactic info = Data
Perceived info = Internal info = Syntactic info + Semantic info + Pragmatic info

In other words, data is also a kind of information - called syntactic
information, the information without meaning and utility associated. And
therefore we have a uniform concept of information.

So, the discussions we had last week were very much helpful!

Thank you!

--

Prof. Y. X. Zhong (钟义信)

Center for Intelligence Science Research

University of Posts & Telecommunications

Beijing 100876, China

- Reply message -
From: Krassimir Markov
To: foundationofinformationscience
Date: 2017-10-08 02:06:15
Subject: [Fis] Data - Reflection - Information

Re: [Fis] Data - Reflection - Information

2017-10-07 Thread Karl Javorszky
Dear Krassimir,

Thanks for the excellent summary of the diverse opinions.

Please add to my citation the following sentence:

A numeric approach uses the concept of counting in terms of consolidation
of displacements, and points out the data as a specific element of a cycle,
the information part being the communication about which cycle the element
is part of /= data about the remaining elements /.

Thanks
Karl

On 07.10.2017 at 20:07, "Krassimir Markov" wrote:

> Dear FIS Colleagues,
>
> It is time for my second post this week.
>
> Many thanks to Christophe Menant (for the profound question) and to all
> colleagues (for the very nice and useful comments)!
>
> **
>
> Christophe Menant had written:
>  “However, I'm not sure that “meaning” is enough to separate information
> from data.  A basic flow of bits can be considered as meaningless data.
> But the same flow can give a meaningful sentence once correctly
> demodulated.
> I would say that:
> 1) The meaning of a signal does not exist per se. It is agent dependent.
>  - A signal can be meaningful information created by an agent (human
> voice, ant pheromone).
>  - A signal can be meaningless (thunderstorm noise).
>  - A meaning can be generated by an agent receiving the signal
> (interpretation/meaning generation).
> 2) A given signal can generate different meanings when received by
> different agents (a thunderstorm noise generates different meanings for
> someone walking on the beach or for a person in a house).
> 3) The domain of efficiency of the meaning should be taken into account
> (human beings, ant-hill).
> Regarding your positioning of data, I'm not sure to understand your
> "reflections without meaning".
> Could you tell a bit more?“
>
> Before answering, I need to make a little analysis of posts this week
> connected to my question about data and information. For this goal, below
> I shall remember shortly main ideas presented this week.
>
> Citations:
>
> Stanley N Salthe:
>  “The simple answer to your question about data is to note the word's
> derivation from Latin Datum, which can be compared with Factum.”
>
> Y. X. Zhong:
> “It is not difficult to accept that there are two concepts of information,
> related and also different to each other. The first one is the information
> presented by the objects existed in environment before the subject's
> perceiving and the second one is the information perceived and understood
> by the subject. The first one can be termed the object information and the
> second one the perceived information. The latter is perceived by the
> subject from the former.
> The object information is just the object's "state of the object and the
> pattern with which the state varies". No meaning and no utility at the
> stage.
> The perceived information is the information, perceive by the subject from
> the object information. So, it should have the form component of the
> object (syntactic information), the meaning component of the object
> (semantic information), and the utility component of the object with
> respect to the subject's goal (pragmatic information). Only at this stage,
> the "meaning" comes out.”
>
> Karl Javorszky:
> “Data is that what we see by using the eyes. Information is that what we
> do not see by using the eyes, but we see by using the brain; because it is
> the background to that what we see by using the eyes.
> Data are the foreground, the text, which are put into a context by the
> information, which is the background. In Wittgenstein terms: Sachverhalt
> and Zusammenhang (which I translate – unofficially – as facts /data/ and
> context /relationships/)”.
>
>
> Dai Griffiths:
> “I'm curious about your use of the word 'dualistic'. Dualism usually
> suggests that there are two aspects to a single phenomenon. As I interpret
> your post, you are saying that information and meaning are separate
> concepts. Otherwise, we are led to inquire into the nature of the unity of
> which they are both aspects, which gets us back where we started.
> So I interpret 'dualistic' here to mean 'two concepts that are intertwined
> in the emergence of events'. Is this parallel to, for example, atomic
> structure and fluid dynamics (perhaps there are better examples)? If so,
> does that imply a hierarchy (i.e. you can have information without
> meaning, but not meaning without information)? This makes sense to me,
> though it is not what I usually associate with the word 'dualistic'.”
>
> Guy A Hoelzer:
> “If you start by explicitly stating that you are using the semantic notion
> of information at the start, I would agree whole heartedly with your post.
> I claim that physical information is general, while semantic information
> is merely a subset of physical information.  Semantic information is
> composed of kinds of physical contrasts to which symbolic meaning has been
> attached.  Meaningfulness cannot exist in the absence of physical
> contrast, but physical