Dear Mark, Pedro and FIS Colleagues,

It is nice to talk about our common (future!) understanding of the phenomenon 
"information".

Dear Mark,

Congratulations on your very good book (Burgin, 2010).
It is a proper example of what we all need to do, step by step.

Thank you for pointing to my work on GIT. There is a small misprint on page 12, 
where it is written:
“..., Markov, et al (2007) write that when a triad
(source, evidence, recipient)
exists, then the reflection of the first entity in the second one is called
information. Thus, information is interpreted as a specific reflection.”

The triad should be written as
(source, recipient : evidence)

No problem, the main meaning is clear.

Now about your questions. 

In general, there are no answers to these questions.
It is impossible to answer without specifying the paradigm to which the questions 
and answers belong.
We may take the point of view of GTI (Burgin, 2010) and try to answer.
However, we first need to study GTI in depth, to understand it, and only then try 
to answer the questions.
The same holds for any other theory, for instance GIT (Markov, et al, 2007).

What to do?

Dear Pedro,
I am a novice in the FIS Group and maybe I do not know the style of work accepted 
by FIS colleagues in such difficult situations. Please help me.

What I can do at this moment is sketch my answers without deep explanation, which 
may be given in further discussion.

My point of view is just the triad given above.
A fourth element is not included in it: the Subject (Infological System, 
Information Subject, INFOS).

This means that we have the quadruple
(source, recipient : evidence, Infos)
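
To make the structure concrete, here is a minimal sketch (my own toy illustration 
in Python, not a formal part of GIT or GTI) of the quadruple as a simple data 
structure; only the field names come from the quadruple above, the example values 
are assumed.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class InformationQuadruple:
    """One concrete instance of (source, recipient : evidence, Infos).

    The evidence is the reflection of the source in the recipient;
    the Infos is the subject that interprets this reflection as information."""
    source: Any      # the reflected entity
    recipient: Any   # the carrier in which the source is reflected
    evidence: Any    # the reflection itself (the information)
    infos: Any       # the interpreting subject (Infological System)

# Example: the same source reflected in two different recipients gives two
# different quadruples, and hence two different information(s) about the source.
q1 = InformationQuadruple("tree", "photograph", "image of the tree", "viewer A")
q2 = InformationQuadruple("tree", "text note", "description of the tree", "viewer B")
print(q1, q2, sep="\n")
```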

Now I will answer the questions in reverse order:

3.  Is it necessary/useful/reasonable to make a distinction between information 
and an information carrier?
 

One and the same source may be reflected in many different recipients, for each of 
which there may again exist many evidences and Infoses. This means that the more 
different recipients (and so on) we have, the more different information(s) about 
the source will exist. The information is a reflection IN the information carrier, 
and destroying the carrier leads to losing the current reflection of the source.

Now the answer:
YES,
it is necessary/useful/reasonable to make a distinction between information and 
the information carrier, taking into account that the information is the 
reflection in the carrier, not the whole carrier.



2. Are there types or kinds of information that are not encompassed by the 
general theory of information (GTI)? 

Here the answer is simple:
NO.
The reason lies in the definition of information within the frame of GTI (Burgin, 
2010). Other definitions lead to different types of information.



1.  Is it necessary/useful/reasonable to make a strict distinction between 
information as a phenomenon and information measures as quantitative or 
qualitative characteristics of information?

This is the most difficult question and I need a deeper explanation for the 
answer.

To measure means to have at least one measurement system. 
Again, the variety is so great that it is impossible to answer simply.

What may we really measure concerning the information phenomenon?
The characteristics (features) of the source, the recipient, the evidence, or the Infos?
Again, without a concrete paradigm, there is no answer.

In GIT (Markov, et al, 2007), we introduce the concept of "Information 
Expectation" of the Infos as a point in the multi-dimensional subjective (mental!) 
attributive space of the Infos.
(Mark, do you remember the "ideal objects" and their materialization?)

The information, subjectively received by the Infos, is another point in the same 
space.

In this case we may introduce a measurement system in which we may measure some 
characteristics of the subjective reflection (information).
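
As one purely illustrative measurement system under these assumptions, the 
Information Expectation and the received information can be represented as points 
(vectors) in the attributive space, and a distance between them can serve as one 
quantitative characteristic of the subjective reflection. The Euclidean distance 
below is my own choice for the sketch, not a measure prescribed by GIT.

```python
import math

def reflection_distance(expectation, received):
    """Euclidean distance between two points of the subjective attributive
    space of an Infos: the Information Expectation and the received information.
    Both points are given as equal-length lists of attribute values."""
    if len(expectation) != len(received):
        raise ValueError("Both points must lie in the same attributive space")
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(expectation, received)))

# Example: a 3-dimensional attributive space (attribute scales chosen arbitrarily)
expectation = [0.9, 0.2, 0.5]   # what the Infos expected
received    = [0.4, 0.3, 0.5]   # what the Infos subjectively received
print(reflection_distance(expectation, received))  # one possible characteristic
```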

Finally, the answer is:
YES,
it is necessary/useful/reasonable to make a strict distinction between information 
as a phenomenon and information measures as quantitative or qualitative 
characteristics of information, because information as a phenomenon covers all 
instances of the quadruple "(source, recipient : evidence, Infos)", whereas 
information measures depend closely on the concrete instance of the quadruple, 
i.e. on the concrete quadruple elements.

Of course, there exists the possibility to define measurement systems for classes 
of instances of the quadruple; see, for example, Shannon (1993).


Sorry to be so talkative :-)

Friendly regards

Krassimir 


Source:
(Markov, et al, 2007) Markov, K., Ivanova, K. and Mitov, I. Basic structure of 
the general information theory, Information Theories and Applications, v. 14, 
2007, pp. 5-19.
http://www.foibg.com/ijita/vol14/ijita14-1-p01.pdf 








From: Pedro C. Marijuan 
Sent: Friday, April 08, 2011 11:08 AM
To: fis@listas.unizar.es 
Subject: [Fis] ON INFORMATION THEORY--Mark Burgin

Discussion session on information theory:




INFORMATION: MYSTERY SOLVING

Mark Burgin
Professor & Visiting Scholar
Department of Mathematics
University of California at Los Angeles
http://www.math.ucla.edu/~mburgin/
mbur...@math.ucla.edu



 

On the one hand, information is the basic phenomenon of our world. We live in 
the world where information is everywhere. All knowledge is possible only 
because we receive, collect and produce information. People discovered the 
existence of information, and now talk of information is everywhere in our 
society. As Barwise and Seligman write (1997), in recent years, information 
became all the rage. The reason is that people are immersed in information, 
they cannot live without information and they are information systems 
themselves. The whole of life is based on information processes, as Loewenstein 
convincingly demonstrates in his book (1999). Information has become a key 
concept in sociology, political science, and the economics of the so-called 
information society. Thus, to better understand life, society, technology and 
many other things, we need to know what information is and how it behaves. 
Debons and Horne write (1997), “if information science is to be a science of 
information, then some clear understanding of the object in question requires 
definition.”

On the other hand, the actual nature and essence of information, as well as of 
knowledge produced and distributed by information technology, remain abstract and 
actually unknown to the majority of people. Moreover, many researchers assume 
that the diversity of information types and uses forms an insurmountable obstacle 
to the creation of a unified, comprehensible information 
theory. For instance, Shannon (1993) wrote: “It is hardly to be expected that a 
single concept of information would satisfactorily account for the numerous 
possible applications of this general field.” Other researchers, such as 
Goffman (1970) and Gilligan (1994), argued that the term information has been 
used in so many different and sometimes incommensurable ways, forms and 
contexts that it is not even worthwhile to elaborate a single conceptualization 
achieving general agreement. Capurro, Fleissner, and Hofkirchner (1999) even 
give an informal proof of the so-called Capurro trilemma, which implies the 
impossibility of an all-encompassing concept of information. According to this 
trilemma, information may mean the same at all levels (univocity), or 
something similar (analogy), or something different (equivocity). In the first 
case, we lose all qualitative differences, as for instance, when we say that 
e-mail and cell reproduction are the same kind of information process. Not only 
the "stuff" and the structure but also the processes in cells and computer 
devices are rather different from each other. If we say the concept of 
information is being used analogically, then we have to state what the 
“original” meaning is. If it is the concept of information at the human level, 
then we are confronted with anthropomorphisms if we use it at a non-human 
level. We would say that “in some way” atoms “talk” to each other, etc. 
Finally, there is equivocity, which means that information cannot be a unifying 
concept any more, i.e., it cannot be the basis for the new paradigm…

The Capurro trilemma is a valid scientific result if it is assumed that 
researchers tried to elaborate a definition of information in the traditional 
form. Indeed, in this case, the trilemma clearly explains and grounds why it is 
impossible to achieve an all-encompassing definition of information.

At the same time, utilization of a new type of definition, which is called a 
parametric definition, made it possible to adequately and comprehensively 
define information and build its unifying theory called the general theory of 
information (GTI) (Burgin, 2010). 

Parametric systems (parametric curves, parametric equations, parametric 
functions, etc.) have been frequently used in mathematics and its applications 
for a long time. For instance, a parametric curve in a plane is defined by two 
functions f(t) and g(t), while a parametric curve in space has the following 
form: (f(t), g(t), h(t)) where parameter t takes values in some interval of 
real numbers.
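
For readers who prefer to see this concretely, the short sketch below samples a 
parametric space curve (f(t), g(t), h(t)) on an interval of real numbers; the 
particular choice of a helix (cos t, sin t, t) is only an example, not taken from 
the text above.

```python
import math

def curve(t):
    """A parametric curve in space: (f(t), g(t), h(t)).
    Here f(t) = cos t, g(t) = sin t, h(t) = t (a helix), chosen only as an example."""
    return (math.cos(t), math.sin(t), t)

# The parameter t takes values in some interval of real numbers, e.g. [0, 2*pi].
points = [curve(2 * math.pi * k / 100) for k in range(101)]
print(points[0], points[50])
```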

Parameters used in mathematics and science are, as a rule, only numerical and 
are considered as quantities that define certain characteristics of systems. 
For instance, in probability theory, the normal distribution has the mean m and 
the standard deviation s as parameters. A more general parameter, a functional, 
is utilized for constructing families of non-Diophantine arithmetics (Burgin, 
1997; 2001).  

In the case of the general theory of information (GTI), the parameter is even 
more general. The parametric definition of information utilizes a system 
parameter. Namely, an infological system plays the role of a parameter that 
discerns different kinds of information, e.g., social, personal, chemical, 
biological, genetic, or cognitive, and combines all existing kinds and types of 
information in one general concept “information”.
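
A toy sketch of how such a system parameter might work (my own illustration of 
the idea, not the formal GTI definition): the same general concept "information" 
is specialized according to the infological system supplied as a parameter; the 
mapping below is invented purely for illustration.

```python
# Toy illustration only (not the formal GTI definition): the kind of information
# is discerned by the infological system supplied as a parameter.
def information_kind(infological_system: str) -> str:
    """Map an infological system (the parameter) to the kind of information it discerns."""
    kinds = {
        "knowledge system": "cognitive information",
        "genome": "genetic information",
        "social memory": "social information",
    }
    return kinds.get(infological_system, "information (general concept)")

print(information_kind("genome"))            # -> genetic information
print(information_kind("knowledge system"))  # -> cognitive information
```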

This parametric approach provides a tool for building the general theory of 
information as a synthetic approach, which organizes and encompasses all main 
directions in information theory (Burgin, 2010). On the meta-axiomatic level, 
it is formulated as a system of principles, explaining what information is (by 
means of Ontological Principles) and how to measure information (by means of 
Axiological Principles). On the level of science, mathematical models of 
information are constructed. One type of these models bases the mathematical 
stratum of the general theory of information on category theory (Burgin, 
2010a). Abstract categories allow us to develop flexible models for information 
and its flow, as well as for computers, networks and computation. Another type 
of models establishes a functional representation of infological systems, 
representing information as an operator in functional spaces. Namely, a Banach 
or Hilbert space serves as the state space of an infological system. Then 
transformations of infological systems are mathematically modeled by operators 
in Banach/Hilbert spaces (Burgin, 2010).
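
A minimal finite-dimensional sketch of this second type of model, under the 
simplifying assumption that the state space of an infological system is just R^n 
rather than a full Banach/Hilbert space: a portion of information is represented 
by an operator (here a matrix) that transforms the state of the infological 
system. The particular matrix and state vector are arbitrary illustrations.

```python
# Minimal finite-dimensional sketch: the state of an infological system as a
# vector, and a portion of information as a linear operator acting on it.
# (A genuine Banach/Hilbert-space model is generally infinite-dimensional; this
# only illustrates the idea "information = operator on the state space".)

def apply_operator(operator, state):
    """Apply a linear operator (given as a list of rows) to a state vector."""
    return [sum(a * x for a, x in zip(row, state)) for row in operator]

state = [1.0, 0.0, 2.0]              # state of the infological system (assumed)
info_operator = [[1, 0, 0],          # an arbitrary operator standing for a
                 [0, 0, 1],          # portion of information that transforms
                 [0, 1, 0]]          # the infological system's state
new_state = apply_operator(info_operator, state)
print(new_state)  # -> [1.0, 2.0, 0.0]
```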

Taking into account the current situation and the active quest for a unified 
theory of information (UTI) (Hofkirchner, 1999), it is natural to suggest the 
following questions for discussion, the answers to which may clarify the current 
situation in information theory and pave the way to new achievements in this 
area: 

1. Is it necessary/useful/reasonable to make a strict distinction between 
information as a phenomenon and information measures as quantitative or 
qualitative characteristics of information?

2. Are there types or kinds of information that are not encompassed by the 
general theory of information (GTI)?

3. Is it necessary/useful/reasonable to make a distinction between information 
and an information carrier?

 

Primary source:
Burgin, M. (2010) Theory of Information: Fundamentality, Diversity and 
Unification, New York/London/Singapore: World Scientific

Additional sources:
Burgin, M. (2003) Information Theory: A Multifaceted Model of Information, 
Entropy, 5(2), pp. 146-160

Burgin, M. (2003a) Information: Problem, Paradoxes, and Solutions, TripleC, v. 
1(1), pp. 53-70

Burgin, M. (2010a) Information Operators in Categorical Information Spaces, 
Information, v. 1, No.1, pp. 119-152 

Capurro, R., Fleissner, P., and Hofkirchner, W. (1999) Is a Unified Theory of 
Information Feasible? In The Quest for a unified theory of information, 
Proceedings of the 2nd International Conference on the Foundations of 
Information Science, pp. 9-30 

Hofkirchner, W. (Ed.) (1999) The Quest for a Unified Theory of Information, 
Proceedings of the Second International Conference on the Foundations of 
Information Science, Gordon and Breach Publ.

Marijuán, P.C. (2009) The Advancement of Information Science, TripleC, v. 7(2), 
pp. 369-375              

Shannon, C. E. (1993) Collected Papers, (N. J. A. Sloane and A. D. Wyner, Eds) 
IEEE Press, New York

