[Fis] RV: Something positive (From Arturo Tozzi)

2016-12-22 Thread PEDRO CLEMENTE MARIJUAN FERNANDEZ
From: tozziart...@libero.it [tozziart...@libero.it]
Sent: Thursday, 22 December 2016 14:08
To: fis@listas.unizar.es; PEDRO CLEMENTE MARIJUAN FERNANDEZ
Subject: Something positive

Dear FISers,

it's excruciating...
We have not even found a unique definition of information, life, brain activity, 
consciousness...
How can science improve if it lacks definitions of what it is talking about?
And the old problem of science: from above, or from below? Which is the better 
approach?

It seems that we have depicted a rather dark, hopeless picture...  However, there 
is, I think, a light in front of us.
The only way to pursue our common goal, I think, is to be free.
Free from our own beliefs.
Enlarge our horizons to other fields of science, apart from our own.
Forget metaphysics, of course.
Look at other disciplines, such as physics, medicine, engineering, biology, 
math...

Voltaire said: "Il faut cultiver notre jardin" ("We must cultivate our garden").  
But he was wrong.  We have to take care of more than a garden.
Your own garden is too narrow for your beautiful mind.

Therefore, TANTI AUGURI!
And I hope that, next year, at Christmas time 2017, every one of us 
will be an expert in a scientific field different from his own.

Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/

-
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] Seasons greetings

2016-12-22 Thread Andrew Fingelkurts / BM-Science
Dear members of FIS,

 

A new year is coming, and we hope that 2017 will be a fruitful year, full of
happiness, new exciting ideas, new experiments and nice results, and new
great times for you!

 

Merry Christmas and Happy New Year 2017!

 

Greetings,

Andrew and Alexander

__ 
Dr. Andrew Fingelkurts, Ph.D.
Co-Head of Research 


BM-Science - Brain & Mind Technologies Research Centre
PL 77
FI-02601 Espoo, FINLAND
 
Tel. +358 9 541 4506 
 
andrew.fingelku...@bm-science.com
 
www.bm-science.com/team/fingelkurts.html

 



Re: [Fis] What is information? and What is life?

2016-12-22 Thread Dai Griffiths
>  Information is not “something out there” which “exists” otherwise 
than as our construct.


I agree with this. And I wonder to what extent our problems in 
discussing information come from our desire to shoe-horn many different 
phenomena into the same construct. It would be possible to disaggregate 
the construct, and to discuss the topics which we address on 
this list without using the word 'information'. We could discuss 
redundancy, variety, constraint, meaning, structural coupling, 
coordination, expectation, language, etc.


In what ways would our explanations be weakened?

In what ways might we gain in clarity?

If we were to go down this road, we would face the danger that our 
discussions might become (even more) remote from everyday human 
experience. But many scientific discussions are remote from everyday 
human experience.


Dai

On 20/12/16 08:26, Loet Leydesdorff wrote:


Dear colleagues,

A distribution contains uncertainty that can be measured in terms of 
bits of information.


Alternatively: the expected information content H of a probability 
distribution is H = -SUM_i p_i * log2(p_i).


H is further defined as probabilistic entropy using Gibbs's 
formulation of the entropy, S = -k_B * SUM_i p_i * ln(p_i).
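
The operational character of this definition can be made concrete with a short sketch (not part of the original post; the function name `shannon_entropy` is my own): given any probability distribution, H is simply computed, with no need to say what information "essentially" is.

```python
import math

def shannon_entropy(probabilities):
    """Expected information content H of a discrete distribution, in bits."""
    # Terms with p = 0 contribute nothing (lim p*log2(p) -> 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform distribution maximizes uncertainty: 2 bits for 4 equiprobable outcomes.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A skewed distribution carries less expected information.
print(shannon_entropy([0.9, 0.05, 0.03, 0.02]))
# A certain outcome carries none.
print(shannon_entropy([1.0]))  # 0.0
```

Multiplying the same sum (with natural logarithms) by Boltzmann's constant k_B gives the Gibbs entropy, which is what licenses calling H "probabilistic entropy".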


This definition of information is an operational definition. In my 
opinion, we do not need an essentialistic definition by answering the 
question of “what is information?” As the discussion on this list 
demonstrates, one does not easily agree on an essential answer; one 
can answer the question “how is information defined?” Information is 
not “something out there” which “exists” otherwise than as our construct.


Using essentialistic definitions, the discussion tends not to move 
forward. Take, for example, Stuart Kauffman's and Bob Logan's (2007) 
definition of information "as natural selection assembling the very 
constraints on the release of energy that then constitutes work and 
the propagation of organization." I asked several times what this 
means and how one can measure this information. Hitherto, I have only 
obtained the answer that colleagues who disagree with me will be 
cited. :-) Another answer was that "counting" may lead to populism. :-)


Best,

Loet



Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex;

Guest Professor, Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC, Beijing;

Visiting Professor, Birkbeck, University of London;

http://scholar.google.com/citations?user=ych9gNYJ=en

*From:*Dick Stoute [mailto:dick.sto...@gmail.com]
*Sent:* Monday, December 19, 2016 12:48 PM
*To:* l...@leydesdorff.net
*Cc:* James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
*Subject:* Re: [Fis] What is information? and What is life?

List,

Please allow me to respond to Loet about the definition of information 
stated below.


1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)


I agree.  I struggled with this definition for a long time before 
realising that Shannon was really discussing the "amount of information", 
that is, the number of bits needed to convey a message.  He was looking for 
a formula that would provide an accurate estimate of the number of 
bits needed to convey a message, and he realised that this amount of 
information (number of bits) depended on the "amount" of uncertainty 
that had to be eliminated, and so he equated these.
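
Dick's point can be illustrated with a small sketch (my own addition, not from the original message; `questions_needed` is a hypothetical name): the number of yes/no questions required to single out one message from n equally likely possibilities is the uncertainty, in bits, that must be eliminated.

```python
import math

def questions_needed(n_messages: int) -> int:
    """Yes/no questions required to single out one of n equally likely messages."""
    # Each question can at best halve the remaining possibilities,
    # so ceil(log2(n)) questions eliminate all the uncertainty.
    return math.ceil(math.log2(n_messages))

for n in (2, 8, 100):
    print(n, questions_needed(n))  # 2 -> 1, 8 -> 3, 100 -> 7
```

The count measures how much uncertainty was removed; it says nothing about what the chosen message means, which is exactly the distinction between "amount of information" and "information" drawn below.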


It makes sense to do this, but we must distinguish between "amount of 
information" and "information".  For example, we can measure the amount of 
water in liters, but this does not tell us what water is; likewise, 
the measure we use for "amount of information" does not tell us what 
information is. We can, for example, equate the amount of water needed 
to fill a container with the volume of the container, but we should 
not think that water is therefore identical to an empty volume.  
Similarly, we should not think that information is identical to 
uncertainty.


By equating the number of bits needed to convey a message with the 
"amount of uncertainty" that has to be eliminated Shannon, in effect, 
equated opposites so that he could get an estimate of the number of 
bits needed to eliminate the uncertainty.  We should therefore not 
consider that this equation establishes what information is.


Dick

On 18 December 2016 at 15:05, Loet Leydesdorff wrote:


Dear James and colleagues,

Weaver (1949) made two major remarks about his coauthor Shannon's 
contribution:


1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)


2. "In particular,