Interesting papers. I have a few remarks, but no time right now. I heartily 
agree with your general point.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Hector Zenil <hzen...@gmail.com>
Sent: Wednesday, 29 March 2017 11:00 AM
To: Terrence W. DEACON <dea...@berkeley.edu>
Cc: fis <fis@listas.unizar.es>
Subject: Re: [Fis] Causation is transfer of information

With all due respect, I am still amazed at how thoroughly the science and
mathematics of information developed over the last 50-60 years are ignored
and neglected! Most people here cite, at best, only Shannon entropy, while
completely overlooking algorithmic complexity, logical depth, quantum
information and so on. Your philosophical discussions are quite empty if most
participants ignore the progress that computer science and mathematics have
made in the last 60 years! Please take this constructively. It should be an
embarrassment for the whole field of Philosophy of Information and for FIS.

Perhaps I can help alleviate this a little, even if it feels wrong to point
you to my own papers on subjects relevant to this philosophical discussion:

http://www.hectorzenil.net/publications.html

These papers do address the meaning and value of information beyond Shannon
entropy. For example, paper J21:

- Natural Scene Statistics Mediate the Perception of Image Complexity
(available online at
http://www.tandfonline.com/doi/abs/10.1080/13506285.2014.950365; a PDF
preprint is also available on the arXiv)

and

- Rare Speed-up in Automatic Theorem Proving Reveals Tradeoff Between 
Computational Time and Information Value (https://arxiv.org/abs/1506.04349).

And we even show how entropy fails at the most basic level:

Low Algorithmic Complexity Entropy-deceiving Graphs 
(https://arxiv.org/abs/1608.05972)
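
To make the point concrete, here is a toy one-dimensional analogue (an
illustrative sketch only, not the graph construction used in the paper): the
Thue-Morse sequence is produced by a one-line program, i.e. it has very low
algorithmic complexity, yet its symbol statistics give it the same Shannon
entropy as a genuinely random string.

  import math
  import random
  from collections import Counter

  def shannon_entropy(s):
      # empirical Shannon entropy of a symbol string, in bits per symbol
      n = len(s)
      return -sum(c / n * math.log2(c / n) for c in Counter(s).values())

  # Thue-Morse bits: parity of the number of 1-bits in i. One line
  # generates the whole sequence, so its algorithmic complexity is tiny,
  # yet 0s and 1s are perfectly balanced over any 2^k prefix.
  thue_morse = ''.join(str(bin(i).count('1') % 2) for i in range(4096))

  # a statistically random string of the same length, for comparison
  random.seed(0)
  random_bits = ''.join(random.choice('01') for _ in range(4096))

  print(shannon_entropy(thue_morse))   # 1.0 bit: entropy is "deceived"
  print(shannon_entropy(random_bits))  # ~1.0 bit: indistinguishable

Entropy assigns both strings the same, maximal value; only a computational
measure such as algorithmic complexity separates them.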

Best Regards,

Hector Zenil

On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON <dea...@berkeley.edu> wrote:
>
> Dear FIS colleagues,
>
> I agree with John Collier that we should not assume to restrict the concept 
> of information to only one subset of its potential applications. But to work 
> with this breadth of usage we need to recognize that 'information' can refer 
> to intrinsic statistical properties of a physical medium, extrinsic 
> referential properties of that medium (i.e. content), and the significance or 
> use value of that content, depending on the context. A problem arises when
> we demand that only one of these uses be given legitimacy. As I have
> repeatedly suggested on this listserv, asserting that someone's
> understanding of information is wrong because they use the term in one of
> these non-formal senses will be a source of constant, useless argument.
> But to fail to mark which conception of information is being considered, or 
> worse, to use equivocal conceptions of the term in the same argument, will 
> ultimately undermine our efforts to understand one another and develop a 
> complete general theory of information.
>
> This nominalization of 'inform' has been in use for hundreds of years in 
> legal and literary contexts, in all of these variant forms. But there has 
> been a slowly increasing tendency to use it to refer to the 
> information-bearing medium itself, in substantial terms. This reached its
> greatest extreme with the restricted technical usage formalized by Claude 
> Shannon. Remember, however, that this was only introduced a little over a 
> half century ago. When one of his mentors (Hartley) initially introduced a 
> logarithmic measure of signal capacity he called it 'intelligence' — as in 
> the gathering of intelligence by a spy organization. So had Shannon chosen to
> stay with that usage the confusions could have been worse (think about how 
> confusing it would have been to talk about the entropy of intelligence). Even 
> so, Shannon himself later cautioned against assuming that his use of the
> term 'information' applied beyond its technical domain.
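>
> For concreteness, a minimal sketch in modern notation (illustrative only,
> not Hartley's own formulation): his measure assigns a signal of n selections
> from s distinguishable symbols a capacity of n log s, and Shannon's entropy,
> -sum p log p, reduces to it when the symbols are equiprobable.
>
>   import math
>
>   # Hartley's measure: a signal of n selections from s distinguishable
>   # symbols has capacity n * log2(s) bits
>   def hartley(n, s):
>       return n * math.log2(s)
>
>   # Shannon's entropy allows unequal symbol probabilities; with all s
>   # symbols equiprobable it reduces to Hartley's log2(s) per symbol
>   def shannon(probs):
>       return -sum(p * math.log2(p) for p in probs if p > 0)
>
>   print(hartley(1, 8))                   # 3.0 bits for one 8-way choice
>   print(shannon([1/8] * 8))              # 3.0 bits: the equiprobable case
>   print(shannon([0.9] + [0.1 / 7] * 7))  # lower: a biased source says less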
>
> So despite the precision and breadth of application that was achieved by
> setting aside the extrinsic relational features that characterize the more 
> colloquial uses of the term, this does not mean that these other uses are in 
> some sense non-scientific. And I am not alone in the belief that these 
> non-intrinsic properties can also (eventually) be strictly formalized and 
> thereby contribute insights to such technical fields as molecular biology and 
> cognitive neuroscience.
>
> As a result I think that it is legitimate to argue that information (in the 
> referential sense) is only in use among living forms, that an alert signal 
> sent by the computer in an automobile engine is information (in both senses, 
> depending on whether we include a human interpreter in the loop), or that 
> information (in the intrinsic sense of a medium property) is lost within a 
> black hole, or that it can be used to provide a more precise conception of
> physical cause (as in Collier's sense). These different uses aren't unrelated 
> to each other. They are just asymmetrically dependent on one another, such 
> that medium-intrinsic properties can be investigated without considering 
> referential properties, but not vice versa.
>
> It's time we move beyond terminological chauvinism so that we can further our
> dialogue about the entire domain in which the concept of information is 
> important. To succeed at this, we only need to be clear about which 
> conception of information we are using in any given context.
>
> — Terry
>
> On Tue, Mar 28, 2017 at 8:32 PM, John Collier <colli...@ukzn.ac.za> wrote:
>>
>> I wrote a paper some time ago arguing that causal processes are the transfer 
>> of information. Therefore I think that physical processes can and do convey 
>> information. Cause can be dispensed with.
>>
>> There is a copy at: 'Causation is the Transfer of Information', in Howard
>> Sankey (ed.), Causation, Natural Laws and Explanation (Dordrecht: Kluwer,
>> 1999).
>>
>> Information is a very powerful concept. It is a shame to restrict oneself to 
>> only a part of its possible applications.
>>
>> John Collier
>> Emeritus Professor and Senior Research Associate
>> Philosophy, University of KwaZulu-Natal
>> http://web.ncf.ca/collier
>>
>
> --
> Professor Terrence W. Deacon
> University of California, Berkeley
>
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
