Dear all,

Just to comment on the discussion after Terrence's apt cautionary words...
The various notions of information are partially a linguistic confusion, partially a relic of multiple conceptual histories colliding, and partially an ongoing negotiation (or even a war, to put it less charitably and less civilly) over the future of the term as a (more or less unified) scientific concept. To join that negotiation, let me propose that an evolutionary approach to information can capture and explain some of that ambiguous multiplicity in terminology, by showing how pre-biotic natural processes developed feedback loops and material encoding techniques - a type of localised informational emergence - and how life, in developing cellular communication, DNA, sentience, memory, and selfhood, rarefied this process further, producing informational processing of a kind that had never existed before. Was it the same information? Or was it something new?

Human consciousness and cultural semiosis are a yet higher-level adaptation of information, and computer A.I. is something else entirely, for - at least for now - it lacks feelings and self-awareness, and thus "meaning" in the human sense. But it computes, stores, and processes. It might even develop suprasentience whose structure we cannot fathom from our limited human perspective. Is it still the same type of information? Or something different? Is the evolution one of quality (emergence) or only of quantity (continuous development)?

I generally take the Peircean view that signification (informative relationality) evolves, and that information, as an offshoot of it, is thus a multi-stage process - EVEN if it has a simple and predictable elemental substructure (composed of, say, 1s and 0s, or quarks and bosons). Information might thus have not only a complex history of emergence but also an unknown future, composed of various leaps in cosmic organization.

In ignorant wonder, all the best,

Otto Lehto,
philosopher, political economist,
PhD student at King's College London
webpage: www.ottolehto.com
cellphone: +358-407514748

On Mar 28, 2017 23:24, "Terrence W. DEACON" <dea...@berkeley.edu> wrote:

> Corrected typos (in case the intrinsic redundancy didn't compensate for these minor corruptions of the text):
>
> information-beqaring medium = information-bearing medium
> appliction = application
> conceptiont = conception
>
> On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON <dea...@berkeley.edu> wrote:
>
>> Dear FIS colleagues,
>>
>> I agree with John Collier that we should not assume to restrict the concept of information to only one subset of its potential applications. But to work with this breadth of usage we need to recognize that 'information' can refer to intrinsic statistical properties of a physical medium, to extrinsic referential properties of that medium (i.e. content), or to the significance or use value of that content, depending on the context. A problem arises when we demand that only one of these uses be given legitimacy. As I have repeatedly suggested on this listserv, it will be a source of constant useless argument to assert that someone is wrong in their understanding of information if they use it in one of these non-formal ways. But to fail to mark which conception of information is being considered, or worse, to use equivocal conceptions of the term in the same argument, will ultimately undermine our efforts to understand one another and to develop a complete general theory of information.
>> This nominalization of 'inform' (the noun 'information') has been in use for hundreds of years in legal and literary contexts, in all of these variant forms. But there has been a slowly increasing tendency to use it to refer to the information-bearing medium itself, in substantial terms. This reached its greatest extreme with the restricted technical usage formalized by Claude Shannon. Remember, however, that this usage was introduced only a little over half a century ago. When one of his mentors (Hartley) initially introduced a logarithmic measure of signal capacity, he called it 'intelligence' — as in the gathering of intelligence by a spy organization. Had Shannon chosen to stay with that usage, the confusions could have been worse (think how confusing it would have been to talk about the entropy of intelligence). Even so, Shannon himself later cautioned against assuming that his use of the term 'information' applied beyond its technical domain.
>>
>> So despite the precision and breadth of application achieved by setting aside the extrinsic relational features that characterize the more colloquial uses of the term, this does not mean that these other uses are in some sense non-scientific. And I am not alone in the belief that these non-intrinsic properties can also (eventually) be strictly formalized and thereby contribute insights to such technical fields as molecular biology and cognitive neuroscience.
>>
>> As a result, I think it is legitimate to argue that information (in the referential sense) is only in use among living forms; that an alert signal sent by the computer in an automobile engine is information (in both senses, depending on whether we include a human interpreter in the loop); or that information (in the intrinsic sense of a medium property) is lost within a black hole, or can be used to provide a more precise conception of physical cause (as in Collier's sense). These different uses aren't unrelated to each other. They are just asymmetrically dependent on one another, such that medium-intrinsic properties can be investigated without considering referential properties, but not vice versa.
>>
>> It's time we move beyond terminological chauvinism so that we can further our dialogue about the entire domain in which the concept of information is important. To succeed at this, we only need to be clear about which conception of information we are using in any given context.
>>
>> — Terry
>>
>> On Tue, Mar 28, 2017 at 8:32 PM, John Collier <colli...@ukzn.ac.za> wrote:
>>
>>> I wrote a paper some time ago arguing that causal processes are the transfer of information. Therefore I think that physical processes can and do convey information. Cause can be dispensed with.
>>>
>>> - There is a copy at Causation is the Transfer of Information <http://web.ncf.ca/collier/papers/causinf.pdf>, in Howard Sankey (ed.), *Causation, Natural Laws and Explanation* (Dordrecht: Kluwer, 1999).
>>>
>>> Information is a very powerful concept. It is a shame to restrict oneself to only a part of its possible applications.
>>> John Collier
>>> Emeritus Professor and Senior Research Associate
>>> Philosophy, University of KwaZulu-Natal
>>> http://web.ncf.ca/collier

>> --
>> Professor Terrence W. Deacon
>> University of California, Berkeley
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis