Re: [Fis] FW: The Information Flow

2012-11-20 Thread Karl Javorszky
Step Two of *Learn to Count in Twelve Easy Steps*

*What happened previously:*

Step 1.:

We have introduced additional aspects describing the logical sentence
a+b=c. Alongside a, b, c, we also make use of u=b-a, k=b-2a, t=2b-3a, q=a-2b,
s=17-(a+b|c), w=2a-3b. For a graphical presentation, see:
http://32o2m99e.utawebhost.at/index.php?option=com_contentview=articleid=130:01engraphcatid=9:angollang=en

Discussion of Step 1:

Joe from Switzerland writes: [this approach is]... potentially quite
dangerous. Alfred Korzybski (*Science and Sanity*) had an easy theory of
the mind that a high-school student could learn, and it led to
Scientology.

Answer:

Leaving the orthodox way can well end in sectarian extremism. The approach
of the Twelve Easy Steps is subversive insofar as it disobeys the Teacher’s
instruction: “Thou shalt not look into (a1-b1)-(a2-b2) if a1+b1=c=a2+b2 and
a1 ≠ a2”. Where this might end is indeed unpredictable.

*Step 2:*

Today we introduce the set of additions we shall use. We generate the 136
smallest pairs of a,b and their aspects {a,b,c,k,u,t,q,s,w}.
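
For anyone who wants to reproduce the table, here is a minimal Python sketch.
It assumes that "the 136 smallest pairs" means all pairs (a, b) with
1 <= a <= b <= 16 (which indeed gives 16*17/2 = 136), and it reads s as
17-(a+b); both readings are assumptions, not something stated above.

def aspects(a, b):
    # the nine aspects of the addition a + b = c introduced in Step 1
    c = a + b
    return {"a": a, "b": b, "c": c,
            "u": b - a, "k": b - 2 * a, "t": 2 * b - 3 * a,
            "q": a - 2 * b, "s": 17 - c, "w": 2 * a - 3 * b}

# all pairs (a, b) with 1 <= a <= b <= 16: 1 + 2 + ... + 16 = 136 of them
pairs = [(a, b) for b in range(1, 17) for a in range(1, b + 1)]
assert len(pairs) == 136

table = [aspects(a, b) for a, b in pairs]
print(table[0])   # aspects of the smallest pair (1, 1)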

Reason why:

We demonstrate properties of the individual against the background of the
multitude. To be able to do so, we need a multitude; this is why we
create one.

Why not less:

We see that Nature uses two sets of information carriers, both of which come
in triplets of four units. Therefore, we need 4 basic units. We see that the
basis of counting is related to the expression 2*i**2, and this gives
2*4**2=32.

Why not more:

We shall introduce the terms “sequential” and “contemporary” in Step 6. We
shall see in Step 10 that congruence between sequenced and contemporaneous
states will become inexact above n=136.

Data set:

The data set we use can be downloaded from:
http://32o2m99e.utawebhost.at/index.php?option=com_atrendezview=table1lang=en

Remark: The column “Permutáció” (permutation) shows the sequence of the
arguments used in the creation of the table. Its necessity will be
discussed in Step 10. For now, disregard it.


___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] FW: The Information Flow

2012-11-19 Thread John Collier


Dear folks,
Overall, I agree with Gordana. I have one, perhaps very large, correction
though. I will go through this bit by bit (no pun intended). I have been
planning to jump into this discussion, but a visit here (UFBA) by
Stuart Newman has kept me busy. He gave lots of examples of
non-computable processes in development, showing that developmental
units can be retained through changes in both function and genes. Gerd
Müller had told me this almost ten years ago, but Stuart has some
remarkable new cases, some of which have implications for evolution
and phylogeny. 

At 12:54 PM 2012/11/19, Bruno Marchal wrote:
Dear Joe, 
On 19 Nov 2012, at 12:26,
joe.bren...@bluewin.ch
wrote:

Dear Bruno, Gordana and All,
What I am resisting is any form of numerical-computational
totalitarianism. 
Joe, if you have an idea of an effective procedure that is not captured
by the Turing machine / recursive function model, and you can make clear
why it is not, then I will listen to that complaint. Turing did, in fact,
give an idea of how this might work in unpublished papers of his. He was
also the originator of reaction-diffusion processes (I would not call
them mechanisms, but they are commonly called that in the literature, since
it has become fashionable to call processes that are not mechanical in
the classical sense mechanisms). This sort of process is fundamental to
Stuart's arguments about the non-reducibility of differences in
developmental units to either genetic differences or to functional
differences. Note, in passing, the role of differences here. If
information is a difference that makes a difference, then we are talking
about types of information here. Actual systems that realized Turing's
mathematics were only created some time later, with the Brusselator (an
autocatalytic reaction scheme devised by Prigogine, but it is more), in the
material form of the Belousov–Zhabotinsky reaction (BZ reaction) from the
1950s. Such processes are very common in nature, for example in forming the
stripes on a zebra, or the correct end of a severed hydra.
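
For readers who have not seen a reaction-diffusion system in action, here is a
minimal numerical sketch in Python. It uses the Schnakenberg model rather than
the Brusselator or the BZ chemistry, and the parameters are illustrative, not
taken from any system mentioned above; the point is only that two species with
very different diffusion rates spontaneously form a spatial pattern out of
near-uniform noise, which is the Turing mechanism behind the zebra stripes.

import numpy as np

# Schnakenberg reaction-diffusion on a 1D periodic domain, explicit Euler.
# The inhibitor-like species v diffuses much faster than u, which is what
# drives the Turing (pattern-forming) instability.
n, steps, dt, dx = 200, 20000, 0.01, 1.0
Du, Dv = 0.5, 10.0
a, b = 0.1, 0.9

rng = np.random.default_rng(0)
u = 1.0 + 0.01 * rng.standard_normal(n)   # perturbed uniform steady state u* = a + b
v = 0.9 + 0.01 * rng.standard_normal(n)   # v* = b / (a + b)**2

def laplacian(x):
    # second difference with periodic boundary conditions
    return (np.roll(x, 1) + np.roll(x, -1) - 2 * x) / dx**2

for _ in range(steps):
    reaction_u = a - u + u * u * v
    reaction_v = b - u * u * v
    u = u + dt * (Du * laplacian(u) + reaction_u)
    v = v + dt * (Dv * laplacian(v) + reaction_v)

print("pattern amplitude of u:", u.max() - u.min())   # clearly nonzero once stripes form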
I have some remarks about computations to make next, which agree in
general with Gordana's remarks, but add another element that I think is
essential.
Computationalism would be
totalitarian if there were a computable universal program for the total
computable functions, and only them. But it is easy to prove that such a
total universal machine cannot exist.
This sort of machine is often called an 'oracle'. Greg Chaitin, who
originated algorithmic information theory independently of Kolmogorov,
constructed a number omega (my email programme won't take the
Greek capital letter). If we knew that number, then we could tell whether
any program halts. We could also give the first n digits of the output of
any program if we knew the first n digits of omega (assuming the same
programming language). The digits of omega, though, are provably random,
which means that no program substantially shorter than n can compute the
first n digits of omega, and hence no single program can produce the output
of every program. So there are no oracles.
We can get this from the unsolvability of the halting problem, but
Chaitin's approach shows what an oracle would have to look like.
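
For concreteness, the mechanism behind "knowing omega decides halting" is
dovetailing: omega is the sum of 2^(-length) over all halting programs, and it
can be approximated from below by running every program in parallel and adding
a program's weight the moment it halts; once the running total reaches the
known first n bits of omega, any not-yet-halted program of length up to n can
never halt. The toy Python sketch below shows only the dovetailing part; the
"programs", their bit-lengths, and the number of rounds are invented for
illustration.

from fractions import Fraction

def loops_forever():
    # a "program" that never halts
    while True:
        yield

def halts_after(n):
    # a "program" that halts after n steps
    def prog():
        for _ in range(n):
            yield
    return prog

# hypothetical program table: (name, bit-length, program)
programs = [("p1", 2, halts_after(3)),
            ("p2", 3, loops_forever),
            ("p3", 3, halts_after(10))]

omega_lower = Fraction(0)                 # running lower bound on omega
running = {name: (length, prog()) for name, length, prog in programs}

for _ in range(100):                      # dovetail: one step of each program per round
    for name in list(running):
        length, gen = running[name]
        try:
            next(gen)
        except StopIteration:             # the program halted: add its weight 2**(-length)
            omega_lower += Fraction(1, 2 ** length)
            del running[name]

print("lower bound on omega after 100 rounds:", omega_lower)   # 1/4 + 1/8 = 3/8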

The price of having a universal
machine is that it will be a partial machine, undefined on many
arguments. Then such a machine can be shown to be able to defeat all
reductionist or totalitarian theories about its own behavior.
That is why I say that computationalism is a vaccine against
totalitarianism or reductionism.
This is where I part company. There are two notions of computation. The
first and most widely used is that of an algorithm (specifically a Knuth
algorithm). It is equivalent to a program that halts, for all intents and
purposes. There are programs that don't halt, which compute partial
functions: there is no algorithm that gives a value for every input.
However, a program that could compute such a function everywhere (an
oracle) would be able to compute the first n values, for any n; it just
can't compute them all. For example, the three-body problem is known to
be uncomputable. However, with a computer powerful enough, and some
chosen epsilon, we could compute the later state of the system to
within epsilon for any arbitrary finite time t.
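
To make the epsilon point concrete, here is a rough Python sketch: a planar
three-body configuration integrated with a fixed-step RK4 scheme, halving the
step until two successive refinements agree to within a chosen epsilon at time
t. The masses, the (approximately figure-eight) initial conditions, the
tolerance and the final time are all illustrative choices, not values from the
discussion above.

import numpy as np

G = 1.0
m = np.array([1.0, 1.0, 1.0])   # three equal masses, illustrative units

def accelerations(pos):
    # pairwise Newtonian gravity on each body
    acc = np.zeros_like(pos)
    for i in range(3):
        for j in range(3):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += G * m[j] * r / np.linalg.norm(r) ** 3
    return acc

def rk4_step(pos, vel, dt):
    # one classical fourth-order Runge-Kutta step for the state (pos, vel)
    k1p, k1v = vel, accelerations(pos)
    k2p, k2v = vel + 0.5 * dt * k1v, accelerations(pos + 0.5 * dt * k1p)
    k3p, k3v = vel + 0.5 * dt * k2v, accelerations(pos + 0.5 * dt * k2p)
    k4p, k4v = vel + dt * k3v, accelerations(pos + dt * k3p)
    pos = pos + dt / 6 * (k1p + 2 * k2p + 2 * k3p + k4p)
    vel = vel + dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return pos, vel

def state_at(pos, vel, t_final, dt):
    for _ in range(int(round(t_final / dt))):
        pos, vel = rk4_step(pos, vel, dt)
    return pos

# approximately the well-known figure-eight configuration
pos0 = np.array([[-0.97000436,  0.24308753],
                 [ 0.97000436, -0.24308753],
                 [ 0.0,          0.0       ]])
vel0 = np.array([[ 0.46620369,  0.43236573],
                 [ 0.46620369,  0.43236573],
                 [-0.93240737, -0.86473146]])

epsilon, t_final, dt = 1e-6, 2.0, 0.01
coarse = state_at(pos0, vel0, t_final, dt)
fine = state_at(pos0, vel0, t_final, dt / 2)
while np.max(np.abs(fine - coarse)) > epsilon:   # refine until within epsilon
    dt /= 2
    coarse, fine = fine, state_at(pos0, vel0, t_final, dt / 2)

print("positions at t =", t_final, "agree to within", epsilon)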
What we can't do is solve reaction-diffusion systems. Why is that? It is
because dissipation is an essential part of their dynamics -- they need
an energy input to work (this is true of all dissipative systems in the
Prigogine sense -- I say Prigogine sense since there are systems that
self-organize through dissipation that are not dissipative systems; they
are only self-reorganizing, not spontaneously self-organizing -- some
people confuse the two, or are careless about distinguishing them, such
as Stuart Kauffman). In any case, to reach a steady state in finite time
(e.g., the BZ reaction oscillations) they do the noncomputable in finite
time. Harmonics in the Solar System, such as the 1-1 

Re: [Fis] FW: The Information Flow

2012-11-19 Thread Robert Ulanowicz
Quoting John Collier colli...@ukzn.ac.za:

As I have tried to argue above, to avoid reductionism in reality as  
opposed to in logic and mathematics I think we need the additional  
condition of dissipation (what I call nonHamiltonian mechanics  
elsewhere -- the usual condition of conservation breaks down due to  
the loss of free energy to the system).

John,

Your point underscores my earlier one. Dissipation is emblematic of  
entropic processes -- which make ours an open world. There's no  
wishing that away!

Bob

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] Fw: The Information Flow

2012-10-27 Thread Joseph Brenner
Dear Pedro,

I, and I am sure most of us, are grateful when you open up the debate in this 
way. To go farther, though, people must be ready to ask many questions about 
familiar concepts, such as the following:

1. Are there serious alternatives to Aristotelian causality?
2. Is it possible to combine insights from Heraclitus and Parmenides to get 
the advantages of both in complex domains? 
3. Can non-mechanistic thought be expressed in sufficiently rigorous terms to 
avoid slipping into non-sense and non-science?
4. In reply to your Why?, can an explanation of the refusal of people to 
accept the necessary changes in mind-set be related to genetic + environmental 
factors that also determine other doubtful polarizations (like voting for 
Romney-Ryan) or criminal behavior?

As I have tried to express in this forum from time to time in relation to 
other issues, my answer to all the above questions is yes. But it takes new 
work and a new attitude. As a personal example, I have asked about 12 (!) 
mathematicians to help me express the calculus of implications of my Logic in 
Reality in alternative, more familiar terms. None has done so, nor said 
that it is not possible. 

As another example, after some effort, the first article in proper English by 
Wu Kun of the Institute for the Philosophy of Information in Xi'an has just 
been published online in Information. People who assume, however, that his 
view of philosophy is not of critical importance to information science are 
making just the kind of mistake Pedro tells us to avoid! It is a 
metaphilosophy, a recasting of the underlying assumptions of scientific - 
natural and social - thought in informational terms. I urge you all to look at 
it.

Best wishes,

Joseph 

- Original Message - 
From: PEDRO CLEMENTE MARIJUAN FERNANDEZ 
To: fis@listas.unizar.es 
Sent: Friday, October 26, 2012 10:32 PM
Subject: Re: [Fis] The Information Flow


Dear FISers,

Is it interesting to discuss whether those informational entities contain 
realizations of the Aristotelian scheme of causality or not? 

The cell, in my view, conspicuously fails --it would be too artifactual a 
scheme. Some parts of the sensory paths of advanced nervous systems seem to 
separate some of those causes --but only in a few parts or patches of the 
pathway concerned. For instance, in visual processing the what and the 
how/where seem to travel together undifferentiated along the optic 
nerve and are separated --more or less-- after the superior colliculus 
in the midbrain, before discharging onto the visual cortex. The really big flow 
of spikes arriving each instant (many millions every few milliseconds) is mixed 
and correlated with itself and with other top-down and bottom-up preexisting 
flows in multiple neural mappings... and further, when those flows mix in the 
association areas under the influence of language, then, and only then, all 
those logical and conceptual categorizations of human thought are enacted in 
the ephemeral synaptic networks. 

I am optimistic that a new Heraclitean way of thinking is taking shape in 
network science, neuroinformatics, systems biology, bioinformation, etc.: 
neither the Parmenidean eliminative fixism of classical reductionists, nor the 
Aristotelian organicism of systemicists. You may say that this is a caricature. 
However, you cannot bathe twice in the same river, not just because we are all 
caught in the universal physical flow of photons and forces, but because of the 
Heraclitean flux of our own neurons and brains, the inner torrents of the 
aggregated information flows. The same holds for cells, societies, etc. and 
their physical structures for information transport. 

Either we produce an interesting new vision of the world, finally making sense 
of those perennial metaphors among the different (informational) realms, or 
information science will continue to be that small collection of incoherent 
patches more or less close to information theory or to artificial intelligence. 
In spite of decades of blah-blah about the information revolution and the 
information society, and tons of ad hoc literature, the educated thought of our 
contemporary society continues to be deeply mechanistic! 

Why?

best wishes

---Pedro


 

[Fis] FW: The Information Flow (From John Collier)

2012-10-22 Thread Pedro C. Marijuan
*From:* John Collier
*Sent:* 21 October 2012 11:22 PM
*To:* fis
*Subject:* Re: [Fis] The Information Flow

At 06:14 PM 2012/10/21, Stanley N Salthe wrote:
Pedro -- it is of interest to me that

On Sun, Oct 21, 2012 at 3:38 PM, PEDRO CLEMENTE MARIJUAN FERNANDEZ 
pcmarijuan.i...@aragon.es wrote:

Dear FISers,

Continuing with the comments on the how versus the what: it
is an important topic in mammalian (vertebrate) nervous
systems. The two are subtended by mostly separate neural tracts
(though partially interconnected): the dorsal stream,
specialized in the how and where, and the ventral stream,
concerned with the what.


-snip-

I think it of some interest that I have previously (2006. On
Aristotle’s conception of causality. General Systems Bulletin 35:
11) proposed that the Aristotelian 'formal cause' determines both
'what happens' and 'how it happens', and that the combination of
this with material cause ('what it happens to') delivers 'where' it
happens.

(For completeness' sake I add that efficient cause determines only
'when it happens', while final cause points to 'why it happens'. It
would be quite exciting to find that these kinds of information were also
carried on separate tracts.)


It would be exciting, as that would seem to refute the Aristotelean idea 
of the four causes as four aspects of all causation. However, an 
information channel can carry some part of the information from its 
source, which would be a sort of filter or abstraction of the source. 
So, for example, a channel might be sensitive only to the how, but not 
the what, and vice versa. A channel is fundamentally a mapping of 
classes from a source to a sink, realized through instances that retain the 
mapping (see Barwise and Seligman, Information Flow: The Logic of 
Distributed Systems). So in this case, a channel sensitive to, say, 
the what would retain the what classifications of the source in a way 
that the sink could use, but perhaps not any other information. The 
channels themselves could still maintain all four aspects of 
Aristotelean causation, so Aristotle need not be refuted. This would 
still be very interesting, though. I am unclear what functional 
advantage there would be, though we certainly manage to separate these 
causes in much of our thinking (perhaps, even, we can't help it).
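
If it helps to see the filtering idea in miniature, here is a toy Python
sketch. It is not the Barwise-Seligman formalism itself, and every token and
type in it is invented: a source whose tokens carry both a 'what' and a
'where' classification, and two channels, each of which passes only one of
those classifications through to the sink.

# Source tokens (hypothetical visual events) classified along two dimensions.
source_tokens = {
    "event1": {"what": "edge", "where": "upper-left"},
    "event2": {"what": "blob", "where": "lower-right"},
    "event3": {"what": "edge", "where": "lower-right"},
}

def what_channel(types):
    # a channel sensitive only to the 'what' classification
    return {"what": types["what"]}

def where_channel(types):
    # a channel sensitive only to the 'where' classification
    return {"where": types["where"]}

# What each sink receives: one channel preserves kind but not location,
# the other preserves location but not kind.
what_sink = {tok: what_channel(t) for tok, t in source_tokens.items()}
where_sink = {tok: where_channel(t) for tok, t in source_tokens.items()}

print(what_sink["event1"])    # {'what': 'edge'}        -- no 'where' survives
print(where_sink["event1"])   # {'where': 'upper-left'} -- no 'what' survives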

Cheers,
John

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis