(At the risk of adding noise to the conversation, since I'm not discussing formal detection of causality in data.)
I have recently been thinking about how, although the Second Law only applies to closed systems, "something of it" seems very relevant for us on Earth (an open system). When entropy-as-time's-arrow is discussed, people usually bring up examples like "you can't un-scramble an egg" and "you can't un-break the glass". These examples are of open systems; actually, *very* open systems! An egg is scrambled, and a glass broken, almost entirely due to an influx of energy from outside. I think what's going on here is roughly this: energy coming into a system will *also* tend to increase entropy, unless it is coordinated with the system in a fine-tuned way. (On the other hand, energy leaving a system typically decreases entropy.)

There is also an easy confusion between entropy and ordinary disorder. Entropy is more about the dispersal of energy in a system. The two examples I mentioned for the arrow of time are more about how matter is arranged. Objects tend to become tattered and disorderly over time. In principle, it should be possible to reverse this. In practice, though, we most often make *new* objects when we want to decrease disorder. This applies to both organic and inorganic objects: deciduous trees shed their leaves and allow them to decay, sprouting fresh new leaves to replace them; factories make new brooms as old ones get thrown away. So we see an "arrow of time" in which objects appear very regular and "fresh" at the beginning of their life and grow increasingly chaotic -- accumulating scars.

This concept of "accumulating scars" is interesting because a scar contains some information about the accident which created it. Physics says that we can run time backwards, but intuitively this involves huge coincidences: objects carry scars which are later removed by precise "accidents".

This arrow of time is different from the 2nd law of thermodynamics, because it applies just as well to open systems. If you start a computer simulation of water with a drop of dye in it, the coloring will first spread out in complex patterns, and then eventually become evenly spread through all the water. Take the same starting system and run the equations backward, and the *same thing* will happen. Trying to reverse time does not reverse this kind of phenomenon. (A toy sketch of this is below.)

My current understanding of time is "the direction of computation". Even if the physical laws are symmetric, you can tell which direction they are being run! The way I see it, there are a number of things that can be said about "what things look like" in an unfolding computation. These phenomena are not as nice and precise as the 2nd law of thermodynamics, but they are what provides our arrow of time.
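Here is a minimal sketch of the kind of simulation I mean. It is not molecular dynamics; it is just an unbiased random walk standing in for the dye molecules, and the particle count and step count are arbitrary. The point is that the update rule is unchanged when you flip the sign of the time step, so the dye disperses either way:

    import numpy as np

    rng = np.random.default_rng(0)

    def dispersal(n_steps, dt):
        """Spread of 10,000 'dye' particles that all start at x = 0."""
        x = np.zeros(10_000)
        for _ in range(n_steps):
            # Unbiased kicks: the rule is symmetric under dt -> -dt,
            # i.e. under "running the equations backward".
            x += dt * rng.choice([-1.0, 1.0], size=x.size)
        return x.std()

    print(dispersal(1000, dt=+1.0))  # forward: the dye spreads out
    print(dispersal(1000, dt=-1.0))  # "backward": it spreads out just the same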
On Tue, Nov 25, 2014 at 3:53 AM, Ben Goertzel <[email protected]> wrote:
>
> Hmmm...
>
> Having thought about this more, while I was indeed traveling backwards
> in time when I wrote the previous email, it's not too relevant anyhow,
> because the Second Law only holds globally, and in complex systems
> there are many subsystems that are behaving anti-entropically. So I'm
> not sure one can use the law of entropy increase to draw conclusions
> about local causality.
>
> However, I was thinking about section 6.3.2 of
>
> http://cqi.inf.usi.ch/qic/94_Lloyd.pdf
>
> where Seth Lloyd observes that
>
> "Having a common effect does not induce correlation between events,
> while having a common cause does."
>
> I.e.
>
> -- In the case of two causes with a common effect, there is an
> increase of information from past to future (the probability spread
> across two causes is now concentrated on a single effect). There is
> no correlation in the past (between the causes). This is the opposite
> direction of the Second Law of Thermodynamics.
>
> -- In the case of two effects with a common cause, there is a
> decrease of information from past to future (the probability
> concentrated in one cause is now spread across two effects). There
> is correlation in the future (between the effects). This is in the
> direction of the Second Law of Thermodynamics.
>
> ...
>
> I.e. in many cases the direction of causal influence may be
> identifiable as the direction of increasing correlation... I'm not
> sure exactly what the limits of this conclusion are, though.
>
> ...
>
> So -- what if one has two sets of variables, S and T, and there is
> significant mutual information between the values of S and the values
> of T, as evaluated across different cases? Suppose we have both
>
> S --> T
>
> and
>
> T --> S
>
> in a sense... But if there is significantly more correlation
> among the variables within T than among the variables within S, then
> we can say that it's more likely that T is the effect and S is the
> cause...
>
> The asymmetry used to identify causation is then one of correlation
> rather than of temporality directly...
>
> This may be a way of heuristically inferring causality from
> non-temporal data, if one has a sufficient ensemble of data samples...
>
> -- Ben
>
> On Tue, Nov 25, 2014 at 1:46 PM, Ben Goertzel <[email protected]> wrote:
> >
> > Hmm, maybe you're right; maybe I was traveling backwards in time when
> > I wrote that...
> >
> > (More later)
> >
> > On Tuesday, November 25, 2014, martin biehl <[email protected]> wrote:
> >>
> >> Hm, sounds interesting, but I don't get it either. If entropy
> >> increases, the uncertainty of the state increases and information
> >> (about the state) decreases, as you say, but why would the past then
> >> contain more information about the future than vice versa? Let X be
> >> the past and Y be the future; then, since mutual information is
> >> symmetric:
> >>
> >> H(X) - H(X|Y) = H(Y) - H(Y|X)
> >>
> >> Now H(Y) > H(X), because of entropy increase. Then
> >>
> >> H(Y|X) > H(X|Y)
> >>
> >> and the future should be more uncertain given the past than vice
> >> versa. Where did this go wrong?
> >>
> >> On Tue, Nov 25, 2014 at 2:13 AM, Ben Goertzel via AGI
> >> <[email protected]> wrote:
> >>>
> >>> Information is negentropy, so increase of entropy implies decrease
> >>> of information...
> >>>
> >>> Acquiring information about a system is associated with entropy
> >>> production...
> >>>
> >>> On Tue, Nov 25, 2014 at 9:59 AM, Aaron Nitzkin
> >>> <[email protected]> wrote:
> >>> > Sorry, I must be a little confused -- probably thinking from the
> >>> > wrong perspective . . . I would think that there is more
> >>> > information in the future about the past than vice versa, because
> >>> > we know more about the past than we do about the future. Also,
> >>> > doesn't an increase in entropy imply an increase in information
> >>> > (because it requires more information to specify the configuration
> >>> > of a system with higher entropy than the same system with lower
> >>> > entropy)?
> >>> >
> >>> > On Tue, Nov 25, 2014 at 8:27 AM, Ben Goertzel <[email protected]> wrote:
> >>> >>
> >>> >> In the early part of the paper, the author clarifies that while
> >>> >> he assumes "temporal precedence as an aspect of causality" for
> >>> >> simplicity, his approach would actually work with any other
> >>> >> systematic way of assigning asymmetric directions to
> >>> >> relationships between events.
> >>> >>
> >>> >> I have been thinking a lot about how to infer causality from
> >>> >> non-time-series data (e.g. categorical gene expression data),
> >>> >> and this is a case where looking at some other sort of asymmetry
> >>> >> than temporal precedence (but one that may generally be
> >>> >> correlated with temporal precedence) seems to make sense. E.g.
> >>> >> I've been thinking about looking at informational asymmetry: if
> >>> >> one has P(A=a | B=b), one can look at whether the distribution
> >>> >> for A gives more information about the distribution for B, or
> >>> >> vice versa. This informational asymmetry can be used similarly
> >>> >> to temporal asymmetry in defining causality. Furthermore, on
> >>> >> average it is going to correlate with temporal asymmetry,
> >>> >> because the past tends to contain more information about the
> >>> >> future than vice versa (due to entropy increase, roughly
> >>> >> speaking... but there's more to the story here...)
> >>> >>
> >>> >> -- Ben
> >>> >>
> >>> >> On Tue, Nov 25, 2014 at 5:34 AM, Michael van der Gulik
> >>> >> <[email protected]> wrote:
> >>> >> > "Chapter 1. Quantum mechanics..."
> >>> >> >
> >>> >> > It's a nice article; I'll add it to my reading list.
> >>> >> > Prediction involves working out what causes what, so it's
> >>> >> > pretty fundamental.
> >>> >> >
> >>> >> > I have a question. Causation in my mind seems to always
> >>> >> > involve time, and I suspect it's impossible to have causation
> >>> >> > without including timing. So...
> >>> >> >
> >>> >> > Is it possible for a cause to happen at exactly the same
> >>> >> > moment as its effect?
> >>> >> >
> >>> >> > Is it possible for a cause to happen after its effect?
> >>> >> >
> >>> >> > One instance I'm trying to get my head around is when an
> >>> >> > intelligence anticipates a cause (which is an event in the
> >>> >> > future), which results in the intelligence acting such that
> >>> >> > the effect occurs before the cause. Perhaps the anticipation
> >>> >> > itself is the causal event.
> >>> >> >
> >>> >> > Regards,
> >>> >> > Michael.
> >>> >> >
> >>> >> > On Sun, Nov 23, 2014 at 7:17 AM, Ben Goertzel
> >>> >> > <[email protected]> wrote:
> >>> >> >>
> >>> >> >> I just happened across this 2011 paper on the probabilistic
> >>> >> >> foundation of causality,
> >>> >> >>
> >>> >> >> http://philsci-archive.pitt.edu/9729/1/Website_Version_2.pdf
> >>> >> >>
> >>> >> >> which seems to carefully clarify a bunch of issues that
> >>> >> >> remain dangling in prior discussions of the topic.
> >>> >> >>
> >>> >> >> It seems to give a good characterization of what it means for
> >>> >> >> "P to appear to cause Q, based on the knowledge-base of
> >>> >> >> observer O".
> >>> >> >>
> >>> >> >> --
> >>> >> >> Ben Goertzel, PhD
> >>> >> >> http://goertzel.org
> >>> >> >>
> >>> >> >> "The reasonable man adapts himself to the world: the
> >>> >> >> unreasonable one persists in trying to adapt the world to
> >>> >> >> himself. Therefore all progress depends on the unreasonable
> >>> >> >> man."
> >>> >> >> -- George Bernard Shaw
> >>> >> >
> >>> >> > --
> >>> >> > http://gulik.pbwiki.com/
--
Abram Demski
Blog: http://lo-tho.blogspot.com/
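P.S. To make Martin's symmetry point above concrete, here is a quick numeric sanity check (a sketch only; the 2x2 joint distribution is made up for illustration):

    import numpy as np

    # Made-up joint distribution p(x, y) over two binary variables.
    p = np.array([[0.30, 0.10],
                  [0.05, 0.55]])

    def H(dist):
        """Shannon entropy in bits, ignoring zero-probability entries."""
        dist = dist[dist > 0]
        return -np.sum(dist * np.log2(dist))

    H_X, H_Y, H_XY = H(p.sum(axis=1)), H(p.sum(axis=0)), H(p.ravel())
    H_X_given_Y = H_XY - H_Y  # H(X|Y) = H(X,Y) - H(Y)
    H_Y_given_X = H_XY - H_X  # H(Y|X) = H(X,Y) - H(X)

    # Mutual information is symmetric: both sides come out equal.
    print(H_X - H_X_given_Y, H_Y - H_Y_given_X)
    # And the inequalities track each other: H(Y) > H(X) exactly when
    # H(Y|X) > H(X|Y), as Martin's algebra says.
    print(H_Y > H_X, H_Y_given_X > H_X_given_Y)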
