Martin,

Hmmm -- This is "explaining away" as discussed in the Bayes Nets
literature, right?

https://en.wikipedia.org/wiki/Multivariate_mutual_information#Example_of_Negative_Multivariate_mutual_information

The examples on that wiki page are ones where common cause corresponds
to positive MMI, and common effect corresponds to negative MMI.  I
wonder how generally that holds?
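One can check those two cases directly. A minimal sketch, using the XOR gate as the common-effect example and a copied bit as the common-cause example, with the sign convention I(X;Y;Z) = I(X;Y) - I(X;Y|Z) from that wiki page:

```python
from itertools import product
from math import log2

def H(joint, axes):
    """Entropy (bits) of the marginal over the given axes of a joint dist."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[a] for a in axes)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

def mmi(joint):
    """Multivariate mutual information I(X;Y;Z) of a 3-variable joint:
    I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z)."""
    return (H(joint, (0,)) + H(joint, (1,)) + H(joint, (2,))
            - H(joint, (0, 1)) - H(joint, (0, 2)) - H(joint, (1, 2))
            + H(joint, (0, 1, 2)))

# Common effect: X, Y independent fair bits, Z = X XOR Y
xor = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

# Common cause: Z a fair bit, X = Y = Z
cause = {(z, z, z): 0.5 for z in (0, 1)}

print(mmi(xor))    # -1.0: negative MMI, "explaining away"
print(mmi(cause))  #  1.0: positive MMI
```

The XOR case is exactly Martin's point below: I(X;Y) = 0 unconditionally, but given Z = 1 the inputs must be opposite, so I(X;Y|Z) = 1 bit and the MMI comes out negative.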

ben

On Wed, Nov 26, 2014 at 12:35 AM, martin biehl <[email protected]> wrote:
> "Having a common effect does not induce correlation between events,
> while having a common cause does."
>
> is possibly not always true. Take the infamous XOR gate: if the output
> (effect) is known to be 1, this implies a correlation between the two
> inputs, i.e. they must be opposites.
>
> is this a counterexample?
>
> On Tue, Nov 25, 2014 at 11:53 AM, Ben Goertzel <[email protected]> wrote:
>>
>> Hmmm...
>>
>> Having thought about this more, while I was indeed traveling backwards
>> in time when I wrote the previous email, it's not too relevant anyhow
>> because the Second Law only holds globally, and in complex systems
>> there are many subsystems that are behaving anti-entropically.  So I'm
>> not sure one can use the law of entropy increase to draw conclusions
>> about local causality.
>>
>> However, I was thinking about section 6.3.2 of
>>
>> http://cqi.inf.usi.ch/qic/94_Lloyd.pdf
>>
>> where Seth Lloyd observes that
>>
>> "Having a common effect does not induce correlation between events,
>> while having a common cause does."
>>
>> I.e.
>>
>> -- In the case of two causes with a common effect ... there is an
>> increase of information from past to future (the probability spread
>> across two causes is now concentrated on a single effect).   There is
>> no correlation in the past (between the causes).   This runs opposite
>> to the direction of the Second Law of Thermodynamics.
>>
>> -- In the case of two effects with a common cause ...  there is a
>> decrease of information from past to future (the probability
>> concentrated in one cause is now spread across two effects).   There
>> is correlation in the future (between the effects).  This is in the
>> direction of the Second Law of Thermodynamics.
>>
>> ...
>>
>> I.e. in many cases the direction of causal influence may be
>> identifiable as the direction of increasing correlation....   I'm not
>> sure exactly what the limits of this conclusion are, though.
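A small numeric sketch of Lloyd's two cases (the AND gate for the common effect and the copied bit for the common cause are my own illustrative choices, not Lloyd's):

```python
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Common effect: independent fair causes X, Y feed an AND gate E = X & Y.
causes = {(x, y): 0.25 for x, y in product((0, 1), repeat=2)}
effect = {}
for (x, y), p in causes.items():
    effect[x & y] = effect.get(x & y, 0.0) + p

print(entropy(causes))  # 2.0 bits spread across the two causes
print(entropy(effect))  # ~0.811 bits concentrated on the single effect

# Common cause: one fair cause C copied into two effects E1 = E2 = C.
cause = {0: 0.5, 1: 0.5}
effects = {(c, c): p for c, p in cause.items()}
print(entropy(cause))    # 1.0 bit in the single cause
print(entropy(effects))  # 1.0 bit jointly (not 2.0): the effects are now
                         # perfectly correlated
```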
>>
>> ...
>>
>> So --   What if one has two sets of variables, S and T, and there is
>> significant mutual information between the values of S and the values
>> of T, as evaluated across different cases...?   So, suppose we have
>> both
>>
>> S --> T
>>
>> and
>>
>> T --> S
>>
>> in a sense....    But if there is significantly more correlation
>> among the variables within T than among the variables within S, then
>> we can say that it's more likely that T is the effect and S is the
>> cause...
>>
>> The asymmetry used to identify causation is then one of correlation
>> rather than of temporality directly...
>>
>> This may be a way of heuristically inferring causality from
>> non-temporal data, if one has a sufficient ensemble of data samples...
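The S/T heuristic could be sketched roughly as below. The sampling setup, the AND/OR effect variables, and the plug-in mutual information estimator are all my own illustrative assumptions, not a worked-out method:

```python
from itertools import combinations
from math import log2
import random

def mutual_info(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    n = len(xs)
    pxy, px, py = {}, {}, {}
    for x, y in zip(xs, ys):
        pxy[(x, y)] = pxy.get((x, y), 0.0) + 1 / n
        px[x] = px.get(x, 0.0) + 1 / n
        py[y] = py.get(y, 0.0) + 1 / n
    return sum(p * log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())

def mean_pairwise_mi(columns):
    """Average mutual information over all pairs of variables in a set."""
    pairs = list(combinations(columns, 2))
    return sum(mutual_info(a, b) for a, b in pairs) / len(pairs)

random.seed(0)
# S: two independent cause variables; T: two effects of a shared input.
s1 = [random.randint(0, 1) for _ in range(5000)]
s2 = [random.randint(0, 1) for _ in range(5000)]
t1 = [a & b for a, b in zip(s1, s2)]
t2 = [a | b for a, b in zip(s1, s2)]

print(mean_pairwise_mi([s1, s2]))  # near 0: the causes are uncorrelated
print(mean_pairwise_mi([t1, t2]))  # clearly positive: the effects correlate
```

Under the proposed heuristic, the higher within-set correlation of T would mark T as the likely effect and S as the likely cause.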
>>
>> -- Ben
>>
>>
>> On Tue, Nov 25, 2014 at 1:46 PM, Ben Goertzel <[email protected]> wrote:
>> >
>> > Hmm, maybe you're right , maybe I was traveling backwards in time when I
>> > wrote that ...
>> >
>> > (More later)
>> >
>> > On Tuesday, November 25, 2014, martin biehl <[email protected]> wrote:
>> >>
>> >> hm, sounds interesting, but I don't get it either. If entropy increases,
>> >> the uncertainty of the state increases and information (about the state)
>> >> decreases, as you say, but why would the past then contain more
>> >> information about the future than vice versa? Let X be the past and Y be
>> >> the future; then, as mutual information is symmetric:
>> >> H(X) - H(X|Y) = H(Y) - H(Y|X)
>> >> Now H(Y) > H(X) because of entropy increase, so
>> >> H(Y|X) > H(X|Y)
>> >> and the future should be more uncertain given the past than vice versa.
>> >> Where did this go wrong?
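Martin's derivation can be checked numerically. A minimal sketch, assuming the past X is a biased bit and the future Y is X passed through a binary symmetric channel (the 0.1 bias and 0.3 flip probability are arbitrary illustrative parameters):

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Past X: biased bit with P(X=1) = 0.1.  Future Y: X sent through a binary
# symmetric channel with flip probability 0.3, so entropy increases.
px1, flip = 0.1, 0.3
py1 = px1 * (1 - flip) + (1 - px1) * flip  # P(Y=1) = 0.34

H_X, H_Y = h(px1), h(py1)
H_Y_given_X = h(flip)                  # channel noise entropy
H_X_given_Y = H_X + H_Y_given_X - H_Y  # rearranged from the symmetry above

print(H_Y > H_X)                  # True: entropy increased
print(H_Y_given_X > H_X_given_Y)  # True: future more uncertain given past
```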
>> >>
>> >>
>> >> On Tue, Nov 25, 2014 at 2:13 AM, Ben Goertzel via AGI <[email protected]>
>> >> wrote:
>> >>>
>> >>> Information is negentropy, so increase of entropy implies decrease of
>> >>> information...
>> >>>
>> >>> Acquiring information about a system is associated with entropy
>> >>> production...
>> >>>
>> >>> On Tue, Nov 25, 2014 at 9:59 AM, Aaron Nitzkin <[email protected]>
>> >>> wrote:
>> >>> > Sorry, I must be a little confused -- probably thinking from the
>> >>> > wrong perspective . . . I would think that there is more information
>> >>> > in the future about the past than vice versa, because we know more
>> >>> > about the past than we do about the future. And also, doesn't increase
>> >>> > in entropy imply increase in information (because it requires more
>> >>> > information to specify the configuration of a system with higher
>> >>> > entropy than the same system with lower entropy)?
>> >>> >
>> >>> > On Tue, Nov 25, 2014 at 8:27 AM, Ben Goertzel <[email protected]>
>> >>> > wrote:
>> >>> >>
>> >>> >> In the early part of the paper, the author clarifies that while he
>> >>> >> assumes "temporal precedence as an aspect of causality" for
>> >>> >> simplicity, his approach would actually work with any other
>> >>> >> systematic way of assigning asymmetric directions to relationships
>> >>> >> between events.
>> >>> >>
>> >>> >> I have been thinking a lot about how to infer causality from
>> >>> >> non-time-series data (e.g. categorical gene expression data), and
>> >>> >> this is a case where looking at some other sort of asymmetry than
>> >>> >> temporal precedence (but one that may generally be correlated with
>> >>> >> temporal precedence) seems to make sense.   E.g. I've been thinking
>> >>> >> about looking at informational asymmetry: if one has P(A = a | B = b),
>> >>> >> one can look at whether the distribution for A gives more information
>> >>> >> about the distribution for B, or vice versa.   This informational
>> >>> >> asymmetry can be used similarly to temporal asymmetry in defining
>> >>> >> causality.  Furthermore, on average it is going to correlate with
>> >>> >> temporal asymmetry, because the past tends to contain more
>> >>> >> information about the future than vice versa (due to entropy
>> >>> >> increase, roughly speaking... but there's more to the story here...)
>> >>> >>
>> >>> >> -- Ben
>> >>> >>
>> >>> >>
>> >>> >> On Tue, Nov 25, 2014 at 5:34 AM, Michael van der Gulik
>> >>> >> <[email protected]> wrote:
>> >>> >> > "Chapter 1. Quantum mechanics... "
>> >>> >> >
>> >>> >> > It's a nice article; I'll add it to my reading list. Prediction
>> >>> >> > involves
>> >>> >> > working out what causes what, so it's pretty fundamental.
>> >>> >> >
>> >>> >> > I have a question. Causation in my mind seems to always involve
>> >>> >> > time, and I suspect it's impossible to have causation without
>> >>> >> > including timing. So...
>> >>> >> >
>> >>> >> > Is it possible for a cause to happen at exactly the same moment as
>> >>> >> > its effect?
>> >>> >> >
>> >>> >> > Is it possible for a cause to happen after its effect?
>> >>> >> >
>> >>> >> > One instance I'm trying to get my head around is when an
>> >>> >> > intelligence anticipates a cause (which is an event in the future),
>> >>> >> > which results in the intelligence acting such that the effect
>> >>> >> > occurs before the cause.  Perhaps the anticipation itself is the
>> >>> >> > causal event.
>> >>> >> >
>> >>> >> > Regards,
>> >>> >> > Michael.
>> >>> >> >
>> >>> >> >
>> >>> >> > On Sun, Nov 23, 2014 at 7:17 AM, Ben Goertzel <[email protected]>
>> >>> >> > wrote:
>> >>> >> >>
>> >>> >> >> I just happened across this 2011 paper on the probabilistic
>> >>> >> >> foundation
>> >>> >> >> of causality,
>> >>> >> >>
>> >>> >> >> http://philsci-archive.pitt.edu/9729/1/Website_Version_2.pdf
>> >>> >> >>
>> >>> >> >> which seems to carefully clarify a bunch of issues that remain
>> >>> >> >> dangling in prior discussions of the topic
>> >>> >> >>
>> >>> >> >> It seems to give a good characterization of what it means for "P
>> >>> >> >> to
>> >>> >> >> appear to cause Q, based on the knowledge-base of observer O"
>> >>> >> >>
>> >>> >> >> --
>> >>> >> >> Ben Goertzel, PhD
>> >>> >> >> http://goertzel.org
>> >>> >> >>
>> >>> >> >> "The reasonable man adapts himself to the world: the
>> >>> >> >> unreasonable
>> >>> >> >> one
>> >>> >> >> persists in trying to adapt the world to himself. Therefore all
>> >>> >> >> progress depends on the unreasonable man." -- George Bernard
>> >>> >> >> Shaw
>> >>> >> >>
>> >>> >> >
>> >>> >> >
>> >>> >> >
>> >>> >> >
>> >>> >> > --
>> >>> >> > http://gulik.pbwiki.com/
>> >>> >> >
>> >>> >>
>> >>> >>
>> >>> >>
>> >>> >>
>> >>> >
>> >>> >
>> >>>
>> >>>
>> >>>
>> >>>
>> >>>
>> >>
>> >>
>> >
>> >
>> >
>>
>>
>>
>
>



-- 
Ben Goertzel, PhD
http://goertzel.org

"The reasonable man adapts himself to the world: the unreasonable one
persists in trying to adapt the world to himself. Therefore all
progress depends on the unreasonable man." -- George Bernard Shaw


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
