BTW to interject:

I noticed that this guy, Downarowicz, has a pretty good grasp on entropy.
I'm not sure who sent me the link to the paper, but it's quite rewarding:

For Entropy definitions:
http://prac.im.pwr.wroc.pl/~downar/english/documents/entropy.pdf

and 

this is the good one:
http://arxiv.org/abs/1110.5201
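As a toy illustration of the state-merging point debated below (my own sketch, not code from either paper), here is how the entropy of a two-state machine behaves under an irreversible program {A -> B, B -> B} versus a reversible one {A -> B, B -> A}:

```python
import math
from collections import defaultdict

def step(dist, program):
    """Push a probability distribution over states through one transition."""
    out = defaultdict(float)
    for state, p in dist.items():
        out[program[state]] += p
    return dict(out)

def entropy(dist):
    """Shannon entropy, in bits, of a distribution over states."""
    return sum(p * math.log2(1 / p) for p in dist.values() if p > 0)

# Start with maximal uncertainty about the initial state: 1 bit.
uniform = {"A": 0.5, "B": 0.5}

merging = {"A": "B", "B": "B"}  # states merge: irreversible
swap    = {"A": "B", "B": "A"}  # states permute: reversible

print(entropy(uniform))                 # 1.0 bit before either program runs
print(entropy(step(uniform, merging)))  # 0.0 bits: entropy decreased
print(entropy(step(uniform, swap)))     # 1.0 bit: entropy constant
```

After the merging program runs you know the machine is in B but can no longer recover whether it started in A or B, which is exactly the "states can merge but not fork" claim in the quoted exchange.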

John

> -----Original Message-----
> From: Jim Bromer via AGI [mailto:[email protected]]
> Sent: Sunday, December 21, 2014 2:52 PM
> To: AGI
> Subject: Re: [agi] Re: [opencog-dev] Re: Probabilistic analysis of causality
> 
> On Fri, Dec 19, 2014 at 4:42 PM, Matt Mahoney via AGI <[email protected]>
> wrote:
> > I realize there are some programs that are reversible, for example {A
> > -> B, B -> A}. In this case, entropy remains constant.
> 
> Significantly, the 'entropy' of a computer program is relative to some
> 'frame of reference', such as whether the previous state of a variable is
> retrievable.
> 
> > A computation taking input is equivalent to a computation whose input
> > is part of its initial state.
> 
> But again, the 'entropy' is relative to some 'frame of reference', so your
> last statement must be an over-generalization. This isn't a quibble. (Or
> maybe, from another angle, it is a quibble, but it is a good one.) You are
> using a defeasible notion of logical equivalence to produce a conclusion
> about 'entropy'! It is like a perfect storm of an inappropriate use of
> logical equivalence. Even if a program that reacts to input is equivalent
> to a program with that input as part of its initial state, how could it be
> true that the entropies of the two programs are equal? You can't dismiss
> the entropy of the input as a mere representation of a particular path
> reproduced in a closed program.
> 
> But even ignoring the question of whether the frame of reference of looking
> for sources of 'entropy' might override some equivalence argument, what
> about a closed computation that searches for solutions to equations, where
> some solution searches might produce chaotic evaluations? You cannot say
> that the entropy of the solution is decreased by an excursion into chaos,
> because there may be more such excursions. For a stronger example, if the
> 'equation' itself is chaotic, then the search for a solution to some
> 'equations' may not produce reductions in 'entropy' through any iteration.
> 
> Extrapolating a concept like 'entropy' is fine as long as you make a real
> effort to examine the limitations of such extrapolations realistically.
> Jim Bromer
> 
> 
> On Fri, Dec 19, 2014 at 4:42 PM, Matt Mahoney via AGI <[email protected]>
> wrote:
> > On Fri, Dec 19, 2014 at 2:13 PM, Jim Bromer <[email protected]>
> wrote:
> >> On Thu, Dec 18, 2014 at 9:52 PM, Matt Mahoney via AGI
> <[email protected]> wrote:
> >>> On Wed, Dec 17, 2014 at 1:27 PM, Abram Demski via AGI
> <[email protected]> wrote:
> >>>> My current understanding of time is "the direction of computation".
> >>>
> >>> That is actually quite precise. The entropy of a computer can only
> >>> decrease. In a state transition diagram, states can merge but not
> >>> fork. Operations like writing a bit of memory cannot be reversed
> >>> because the previous bit was erased.
> >>
> >> That reasoning is confused. The reference to a state transition
> >> diagram is not directly related to the discussion up until that
> >> point,
> >
> > Suppose a finite state machine has 2 possible states, A and B. The
> > program is {A -> B, B -> B}. The initial state can either be A or B.
> > The next state will be B. The entropy of the computer (the number of
> > bits you need to describe the state) has gone from 1 to 0.
> >
> > If the computer is in state B, you do not know the previous state. You
> > cannot run the program backward.
> >
> > The same argument applies to Turing machines. Writing a symbol to the
> > tape is not reversible in general because the previous symbol is
> > erased.
> >
> > I realize there are some programs that are reversible, for example {A
> > -> B, B -> A}. In this case, entropy remains constant.
> >
> >> and I don't believe anyone was talking about a computer program that
> >> could not respond to input.
> >
> > A computation taking input is equivalent to a computation whose input
> > is part of its initial state.
> >
> > --
> > -- Matt Mahoney, [email protected]
> >
> >
> > -------------------------------------------
> > AGI
> > Archives: https://www.listbox.com/member/archive/303/=now
> > RSS Feed:
> > https://www.listbox.com/member/archive/rss/303/24379807-653794b5
> > Modify Your Subscription: https://www.listbox.com/member/?&;
> > Powered by Listbox: http://www.listbox.com
> 
> 



