John - Thanks for the references to the papers. I quickly skimmed the first one, and I am very interested in studying it a little more carefully.
My example of finding a solution to an equation wasn't a great example. However, as far as I can tell, the entropy of a system must be related to some task. To say that the entropy of a computer program must decrease as the program changes bit-states is to ignore that the program's relation to some task might be more significant.

I can't tell whether my other criticism, about the relevance of the (bit-state) logical equivalence argument in a general entropy argument, is sound or not. I think it is. (The question is whether the logical equivalence between a program that reacts to input and a program that declares the input as part of its initial state is reasonable in a discussion about the entropy of a computer.) The reason is again that the entropy of a system must be related to some task. Since those of us who believe that AGI is feasible hold that a computer program can learn to work with and solve new problems, the potential tasks that might be learned would be lost in the majority of individual runs of the program. That means that the potential (proportional potential) for many tasks would tend toward zero in a theoretical summation over the applications of the program.

And since you have to consider only terminating programs (because the possibility of infinite entropy in a non-terminating run may not be reduced by iteration), the halting problem is presumably relevant. And since this kind of discussion takes place among people who think about probability- and statistics-based computation, a presumption of the practicality of sampling the possible cases would make most generalizations about the entropy of a computer program untenable.

So I believe there are good reasons to be dubious about the general claim that the entropy of a computer program is reduced with each overwrite of a bit. We are all aware that logical equivalence arguments between the states of programs running on different types of computers are not valid for time differences.
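To make the disputed bit-overwrite claim concrete, here is a toy sketch of my own (the two-bit state model and the "history device" are assumptions for illustration only, not anything from the papers). It shows that an unrecorded overwrite merges machine states, losing a bit, while logging the old value keeps the count of distinguishable configurations constant:

```python
import math

# A two-bit machine state (b0, b1). Overwriting b0 with 0 merges states:
# both (0, b1) and (1, b1) map to (0, b1), so one bit of information is lost.
def step_overwrite(state):
    b0, b1 = state
    return (0, b1)

# The "external device" variant: the old value of b0 is recorded in a
# history list before the overwrite, so the combined (state, history)
# transition is invertible and no information is lost.
def step_logged(state, history):
    b0, b1 = state
    return (0, b1), history + [b0]

starts = [(a, b) for a in (0, 1) for b in (0, 1)]   # 4 states = 2 bits

plain = {step_overwrite(s) for s in starts}
logged = set()
for s in starts:
    new_state, hist = step_logged(s, [])
    logged.add((new_state, tuple(hist)))

# Distinct reachable configurations after one step:
print(len(plain), math.log2(len(plain)))    # 2 1.0  (one bit erased)
print(len(logged), math.log2(len(logged)))  # 4 2.0  (both bits preserved)
```

The point of the sketch is only that whether entropy decreases depends on what you count as the system: the bare state loses a bit per overwrite, but state-plus-history does not.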
So I am wondering if the same kind of logical equivalence argument can be ruled invalid because of relational functions that can occur in parallel branching implementations. I think the implicit timing issues are enough to make that kind of argument irrelevant, but I think the potential relations of data between branches might also rule out the notion that an equivalence argument can be made by summing over each possible individual branch.

Here is a simple case, based on the view that knowledge of the previous states of the bits of a program would undercut the presumption that each step causes a reduction in entropy. Suppose an external device were able to report the previous states of a program back to the program. Then the entropy of the program would not decrease, in spite of any presumption that it might.

Jim Bromer

On Sun, Dec 21, 2014 at 5:45 PM, John Rose via AGI <[email protected]> wrote:
> BTW to interject:
>
> I noticed here, this guy - Downarowicz - has a pretty good grasp on entropy,
> not sure who sent me this link for the paper but it's quite rewarding:
>
> For entropy definitions:
> http://prac.im.pwr.wroc.pl/~downar/english/documents/entropy.pdf
>
> and
>
> this is the good one:
> http://arxiv.org/abs/1110.5201
>
> John
>
>> -----Original Message-----
>> From: Jim Bromer via AGI [mailto:[email protected]]
>> Sent: Sunday, December 21, 2014 2:52 PM
>> To: AGI
>> Subject: Re: [agi] Re: [opencog-dev] Re: Probabilistic analysis of causality
>>
>> On Fri, Dec 19, 2014 at 4:42 PM, Matt Mahoney via AGI <[email protected]> wrote:
>> > I realize there are some programs that are reversible, for example
>> > {A -> B, B -> A}. In this case, entropy remains constant.
>>
>> Significantly, the 'entropy' of a computer (program) is relative to some
>> 'frames of reference', like whether the previous state of a variable is
>> retrievable.
>>
>> > A computation taking input is equivalent to a computation whose input
>> > is part of its initial state.
>>
>> But again, the 'entropy' is relative to some 'frames of reference', so your
>> last statement must be an over-generalization. This isn't a quibble. (Or
>> maybe, from another angle, it is a quibble, but it is a good one.) You are
>> using a defeasible notion of logical equivalence to produce a conclusion
>> about 'entropy'! It is like a perfect storm of an inappropriate use of
>> logical equivalence. Even if a program that reacts to input is equivalent
>> to a program with that input as part of an initial state, how could it be
>> true that the entropies of the programs are equivalent? You can't dismiss
>> the potential entropy of the input as a mere representation of a particular
>> path reproduced in a closed program.
>>
>> But even ignoring the question of whether the frame of reference of looking
>> for sources of 'entropy' might override some equivalence argument, what
>> about the question of a closed computation that looks for solutions to
>> equations, where some solution searches might produce chaotic evaluations?
>> You cannot say that the entropy of the solution is decreased by an excursion
>> into chaos, because there may be more such excursions. For a strong example,
>> if the 'equation' might itself be chaotic, then the search for a solution to
>> some 'equations' may not produce reductions in 'entropy' via any iteration.
>>
>> Extrapolation of a concept like 'entropy' is ok as long as you make a real
>> effort to examine the limitations of such extrapolations realistically.
>> Jim Bromer
>>
>>
>> On Fri, Dec 19, 2014 at 4:42 PM, Matt Mahoney via AGI <[email protected]> wrote:
>> > On Fri, Dec 19, 2014 at 2:13 PM, Jim Bromer <[email protected]> wrote:
>> >> On Thu, Dec 18, 2014 at 9:52 PM, Matt Mahoney via AGI <[email protected]> wrote:
>> >>> On Wed, Dec 17, 2014 at 1:27 PM, Abram Demski via AGI <[email protected]> wrote:
>> >>>> My current understanding of time is "the direction of computation".
>> >>>
>> >>> That is actually quite precise. The entropy of a computer can only
>> >>> decrease. In a state transition diagram, states can merge but not
>> >>> fork. Operations like writing a bit of memory cannot be reversed
>> >>> because the previous bit was erased.
>> >>
>> >> That reasoning is confused. The reference to a state transition
>> >> diagram is not directly related to the discussion up until that
>> >> point,
>> >
>> > Suppose a finite state machine has 2 possible states, A and B. The
>> > program is {A -> B, B -> B}. The initial state can either be A or B.
>> > The next state will be B. The entropy of the computer (the number of
>> > bits you need to describe the state) has gone from 1 to 0.
>> >
>> > If the computer is in state B, you do not know the previous state. You
>> > cannot run the program backward.
>> >
>> > The same argument applies to Turing machines. Writing a symbol to the
>> > tape is not reversible in general because the previous symbol is
>> > erased.
>> >
>> > I realize there are some programs that are reversible, for example
>> > {A -> B, B -> A}. In this case, entropy remains constant.
>> >
>> >> and I don't believe anyone was talking about a computer program that
>> >> could not respond to input.
>> >
>> > A computation taking input is equivalent to a computation whose input
>> > is part of its initial state.
>> >
>> > --
>> > -- Matt Mahoney, [email protected]
>> >
>> > -------------------------------------------
>> > AGI
>> > Archives: https://www.listbox.com/member/archive/303/=now
>> > RSS Feed: https://www.listbox.com/member/archive/rss/303/24379807-653794b5
>> > Modify Your Subscription: https://www.listbox.com/member/?&
>> > Powered by Listbox: http://www.listbox.com
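P.S. For reference, Matt's two-state example from the quoted thread can be checked mechanically. This is a minimal sketch of mine (the helper names entropy_bits and step are my own): it pushes a uniform distribution over {A, B} through one step of each program and prints the resulting Shannon entropy.

```python
import math

def entropy_bits(dist):
    """Shannon entropy (in bits) of a probability distribution over states."""
    return sum(-p * math.log2(p) for p in dist.values() if p > 0)

def step(dist, program):
    """Push a distribution over states through one transition step."""
    out = {}
    for state, p in dist.items():
        nxt = program[state]
        out[nxt] = out.get(nxt, 0.0) + p   # merging states adds probability mass
    return out

uniform = {'A': 0.5, 'B': 0.5}     # 1 bit of uncertainty about the initial state

merging = {'A': 'B', 'B': 'B'}     # Matt's {A -> B, B -> B}: states merge
swap    = {'A': 'B', 'B': 'A'}     # reversible {A -> B, B -> A}: a permutation

print(entropy_bits(step(uniform, merging)))  # 0.0 -- the merge erased one bit
print(entropy_bits(step(uniform, swap)))     # 1.0 -- entropy unchanged
```

The sketch confirms the narrow claim (merging transitions reduce state-description entropy; permutations preserve it) without settling the larger question of whether that is the relevant frame of reference.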
