begin quoting David Brown as of Fri, Jan 25, 2008 at 11:09:05AM -0800:
> On Fri, Jan 25, 2008 at 10:43:59AM -0800, SJS wrote:
> 
> >I think you're going to have a problem because you don't constrain what
> >"severely bounded" means. We're no longer talking Turing machines; if we
> >bound our machine to run only primitive recursive programs, we can
> >make confident judgements about the machine halting in fixed time.
> 
> Ok.  Let's make our only bound memory.  To simplify, let's have the program
> memory be separate, but finite.  The PC needs to be represented in our
> memory though (so basically the memory includes CPU registers).
> 
> There are no constraints on what the architecture is.

Better. :)

> The goal isn't to show it is possible to make a computer simple enough to
> solve halting, but to show that even with limited memory, solving halting
> is still impractical.

Yes. I'm just trying to clarify the scope a bit.

> >>  1.  I define impractical as where the solution to any non-trivial input
> >>  size requires more storage than the size of the universe or more time
> >>  than the duration of the known universe.
> >
> >Duration is meaningless without a performance limit. More bits than the
> >current estimated size of the universe is a better yardstick, but even
> >then you're making assumptions as to how the target scheme will run.
> 
> Let me clarify.  Assume a clock tick of the Planck time (~10^-44 s).
> Assuming 13.8e9 years for the age, there are about 4e61 possible clock
> ticks in the universe.  It is not "practical" to build a computer whose
> clock can run faster than the Planck time.  I don't care if your computer
> is slower, since it only makes things worse.
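As a sanity check on the arithmetic above, the tick count works out as follows (assuming a tick of 1e-44 s, the order of magnitude quoted; the measured Planck time is closer to 5.4e-44 s):

```python
# Rough tick-count check.  Assumption: one tick per 1e-44 s (the order of
# magnitude quoted above; the measured Planck time is ~5.4e-44 s).
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # ~3.156e7 s
age_of_universe_s = 13.8e9 * SECONDS_PER_YEAR  # ~4.4e17 s
ticks = age_of_universe_s / 1e-44
print(f"{ticks:.1e}")                          # about 4.4e+61
```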

We're limiting ourselves to von Neumann architecture for our machines, I
presume?  (That seems like a fair assumption, given that we're talking
about practical computability theory.)

> >>  2.  Given a processing system with 'n' bits of memory.  Each processing
> >>  step modifies zero or more of these bits.  Halting is defined as a
> >>  processing step that modifies zero bits.  The processing system has no
> >>  other state.
> >
> >Don't we also need to define input?  I would think that our bounded machine
> >could be described with a DFA operating on a symbol set of size 1 and
> >an unbounded input string (the clock). We need to keep the input separate
> >so that our program is general, yes?
> 
> Any input is part of memory, that's the point of bounding it.

Okay, no I/O with the universe, all data is present at the start of
the program.

Our tested program/machine is bounded; is our testing program/machine?
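For concreteness, the model under discussion (n bits of state, one deterministic step at a time, halting defined as a step that modifies zero bits) can be sketched in Python; `step` here is a hypothetical transition function standing in for whatever the architecture computes:

```python
def run(state: int, step, n: int, max_steps: int) -> bool:
    """Run an n-bit machine; return True if it halts (takes a step that
    changes zero bits), False if max_steps elapse first."""
    mask = (1 << n) - 1                 # keep the state to n bits
    for _ in range(max_steps):
        nxt = step(state) & mask        # one deterministic step
        if nxt == state:                # halting: zero bits modified
            return True
        state = nxt
    return False

# Hypothetical step function: count down to zero, then sit still.
print(run(state=5, step=lambda s: max(s - 1, 0), n=3, max_steps=100))  # True
```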

> >>  3.  Because there are a finite number of states, it is sufficient to
> >>  detect a repeated state to detect a non-halt.  I know of two ways of
> >>  detecting loops (but I don't know how to prove there aren't more):
> >>
> >>    a.  Track all visited states.  There are 2^n possible states, so
> >>    recording which have been visited takes up to 2^n bits.  This
> >>    requires impractical storage for all non-trivial n.
> >
> >There may be clever ways around that. 
> 
> Not if you don't know anything about what possible states and transitions
> there are.

Just because I'm not clever enough to devise a way around it doesn't
mean that it can't be done.
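Approach (a) itself is easy to sketch; the point of contention is only the storage, since the visited set can grow to 2^n states. A Python sketch, again with a hypothetical `step` function:

```python
def halts_by_tracking(state: int, step, n: int) -> bool:
    """Decide halting for an n-bit machine by recording visited states.
    Worst-case storage is 2**n states -- the impracticality in question."""
    mask = (1 << n) - 1
    seen = set()
    while state not in seen:
        seen.add(state)
        nxt = step(state) & mask
        if nxt == state:        # a step that changes zero bits: halt
            return True
        state = nxt
    return False                # revisited a state: a cycle, so no halt

print(halts_by_tracking(5, lambda s: max(s - 1, 0), n=3))   # True
print(halts_by_tracking(0, lambda s: (s + 1) & 7, n=3))     # False
```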

> >>    b.  Run two identical systems from the same starting state, but run
> >>    one two steps for each step of the first.  If they ever reach the same
> >>    state, there is a cycle, and the system will not halt.  This requires
> >>    only 2n bits of storage.  But, the longest possible cycle would go
> >>    through every state, requiring 2^n steps.  This requires impractical
> >>    time.
> >
> >Take the trivial machine that halts.  The trailing machine will have the
> >same bit-pattern as the leading machine, and flag a cycle, despite no
> >state changing ever.
> 
> The proof is not that it is possible for you to construct a machine that
> can detect some halts, but that it is impossible to construct a machine
> that detects all halts.

And there's the degenerate case for approach (b).  It needs to be
reformulated or discounted entirely; no proof by induction is needed,
as the counter-example is by construction. :)
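To make the degenerate case concrete: a halted machine is a fixed point, so the two copies coincide immediately and a naive equality test misreads the halt as a cycle. A reformulated sketch that separates the two outcomes (hypothetical `step` again) checks each copy for a fixed point before comparing them:

```python
def two_speed(state: int, step, n: int) -> bool:
    """Two-speed cycle detection on an n-bit machine: return True if it
    halts, False if it enters a cycle.  Storage is just two n-bit states,
    but the worst case takes on the order of 2**n steps."""
    mask = (1 << n) - 1
    slow = fast = state
    while True:
        nxt = step(slow) & mask
        if nxt == slow:                 # fixed point: the machine halted
            return True                 # (the degenerate case for a bare
        slow = nxt                      #  slow == fast test)
        fast = step(step(fast) & mask) & mask
        if step(fast) & mask == fast:   # fast copy reached a fixed point
            return True
        if slow == fast:                # met inside a proper cycle
            return False

print(two_speed(5, lambda s: max(s - 1, 0), n=3))   # True
print(two_speed(1, lambda s: (s % 7) + 1, n=3))     # False: cycles 1..7
```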

> >>  It is also necessary to
> >>  prove that the repeat detection of 3 is necessary instead of merely
> >>  sufficient.
> >
> >I can see why that would help, but why would that be true?
> 
> Without it, I've not proven what I intended to prove.  Otherwise, we don't
> know if there isn't some yet-undiscovered algorithm out there that solves
> the problem more efficiently.  If we can prove that all algorithms must use
> either O(2^n) space or time, the conclusion follows.

Yes, but "it has to be true for my argument to work" doesn't give me a
warm fuzzy about the assertion being true. 

Plus, doesn't sufficient trump necessary?

-- 
We will eventually test against the computer of Satan
In theory, of course, as the real one is out baitin'.
Stewart Stremler

-- 
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-lpsg
