Re: Quantum Probability and Decision Theory

2002-12-30 Thread Stephen Paul King
Dear Bruno,

Interleaving.

- Original Message -
From: Marchal Bruno [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Monday, December 30, 2002 8:26 AM
Subject: Re: Quantum Probability and Decision Theory


 Stephen Paul King wrote:

 There do exist strong arguments that the macroscopic state of neurons
 is not completely classical and thus some degree of QM entanglement is
 involved. But hand waving arguments aside, I would really like to
 understand how you and Bruno (and others), given the proof and explanations
 contained in these above mentioned papers and others, maintain the idea
 that any quantum computer or physical system can be simulated by a
 classical computer.

 I agree with almost all the quotations you gave in your post
 http://www.escribe.com/science/theory/m4254.html
 But they are not relevant for our issue.

[SPK]

When what is?

 Quantum computer can be emulated by classical computer (see below).

[SPK]

Please point me to some paper, other than yours, that gives something
better than a hand-waving argument for this statement.

 Quantum computer does not violate Church thesis. The set of
 quantum computable functions is the same as the set of classically
 computable functions.

[SPK]

This is one statement that I found:

http://www.netaxs.com/people/nerp/automata/church8.html
"Church's thesis. All the models of computation yet developed, and all those
that may be developed in the future, are equivalent in power. We will not
ever find a more powerful model."


Statements of this kind are found in religious dogma and are not
acceptable in mathematical and logical circles unless they have adjoining
caveats and attempted proofs. Nowhere do I find these. It reminds me of
Kronecker's statement "God made the integers; all else is the work of man."

I do not intend to be impolite here, Bruno, but please give me some
reference that discusses how it is that "the set of quantum computable
functions is the same as the set of classically computable functions."

I cannot see how this is possible given that (as Svozil et al. state in
http://tph.tuwien.ac.at/~svozil/publ/embed.htm ) "for quantum systems
representable by Hilbert spaces of dimension higher than two, there does not
exist any such valuation s: L → {0,1}" ... "there exist different quantum
propositions which cannot be distinguished by any classical truth
assignment."

If there does not exist a unique Boolean valuation for QM systems that
can be taken to exist a priori, and given that UTMs, including the UD, demand
the existence of such a valuation for their definition, how is it that you can
continue to make this statement?


 The differences are the following points 1) and 2):
 1) A classical computer cannot emulate a quantum computer in real time,
 nor 2) can a classical computer provide a pure 3-person emulation of
 some quantum *processes* like the generation of truly random
 numbers.

 But this is not relevant concerning our fundamental issue.
 Concerning 1) the only thing which matters is that the classical UD
 runs all programs, including the quantum ones.

[SPK]

Can the UD run them all simultaneously, or in some concurrent way that would
give us some way of finding solutions to the foliation of space-like
hypersurfaces problem of GR? Also, what are the physical resource
requirements of the UD? What constitutes its memory and its read/write head?
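
(A minimal sketch of the dovetailing idea, purely for concreteness: the toy
Python below is not Bruno's UD, and the program(n) family is invented for the
example. It only shows how countably many programs, some of which never halt,
can all be advanced without any one of them having to finish first.)

    # Toy dovetailer: round k starts program k and then advances every
    # already-started program by one more step.
    def program(n):
        """Toy 'program' number n: a generator yielding its successive states."""
        x = 0
        while True:              # programs may run forever; the dovetailer doesn't mind
            x += n
            yield (n, x)

    def dovetail(rounds):
        running = []
        for k in range(rounds):
            running.append(program(k))   # start a new program each round
            for p in running:            # give every started program one more step
                print(next(p))

    dovetail(4)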

 THEN, by UDA reasoning, it is
 shown that real time is a 1-person (plural) emerging notion. Even
 if the UD needs many googol years to compute each quantum step, from the
 inside 1-person point of view those delays are not observable. Cf. the
 invariance lemma in the UDA.

[SPK]

I like this part of your thesis, and my interest in it makes me ask these
very pointed questions. Additionally, I hope you would agree that there may
be an additional problem of how the Principle of Least Action of physics
comes about. Could it be considered, within your thesis, as just another
1-person aspect?


 Concerning 2): idem! We cannot generate truly random sequences, but we
 can easily (with the comp hyp!!!) generate histories in which
 most average observers will correctly believe in truly random sequences.

[SPK]

Appeals to polls and beliefs should not be made. It would be more
helpful if we considered such things as G. Chaitin's Omega when discussing
random sequences. Can the UD compute Omega?

http://www.cs.auckland.ac.nz/CDMTCS/chaitin/#PL


 It is enough for that purpose to iterate the Washington-Moscow
 self-duplication experiment. If you iterate this 64 times, most of
 the 1.85 × 10^19 versions of you will conclude (correctly, with comp) that
 they are unable to predict their next self-output (W or M) and that their
 histories, in this context, are truly random.
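
(The arithmetic and the counting argument behind that figure, as a quick
sketch, nothing more:)

    # 64 self-duplications give 2**64 branches.
    branches = 2 ** 64
    print(branches)          # 18446744073709551616, i.e. about 1.85 * 10**19
    # At most 2**(64 - k) of the 64-bit histories have a description shorter
    # than 64 - k bits, so all but a fraction 2**(-k) of the copies see a
    # history with no short description, i.e. one that looks random to them.
    for k in (1, 8, 16):
        print(k, 2.0 ** -k)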

 Now, the simple reason why the quantum is Turing-emulable is that the
 solutions of the Schroedinger or Dirac wave equation(s) are computable.
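
(A toy illustration of that claim, with everything invented for the example:
a free particle, hbar = m = 1, on a periodic 1-D grid, integrated by the
split-step Fourier method. It is ordinary floating-point arithmetic on a
classical machine, and says nothing about efficiency.)

    import numpy as np

    N, L, dt, steps = 256, 20.0, 0.01, 200
    dx = L / N
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

    psi = np.exp(-x**2) * np.exp(1j * 5 * x)        # initial Gaussian wave packet
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)     # normalize

    phase = np.exp(-1j * (k**2 / 2) * dt)           # exact free evolution in k-space
    for _ in range(steps):
        psi = np.fft.ifft(phase * np.fft.fft(psi))

    print(np.sum(np.abs(psi)**2) * dx)              # norm stays ~1.0 (unitary evolution)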

[SPK]

Where is this explained? Please, some papers that you did not write...

 If you simulate 

Re: Quantum Probability and Decision Theory

2002-12-30 Thread Stephen Paul King
Dear Jesse,

Please read the below referenced paper. It shows that QM comp *CAN*
solve an undecidable problem (relative to a classical computer). I do not
see how I misread Feynman's claim, but I do admit that the quotation that I
reproduced is insufficient to support my claim. It is not an issue of speed
that I am trying to point out; it is an issue of computational
expressiveness.
    Unless I am mistaken, UTMs of all kinds operate in the space of the
Integers; QM comps operate, at least, in the space of the Reals. ...

Kindest regards,

Stephen


- Original Message -
From: Jesse Mazer [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Monday, December 30, 2002 11:01 AM
Subject: Re: Quantum Probability and Decision Theory


snip


http://www.cs.auckland.ac.nz/~cristian/coinsQIP.pdf
 
 
 **
 1. INTRODUCTION
 
 For over fifty years the Turing machine model of computation has defined
 what it means to ''compute'' something; the foundations of the modern
 theory of computing are based on it. Computers are reading text,
 recognizing
 speech, and robots are driving themselves across Mars. Yet this
 exponential race will not produce solutions to many intractable and
 undecidable
 problems. Is there any alternative? Indeed, quantum computing
 offers one such alternative (see Ref. 7, 10, 11, 23, 35). To date,
quantum
 computing has been very successful in ''beating'' Turing machines in the
 race of solving intractable problems, with Shor and Grover algorithms
 achieving the most impressive successes; the progress in quantum hardware
 is also impressive. Is there any hope for quantum computing to challenge
 the
 Turing barrier, i.e., to solve an undecidable problem, to compute an
 uncomputable function? According to Feynman's argument (see Ref. 20, a
 paper reproduced also in Ref. 25, regarding the possibility of simulating
a
 quantum system on a (probabilistic) Turing machine) the answer is
 negative.

 This seems to say the opposite of what you are arguing. Look at the last
 two sentences--they ask if quantum computers can solve an undecidable
 problem (relative to a classical computer) or compute an uncomputable
 function, but according to Feynman the answer is negative--i.e., a quantum
 computer can *not* solve any classically-unsolvable problems. I take it you
 interpreted Feynman's negative answer to refer to the possibility of
 simulating a quantum system on a (probabilistic) Turing machine, but
 because of the way the sentences are constructed I think you misread it. I
 suspect that Feynman's argument involved showing that you *can* simulate a
 quantum system on a probabilistic Turing machine, and that he used that to
 prove that quantum computers cannot do anything fundamentally new, even if
 they may in some cases work much faster.






Re: Quantum Probability and Decision Theory

2002-12-30 Thread Jesse Mazer
Stephen Paul King wrote:



Dear Jesse,

Please read the below referenced paper. It shows that QM comp *CAN* 
solve an undecidable problem
 (relative to a classical computer).


Where does it say that?


I do not see how I misread Feynman's
claim


Again, the paper says:

"Is there any hope for quantum computing to challenge the Turing barrier,
i.e., to solve an undecidable problem, to compute an uncomputable function?
According to Feynman's argument ... the answer is negative."

That seems pretty clear to me--if the answer is negative, that means there
is *not* any hope for quantum computing to challenge the Turing barrier.
Do you understand "negative" to mean something different?

Jesse

_
MSN 8 limited-time offer: Join now and get 3 months FREE*. 
http://join.msn.com/?page=dept/dialupxAPID=42PS=47575PI=7324DI=7474SU= 
http://www.hotmail.msn.com/cgi-bin/getmsgHL=1216hotmailtaglines_newmsn8ishere_3mf



Re: Quantum Probability and Decision Theory

2002-12-30 Thread Joao Leao
There are two sides to this question that may be clouding the
argument and may suggest a change of thread. Here go my 2 cents:

1) Yes, indeed, there is no hope that a Quantum Computer _as we
understand it today_ (and I underscore this last point) is likely to
violate Turing's Halting Theorem, for example, and that is
a stronger statement than any reference to the Church-Turing
thesis, which, after all, is only a heuristic presumption...

2) But it is also a fact that Quantum Computers, in the Benioff-Feynman-
Deutsch model, are constructed with the Turing Machine model of computation
in mind, and that may not be the most general one conceivable. Even
before QC, people proposed other models which would not be bound
by the limitations of the Turing model (see Boolos & Jeffrey's "Computability
and Logic" for an example of one such avatar, called the Zeus machine).
Granted, these are not physical models, but neither is the Turing Machine!
All physical computers are finite automata of some sort, as everyone knows...

Now the 69 cent question is: can Quantum Mechanics provide us with a
trans-Turing model of computation with some hope of physical realization?

Now, I am only paying my 2 cents of wisdom so don't count on my answering
this one

Cordially,

-Joao Leao

P.S. - Happy New Year Everybody on Everything...


Jesse Mazer wrote:

 Stephen Paul King wrote:

 Also, any quantum computer or physical system can be simulated by a
 classical computer.
 
 [SPK]
 
 Bruno has made similar statements and I do not understand how this is
 true. I have it from multiple sources that this is not true. Do you recall
 the famous statement by Richard Feynman regarding the exponential
 slowdown
 of classical systems attempting to simulate QM systems?

 It is true that classical computers will take an exponentially long time to
 simulate quantum ones, but as far as I know it's still true that there are
 no problems that are solvable by a quantum computer that cannot also be
 solved by a classical computer (possibly much more slowly).

 Let me quote from a
 paper by Karl Svozil et al:  http://tph.tuwien.ac.at/~svozil/publ/embed.htm
 
 
 I don't see how this supports your argument either.

 --Jesse

 _
 Add photos to your e-mail with MSN 8. Get 3 months FREE*.
 
http://join.msn.com/?page=features/featuredemailxAPID=42PS=47575PI=7324DI=7474SU=
 http://www.hotmail.msn.com/cgi-bin/getmsgHL=1216hotmailtaglines_addphotos_3mf

--

Joao Pedro Leao  :::  [EMAIL PROTECTED]
Harvard-Smithsonian Center for Astrophysics
1815 Massachussetts Av. , Cambridge MA 02140
Work Phone: (617)-496-7990 extension 124
VoIP Phone: (617)-384-6679
Cell-Phone: (617)-817-1800
--
All generalizations are abusive (specially this one!)
---






Re: Quantum Probability and Decision Theory

2002-12-30 Thread Hal Finney
Stephen Paul King references:

 http://www.cs.auckland.ac.nz/~cristian/coinsQIP.pdf

whose abstract begins,

Is there any hope for quantum computer to challenge the Turing barrier,
i.e., to solve an undecidable problem, to compute an uncomputable
function?  According to Feynman's '82 argument, the answer is negative.
This paper re-opens the case: we will discuss solutions to a few simple
problems which suggest that quantum computing is theoretically capable
of computing uncomputable functions.


We discussed this article briefly in July.  Calude relies on infinite
dimensional Hilbert space with the inputs prepared in an infinite
superposition.  Without claiming to understand all the details, this
looks to me like it would require an infinite amount of work to prepare.

It is well known that under idealized classical Newtonian physics,
computations are possible that break the Turing barrier if you have
infinite precision in your inputs.  Of course, we don't live in a
Newtonian universe.  This new result has something of the same flavor,
applied to a quantum mechanical universe.  It still relies on infinities.

When a finite quantum computer can break the Turing barrier, that will
prove something.  But when your first step is to prepare an infinite
superposition, that has no applicability to the physical universe.

Hal Finney




RE: Quantum Probability and Decision Theory

2002-12-30 Thread Ben Goertzel


 When a finite quantum computer can break the Turing barrier, that will
 prove something.  But when your first step is to prepare an infinite
 superposition, that has no applicability to the physical universe.

 Hal Finney


Precisely.  Deutsch's arguments make a lot of assumptions about things being
finitely given; Calude's theory makes very different assumptions.  If you
take Calude's assumptions and replace them with finite-precision
assumptions, the non-Turing stuff goes away.

Less formally: you need to put noncomputable information into QM to get
noncomputable information out of QM.  If you don't explicitly put
noncomputable information into it, you won't get any out.

ben




Re: Quantum Probability and Decision Theory

2002-12-30 Thread Stephen Paul King
Dear Jesse,


- Original Message -
From: Jesse Mazer [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Monday, December 30, 2002 11:40 AM
Subject: Re: Quantum Probability and Decision Theory


 Stephen Paul King wrote:

 
 Dear Jesse,
 
  Please read the below referenced paper. It shows that QM comp *CAN*

 solve an undecidable problem
   (relative to a classical computer).

 Where does it say that?


[SPK]

In the abstract of http://www.cs.auckland.ac.nz/~cristian/coinsQIP.pdf

Please read this paper. It explains the basis of my claim.

Kindest regards,

Stephen

 I do not see how I misread Feynman's
 claim

 Again, the paper says:

 Is there any hope for quantum computing to challenge the Turing barrier,
 i.e., to solve an undecidable problem, to compute an uncomputable
function?
 According to Feynman's argument ... the answer is negative.

 That seems pretty clear to me--if the answer is negative, that means there
 is *not* any hope for quantum computing to challenge the Turing barrier.
 Do you understand negative to mean something different?


[SPK]

Perhaps we should consider that our dear Mr. Feynman did not fully
appreciate his own idea. ;-)

Kindest regards,

Stephen





Computing with reals instead of integers

2002-12-30 Thread Tim May

On Monday, December 30, 2002, at 09:38  AM, Hal Finney wrote:


We discussed this article briefly in July.  Calude relies on infinite
dimensional Hilbert space with the inputs prepared in an infinite
superposition.  Without claiming to understand all the details, this
looks to me like it would require an infinite amount of work to 
prepare.

It is well known that under idealized classical Newtonian physics,
computations are possible that break the Turing barrier if you have
infinite precision in your inputs.  Of course, we don't live in a
Newtonian universe.  This new result has something of the same flavor,
applied to a quantum mechanical universe.  It still relies on 
infinities.

When a finite quantum computer can break the Turing barrier, that will
prove something.  But when your first step is to prepare an infinite
superposition, that has no applicability to the physical universe.


Along these lines, the noted Berkeley mathematician Steve Smale has 
worked with others (cryptographers Blum, Shub) on what it would mean to 
have a computer work with _reals_ instead of integers. This seems close 
in spirit to much of both what Hal refers to above and to the more 
general quantum computation systems (*).

 (* where machines must be set up with very high precision to compute, 
via Shor's or Grover's algorithms, for example, even small 
results...and where large results will require precisions of many, many 
decimal places. There is hope that Zurek-type quantum error correction 
coding (QECC)  can fix this scenario where materials have to be 
positioned to, say, 50 decimal places of accuracy to compute some level 
of computation (epsilon-delta argument, of course), but some recent 
reports cast doubt on the rising energy requirements of QECC. But I am 
not (yet) as knowledgeable about quantum computing as I hope to be in 
several months, so go to the source materials if interested.)

Here is one of many readily-findable references to the work:

http://citeseer.nj.nec.com/context/5445/0

"An important example are computations over the real numbers based on 
the model of Blum-Shub-Smale. [...] In 1989 Blum, Shub, 
and Smale [14] introduced a model for computations over the real 
numbers (and other rings as well) which is now usually called a BSS 
machine. The important difference with, say, the Turing model is that 
real numbers are treated as basic entities and that arithmetic 
operations on the reals are performed in a ..."

Here's a review article by John Casti:

http://www.heise.de/tp/english/kolumnen/cas/2414/1.html

(Note that he calls the machine a BBS machine, getting it wrong. He was 
probably confusing it with the BBS (Blum-Blum-Shub) random number 
generator, which is not at all the same thing as a BSS machine. But 
Casti writes engagingly, so this is a good introduction.)



In a hand-waving sort of way, computing with real numbers cuts through 
a lot of the work to build arbitrary-precision (or length) strings of 
symbols.

But are the reals real?

A fascinating topic, perhaps too much of a digression here. Read on if 
interested.

A real number is a point on a line. Its expansion in any base is of 
course infinite. Does such a thing really exist?

David Lewis would argue that yes, lines and points and real numbers 
really do exist, but that they are inaccessible to us (in the way that 
the worlds where Germany won the second world war really exist, but 
cannot be visited or communicated with). Lewis uses a closeness metric, 
where he shows how we may approach the world of the perfect Euclidean 
line or point or space. Take any drawn line, make the pencil line 
thinner and thinner, straighten out the kinks, etc.

In the possible worlds ontology, real numbers and Euclidean lines exist 
as possible worlds, but nothing in our experience or in what we 
understand of the working of the world we are in lets us reach these 
possible worlds.

To the constructivist, or intuitionist (a la Brouwer, Heyting), this 
says that a real number is only the name we give to a process of 
construction (e.g., Dedekind cuts, the familiar epsilon-delta 
construction of the real numbers). The real number is not an actual 
thing, but a process, an algorithm. (As someone quoted within the past 
day, Kronecker's "God made the integers, all else is the work of man.")

To cut to the chase, so to speak, the reals don't actually exist in our 
world except as ideals (other possible worlds we can approach but never 
reach).

An attempt to build a machine computing with the reals would be 
interesting, if only to show how buried in the attempt would be 
algorithms and constructions which are simulating real numbers to some 
precision. (Perhaps this is where quantum computers break down, if they 
do.)
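
(A small sketch of that point: every "real" a classical machine ever touches
is an algorithm emitting better and better rationals. Newton's iteration for
the square root of 2 in exact rational arithmetic, purely as an illustration:)

    from fractions import Fraction

    def sqrt2(iters):
        x = Fraction(3, 2)               # Newton's iteration for x**2 = 2
        for _ in range(iters):
            x = (x + 2 / x) / 2
        return x

    for i in range(1, 5):
        a = sqrt2(i)
        print(i, float(a), float(abs(a * a - 2)))   # error shrinks quadratically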

But the BSS work sheds a lot of interesting light on computability, 
further illuminating the fascinating links between the theory of 
recursive functions (aka lambda calculus), cartesian closed categories, 
and the 

Re: Quantum Probability and Decision Theory

2002-12-30 Thread Stephen Paul King
Dear Hal,

You make a very good point! Thank you for actually reading Calude et
al.'s paper. ;-) But I have one nagging question: is the requirement of the
existence of an infinite superposition more or less hard to swallow than
Bruno's arithmetical truth?
 I don't see how arithmetical truth is a simpler assumption than the
existence of an infinite superposition (in Hilbert space?). It seems to me
that the former is a subset of the latter, not the other way around. Again,
I go back to Cantor's proof showing that the integers and the Reals have
distinct cardinalities and the simple question of "what breathes fire into
the equations" (or Bruno's UD)?

QM comp seems to operate in the space of the Reals (R) and a TM operates
in the space of the Integers (Z); is this correct?

We must additionally account for, at least, the illusion of time and
concurrency of events.

Kindest regards,

Stephen

- Original Message -
From: Hal Finney [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Monday, December 30, 2002 12:38 PM
Subject: Re: Quantum Probability and Decision Theory


 Stephen Paul King references:

  http://www.cs.auckland.ac.nz/~cristian/coinsQIP.pdf

 whose abstract begins,

 Is there any hope for quantum computer to challenge the Turing barrier,
 i.e., to solve an undecidable problem, to compute an uncomputable
 function?  According to Feynman's '82 argument, the answer is negative.
 This paper re-opens the case: we will discuss solutions to a few simple
 problems which suggest that quantum computing is theoretically capable
 of computing uncomputable functions.


 We discussed this article briefly in July.  Calude relies on infinite
 dimensional Hilbert space with the inputs prepared in an infinite
 superposition.  Without claiming to understand all the details, this
 looks to me like it would require an infinite amount of work to prepare.

 It is well known that under idealized classical Newtonian physics,
 computations are possible that break the Turing barrier if you have
 infinite precision in your inputs.  Of course, we don't live in a
 Newtonian universe.  This new result has something of the same flavor,
 applied to a quantum mechanical universe.  It still relies on infinities.

 When a finite quantum computer can break the Turing barrier, that will
 prove something.  But when your first step is to prepare an infinite
 superposition, that has no applicability to the physical universe.

 Hal Finney







Re: Quantum Probability and Decision Theory

2002-12-30 Thread Joao Leao
There go 7 cents out of my 60!...

The case indeed is that if you build a quantum computer by emulating
a Turing-Universal Machine you are a priori circumscribing its
class of algorithms. That is only natural if that is the largest class of
computable problems you think is worthwhile considering. But it
isn't necessarily the only one. This approach surfaces here and there
in the literature. See for example:

http://arXiv.org/abs/quant-ph/0205093

Another point worth making is that it seems unlikely that recourse
to the infinite superposability of quantum states is going to be of any
help in this circumstance. It may be more profitable to look to
entanglement (which, incidentally, is the truly novel feature that QC brings
to the realm of computation) as the road to a trans-Turing class of
computations.

As to your reference to Penrose, Ben, I should probably add that
his much maligned ideas concerning the possibility of using Quantum
Gravity as a basis for understanding the psychology of mathematical
invention are perhaps worth a second look now that we are learning a
good deal more about quantum information in Black Holes etc...

-Joao Leao


Ben Goertzel wrote:

  When a finite quantum computer can break the Turing barrier, that will
  prove something.  But when your first step is to prepare an infinite
  superposition, that has no applicability to the physical universe.
 
  Hal Finney
 

 Precisely.  Deutsch's arguments make a lot of assumptions about things being
 finitely given; Calude's theory makes very different assumptions.  If you
 take Calude's assumptions and replace them with finite-precision
 assumptions, the non-Turing stuff goes away.

 Less formally: you need to put noncomputable information into QM to get
 noncomputable information out of QM.  If you don't explicitly put
 noncomputable information into it, you won't get any out.

 ben

--

Joao Pedro Leao  :::  [EMAIL PROTECTED]
Harvard-Smithsonian Center for Astrophysics
1815 Massachussetts Av. , Cambridge MA 02140
Work Phone: (617)-496-7990 extension 124
VoIP Phone: (617)-384-6679
Cell-Phone: (617)-817-1800
--
All generalizations are abusive (specially this one!)
---






Re: Quantum Probability and Decision Theory

2002-12-30 Thread Jesse Mazer
Stephen Paul King wrote:


Dear Jesse,

 Please read the below referenced paper. It shows that QM comp *CAN*



solve an undecidable problem
  (relative to a classical computer).

Where does it say that?



[SPK]

   In the abstract of http://www.cs.auckland.ac.nz/~cristian/coinsQIP.pdf


Please read this paper. It explains the basis of my claim.


Apologies, I hadn't even read the entire abstract so I didn't notice that 
they went on to argue that Feynman's conclusion was wrong. Still, I think 
that their proposal is something different from the usual notion of a 
quantum computer, since I was under the impression that people like 
Feynman and Deutsch had already shown definitively that a quantum computer 
cannot solve any classically-unsolvable problems. I'll leave it to others 
who understand this paper better than I to discuss the merits of this new 
proposal.

Jesse


_
MSN 8: advanced junk mail protection and 3 months FREE*. 
http://join.msn.com/?page=features/junkmailxAPID=42PS=47575PI=7324DI=7474SU= 
http://www.hotmail.msn.com/cgi-bin/getmsgHL=1216hotmailtaglines_advancedjmf_3mf



RE: Quantum Probability and Decision Theory

2002-12-30 Thread Jesse Mazer
Ben Goertzel wrote:



Jesse & Stephen:

About quantum computing getting around the limitations of Turing machines:
you don't have to cite Feynman, this matter was settled fairly clearly in
David Deutsch's classic work on quantum computation.  He showed that the
only quantum-computable functions are Turing-computable functions.  Quantum
computers may be able to compute some things *faster* than classical
computers (in the average case), but this is a different story.

In his book Shadows of the Mind, Penrose reacts to this result with
disappointment, and with an expression of hope that quantum gravity
computers will be able to compute non-Turing-computable functions.  But so
far neither he nor anyone else has offered much more than hope and
speculation in this direction.  My own opinion is that this is probably a
pipe dream, and we must make do with Turing-computable functions, but I'm 
of
course open to new ideas and new information...

I had a science-fictional idea about a way to build an oracle machine after 
reading Hans Moravec's article on "Time Travel and Computing" here:

http://www.frc.ri.cmu.edu/users/hpm/project.archive/general.articles/1991/TempComp.html

As I understood it, the basic idea here was to use the fact that history 
must work out consistently to get a machine that could solve problems much 
faster than a Turing machine. For example, for any problem that requires 
exponential time to reach a solution but for which possible solutions can be 
checked in polynomial time, you could have the machine pick a possible 
solution at random, then check to see if the solution actually works, then 
if it *doesn't* work it sends back a sort of override command that changes 
the original guess, which would create an inconsistency. The only consistent 
history in this case would be one where the machine's first guess happened 
to be correct, so if history is indeed constrained to be consistent, you are 
guaranteed to get the correct answer on the first guess.
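
(A purely classical toy of that guess-and-check structure, for concreteness:
it merely simulates "discard the inconsistent histories" by rejection, so
there is of course no speedup, and the subset-sum instance is invented for
the example.)

    import random

    def consistent_history(weights, target):
        while True:                                   # rejected histories "never happened"
            guess = [random.randint(0, 1) for _ in weights]   # the machine's one guess
            total = sum(w for w, bit in zip(weights, guess) if bit)
            if total == target:                       # polynomial-time consistency check
                return guess

    print(consistent_history([3, 7, 12, 5, 9], 24))   # e.g. picks 3 + 12 + 9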

Moravec says that although this would allow you to solve certain problems 
faster than a classical computer or a traditional quantum computer, it would 
not actually allow you to solve any uncomputable problems. But suppose we 
combine the idea of time-travel computers with the idea that intelligent 
life will find a way to increase the total amount of computing power 
available without bound as time goes to infinity (as in Freeman Dyson's 
proposal) or as time approaches the moment of the big crunch (as in Frank 
Tipler's proposal). Then it seems you could use such a machine to solve the 
halting problem--the machine could originally guess randomly whether to 
display the output "does not halt" or "halts after n steps" (where n is any 
whole number), and then you could run the computation indefinitely, and if 
it ever does halt and the number of steps is different from the original 
guess (or if it passes the number of steps in the original guess without 
halting), the machine is programmed to send back a message which overrides 
the original guess, which would create a contradiction. Thus, for history to 
work out consistently, the original guess must have been the correct one.

It seems to me that one could use this idea of time-travel-computers in a 
universe where computing power increases without bound not just to create 
first-level oracle machines which tell you if the nth Turing machine halts 
or not, but also arbitrary higher-level oracle machines which tell you if 
the nth m-level oracle machine halts or not, where m is any countable 
ordinal (Penrose discusses the notion of higher-level oracle machines in 
'Shadows of the Mind'). I'm not sure if there are any conceivable types of 
machines consisting of a finite set of well-defined operations that could 
not be simulated in such a universe.

Jesse

_
Add photos to your e-mail with MSN 8. Get 3 months FREE*. 
http://join.msn.com/?page=features/featuredemailxAPID=42PS=47575PI=7324DI=7474SU= 
http://www.hotmail.msn.com/cgi-bin/getmsgHL=1216hotmailtaglines_addphotos_3mf



RE: Quantum Probability and Decision Theory

2002-12-30 Thread Hal Finney
Jesse Mazer writes:
 I had a science-fictional idea about a way to build an oracle machine after 
 reading Hans Moravec's article on Time Travel and Computing here:

 
http://www.frc.ri.cmu.edu/users/hpm/project.archive/general.articles/1991/TempComp.html

 As I understood it, the basic idea here was to use the fact that history 
 must work out consistently to get a machine that could solve problems much 
 faster than a Turing machine. For example, for any problem that requires 
 exponential time to reach a solution but for which possible solutions can be 
 checked in polynomial time, you could have the machine pick a possible 
 solution at random, then check to see if the solution actually works, then 
 if it *doesn't* work it sends back a sort of override command that changes 
 the original guess, which would create an inconsistency. The only consistent 
 history in this case would be one where the machine's first guess happened 
 to be correct, so if history is indeed constrained to be consistent, you are 
 guaranteed to get the correct answer on the first guess.

One correction, there are no known problems which take exponential time
but which can be checked in polynomial time.  If such a problem could be
found it would prove that P != NP, one of the greatest unsolved problems
in computability theory.

Since Moravec's machine uses time travel, or at least information transfer
into the past, it is obviously extremely speculative.  But given that,
I think your idea does make sense in theory, that if there were an
infinite amount of computing power possible in the future, and we sent
a signal into the past if a given program ever halted, then the absence
of such a signal would imply that the program did not halt.

Of course any kind of practical engineering that involves infinities
is even more speculative than time travel.  You have to assume that a
signal can be sent an infinite distance through time, that the future
calculation goes on forever and never ends, and so on.  It seems difficult
to imagine such a degree of consistency and reliability in the imperfect
universe where we live.

Hal Finney




Many Worlds and Oracles

2002-12-30 Thread Tim May

On Monday, December 30, 2002, at 11:57  AM, Jesse Mazer wrote:

As I understood it, the basic idea here was to use the fact that 
history must work out consistently to get a machine that could solve 
problems much faster than a Turing machine. For example, for any 
problem that requires exponential time to reach a solution but for 
which possible solutions can be checked in polynomial time, you could 
have the machine pick a possible solution at random, then check to see 
if the solution actually works, then if it *doesn't* work it sends 
back a sort of override command that changes the original guess, which 
would create an inconsistency.

Or just kills you and/or your world.

This idea predates Max Tegmark by quite a while...I give Moravec the 
credit.

My own version of an oracle was done for an article I sent out in 1994. 
Included below.


RSA Broken By The Russians?

Kolmogorov Cryptography System Possibly Cracked
1 Apr 1994


   MOSCOW (AP) -- At a press conference held minutes ago in a crowded 
hall,
   Russian mathematicians announced that a breakthrough had been made
   nearly a decade ago in the arcane branch of mathematics known as
   cryptography, the science of making messages that are unreadable to
   others.

   Leonid Vladwylski, Director of the prestigious Moscow Academy of
   Sciences, called the press conference yesterday, after rumors began
   circulating that noted Russian-American reporter John Markoff was in
   Russia to interview academicians at the previously secret city of
   Soviet cryptographers, Kryptogorodok.  The existence of 
Kryptogorodok,
   sister city to Akademogorodok, Magnetogorsk, and to the rocket cities
   of Kazhakstan, had been shrouded in secrecy since its establishment 
in
   1954 by Chief of Secret Police L. Beria.  Its first scientific
   director, A. Kolmogorov, developed in 1960 what is called in the West
   "public key cryptography."  The existence of Kryptogorodok was unknown
   to the West until 1991, when Stephen Wolfram disclosed its existence.


   American cryptographers initially scoffed at the rumors that the
   Russians had developed public-key cryptography as early as 1960, some
   15 years prior to the first American discovery.  After interviews 
last
   year at Kryptogorodok, noted American cryptographers Professor D.
   Denning and D. Bowdark admitted that it did seem to be confirmed.

   Professor Denning was quoted at the time saying that she did not think
   this meant the Russians could actually break the Kolmogorov system,
   known in the West as RSA, because she had spent more than a full weekend
   trying to do this and had not succeeded.  "Believe me, RSA is still
   unbreakable," she said in her evaluation report.

   Russia's top mathematicians set out to break Kolmogorov's new coding
   system.  This required them to determine that P = NP (see 
accompanying
   article).  Details are to be published next month in the journal
   Doklady.Krypto, but a few details are emerging.

   The Kolmogorov system is broken by computing the prime numbers which
   form what is called the modulus.  This is done by randomly guessing 
the
   constituent primes and then detonating all of the stockpiled nuclear
   weapons in the former Soviet Union for each wrong guess.  In the 
Many
   Worlds Interpretation of quantum mechanics, invented in 1949 by Lev
   Landau (and later, independently by Everett and Wheeler in the U.S.),
   all possible outcomes of a quantum experiment are realized.

   As Academician Leonid Vladwylski explained, "In all the universes in
   which we guessed the wrong factors, we were destroyed completely.  But
   since we are obviously here, talking to you at this press conference, in
   this universe we have an unbroken record of successfully factoring even
   the largest of imaginable numbers.  Since we are so optimistic about
   this method, we say the computation runs in 'Nondeterministic Pollyanna
   Time.'  Allow me to demonstrate..."

   [Press Conference will be continued if the experiment is a success.]

   MOSCOW (AP), ITAR-Tass, 1 April 1994


Appendix

First, it was Stephen Wolfram's actual suggestion, a couple of years ago
after the USSR imploded, that we try to recruit mathematicians and
programmers from what he surmised must exist: a secret city of Soviet
cryptographers.  It probably exists.  We did it at Los Alamos, they did 
it
with their rocket scientists and others (Akademogorodok exists), so why 
not
put their version of NSA a bit off the beaten track?  Note that our own 
NSA
is within a stone's throw of the Baltimore-Washington Parkway.  I 
wouldn't
be surprised to learn that their experts were ensconced somewhere in the
Urals.

I tried to acknowledge Steve with my comments.  By the way, so far as I
know, no word has come out on whether he was right in this speculation.
(Maybe some of the Russians he does in fact have working at Wolfram are
these folks?  Naw...)

Second, Kolmogorov did basic work on 

RE: Quantum Probability and Decision Theory

2002-12-30 Thread Jesse Mazer
Hal Finney wrote:


One correction, there are no known problems which take exponential time
but which can be checked in polynomial time.  If such a problem could be
found it would prove that P != NP, one of the greatest unsolved problems
in computability theory.


Whoops, I've heard of the P=NP problem but I guess I was confused about what 
it meant. But there are some problems where candidate solutions can be 
checked much faster than new solutions can be generated, no? If you want to 
know whether a number can be factorized it's easy to check candidate 
factors, for example, although if the answer is that it cannot be factorized 
because the number is prime I guess there'd be no fast way to check if that 
answer is correct.
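
(For concreteness, the asymmetry being pointed at, in a toy sketch with
arbitrary numbers: checking a proposed factor is one division, while finding
one naively by trial division is a search that grows exponentially in the
number of digits.)

    def check_factor(n, d):
        return 1 < d < n and n % d == 0

    def find_factor(n):                       # naive trial division, illustrative only
        return next((d for d in range(2, n) if n % d == 0), None)

    n = 3 * 104729                            # 104729 is prime
    print(check_factor(n, 104729))            # True, with a single division
    print(find_factor(n))                     # 3, but only after a search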


Since Moravec's machine uses time travel, or at least information transfer
into the past, it is obviously extremely speculative.  But given that,
I think your idea does make sense in theory, that if there were an
infinite amount of computing power possible in the future, and we sent
a signal into the past if a given program ever halted, then the absence
of such a signal would imply that the program did not halt.

Of course any kind of practical engineering that involves infinities
is even more speculative than time travel.  You have to assume that a
signal can be sent an infinite distance through time, that the future
calculation goes on forever and never ends, and so on.  It seems difficult
to imagine such a degree of consistency and reliability in the imperfect
universe where we live.


But whatever the degree of risk that the machine will malfunction over any 
set amount of time, it seems to me that one could always make that risk 
smaller and smaller over time by building more and more backup machines, so 
that in the limit the accumulated probability of a malfunction could still 
be kept small. I haven't thought this through too carefully though.

Anyway, however far-fetched the proposal is, if it could work even in 
principle it would be relevant to the question of whether quantum gravity 
allows uncomputable functions to be realized. General relativity does have 
solutions in which backwards time travel is possible, although I don't know 
how likely it is that this will remain true in a theory of quantum gravity 
(is it known if this is true in string theory?) But if it did, and if the 
theory also did not in principle forbid computing power to increase without 
limit (perhaps it would forbid it--for example, I'm not sure if Tipler's 
proposal would work if spacetime were quantized), then this would suggest 
the laws of physics are uncomputable, even if such a scheme is never 
actually realized in our universe.

This is also somewhat relevant to theories of everything since we might 
want to ask if somewhere in the set of all possible universes there exists 
one where time travel is possible and computing power increases without 
bound. If the answer is yes, that might suggest that any TOE based on all 
possible computations is too small to accommodate a really general notion of 
all possible universes. But it might be possible to arrange things so that 
one would experience living in such a universe from a first-person POV while 
your TOE is still based on some notion of all possible computations (as with 
the universal dovetailer)--one could start out computing different histories 
in which different possible messages from the future are received, and 
then continually spit out copies of the histories where no inconsistency has 
yet appeared while not doing the same for histories where an inconsistency 
does crop up, so that in the limit as time goes to infinity the first-person 
probability of finding oneself in an inconsistent history becomes 
vanishingly small compared to the probability of finding oneself in a 
consistent one.

Jesse

_
MSN 8 with e-mail virus protection service: 3 months FREE*. 
http://join.msn.com/?page=features/virusxAPID=42PS=47575PI=7324DI=7474SU= 
http://www.hotmail.msn.com/cgi-bin/getmsgHL=1216hotmailtaglines_eliminateviruses_3mf



Many Worlds Version of Fermi Paradox

2002-12-30 Thread Tim May
On Monday, December 30, 2002, at 01:18  PM, Jesse Mazer wrote:


Hal Finney wrote:


One correction, there are no known problems which take exponential 
time
but which can be checked in polynomial time.  If such a problem could 
be
found it would prove that P != NP, one of the greatest unsolved 
problems
in computability theory.

Whoops, I've heard of the P=NP problem but I guess I was confused 
about what it meant. But there are some problems where candidate 
solutions can be checked much faster than new solutions can be 
generated, no? If you want to know whether a number can be factorized 
it's easy to check candidate factors, for example, although if the 
answer is that it cannot be factorized because the number is prime I 
guess there'd be no fast way to check if that answer is correct.

Factoring is not known to be NP-complete (the hardest class of NP 
problems... solve one in P time and you've solved them all!).

The example I favor is the Hamiltonian cycle/circuit problem: find a 
path through a set of linked nodes (cities) which passes through each 
node once and only once. All of the known solutions to an arbitrary 
Hamiltonian cycle problem are exponential in time (in number of nodes). 
For example, for 5 cities there are at most 120 possible paths, so this 
is an easy one. But for 50 cities there are as many as 49!/2 possible 
paths (how many, exactly, depends on the links between the cities, with 
not every city having all possible links to other cities). For a mere 
100 cities, the number of routes to consider is larger than the number 
of particles we believe to be in the universe.
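
(The counting behind those figures, as a quick sketch:)

    from math import factorial

    # On n fully linked cities there are at most (n-1)!/2 distinct undirected
    # Hamiltonian cycles; counting each starting city and direction separately
    # gives n! routes, i.e. the 120 quoted above for 5 cities.
    for n in (5, 50, 100):
        print(n, factorial(n - 1) // 2)
    # n = 100 gives about 4.7e155, far more than the ~1e80 particles usually
    # quoted for the observable universe.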

However, saying "known solutions" is not the same thing as "we have 
proved that it takes exponential time." For all we know, now, in 2002, 
there are solutions not requiring exponential time (in # of cities).

This is also somewhat relevant to theories of everything since we 
might want to ask if somewhere in the set of all possible universes 
there exists one where time travel is possible and computing power 
increases without bound. If the answer is yes, that might suggest that 
any TOE based on all possible computations is too small to 
accomodate a really general notion of all possible universes.

And this general line of reasoning leads to a Many Worlds Version of 
the Fermi Paradox: Why aren't they here?

The reason I lean toward the "shut up and calculate" or "for all 
practical purposes" interpretation of quantum mechanics is embodied in 
the above argument.

IF the MWI universe branchings are at all communicatable-with, that is, 
at least _some_ of those universes would have very, very large amounts 
of power, computer power, numbers of people, etc. And some of them, if 
it were possible, would have communicated with us, colonized us, 
visited us, etc.

This is a variant of the Fermi Paradox raised to a very high power.

My conclusion is that the worlds of the MWI are not much different from 
Lewis' worlds with unicorns--possibly extant, but unreachable, and 
hence, operationally, no different from a single universe model.

(I don't believe, necessarily, in certain forms of the Copenhagen 
Interpretation, especially anything about signals propagating 
instantaneously, just the "quantum mechanics is about measurables" 
ground truth of what we see, what has never failed us, what the 
mathematics tells us and what is experimentally verified. Whether there 
really are (in the modal realism sense of Lewis) other worlds is 
neither here nor there. Naturally, I would be thrilled to see evidence, 
or to conclude myself from deeper principles, that other worlds have 
more than linguistic existence.)


--Tim May



Re: Quantum Probability and Decision Theory

2002-12-30 Thread Brent Meeker
On 30-Dec-02, you wrote:

 Dear Stephen,
...
 [Bruno] It is perhaps up to you to show me a quantum computable function
 not being classically computable. But if you succeed you will give me
 something like a unitary transformation, and then I will show you
 how to write a classical program emulating this unitary
 transformation. See Pour-El and Richards' book on how to conceive
 differential equations with non-computable solutions (but still
 locally generable by the UD, though); but anyway, linear quantum
 waves are classically emulable. It is a tedious but conceptually not
 so hard exercise.
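
(A toy illustration of the kind of emulation described above, with the
obvious caveat that it is only a 2-qubit example and says nothing about
efficiency: apply a Hadamard-then-CNOT circuit to a state vector using plain
matrix algebra, basis order |00>, |01>, |10>, |11>.)

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I2 = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    U = CNOT @ np.kron(H, I2)           # the whole circuit as one 4x4 unitary matrix
    state = np.zeros(4); state[0] = 1   # |00>
    print(U @ state)                    # [0.707..., 0, 0, 0.707...]: a Bell state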
 
 
 
I cannot see how this is possible given that (as Svozil et al.
 state in http://tph.tuwien.ac.at/~svozil/publ/embed.htm ) "for
 quantum systems representable by Hilbert spaces of dimension higher
 than two, there does not exist any such valuation s: L → {0,1}" ...
 "there exist different quantum propositions which cannot be
 distinguished by any classical truth assignment."

This is the Kochen-Specker theorem.  It depends on the existence of a
Hermitian operator to implement any function of variables.  You
should be aware that this problem is avoided by Bohm's interpretation
of quantum mechanics - so it is not inherent in the physics.  See
http://www.math.rutgers.edu/~oldstein/quote.html

Brent Meeker
There is a theory which states that if ever anybody discovers
exactly what the Universe is for and why it is here, it will
instantly disappear and be replaced by something even more
bizarre and inexplicable. There is another theory which states
that this has already happened.
-- Douglas Adams (Hitchhiker's Guide to the Galaxy)




Re: Quantum Probability and Decision Theory

2002-12-30 Thread Hans Moravec
Hal Finney:

there are no known problems which take
exponential time but which can be checked
in polynomial time.  If such a problem could
be found it would prove that P != NP ...


Communications glitch here.  The definition
of NP is problems that can be solved in
polynomial time on a nondeterministic
machine, that is one that can test
simultaneously all candidate solutions
(effectively by creating exponentially
many processes for testing all possible
combinations of undetermined variables,
each individual combination taking polynomial
time to check)

A favorite example is the traveling
salesman problem (TSP), stated as "There is
a cycle of length no greater than L that
passes through all N nodes of this graph
exactly once."  (True or False)

There are exponentially (in N) many cycles,
so determining if any one of them has
length less than L would seem to require
exponential time (barring some yet-
undiscovered P=NP trick)

But if an exponential solution finder
also returns, when it answers True, a
cycle it found that had length <= L, that
answer can be checked in linear time: just
add up the arc distances.
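
(That linear-time check, sketched on an invented 4-city instance:)

    # Verifying a claimed tour: walk the cycle once and add up the arc lengths.
    def tour_ok(dist, tour, L):
        n = len(dist)
        if sorted(tour) != list(range(n)):            # visits every node exactly once
            return False
        length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        return length <= L

    dist = [[0, 2, 9, 10],
            [2, 0, 6, 4],
            [9, 6, 0, 3],
            [10, 4, 3, 0]]
    print(tour_ok(dist, [0, 1, 3, 2], 25))            # 2 + 4 + 3 + 9 = 18 <= 25 -> True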

It is conceivable that an efficient TSP
solver could correctly return Yes for a
given L without being able to identify a
winning cycle.

Checking that answer would then be no easier
than solving the original problem.

By the way, it is known that factoring into
primes is easier than the TSP.  The discovery
of a classical polynomial time algorithm for
factoring would cause much less shock than
one for the TSP (which is NP-complete,
the hardest class of NP problems.  A polynomial
solution for any NP-complete problem can
be mapped into a polynomial solution for all
NP-complete problems, and thus all NP problems,
and factoring).




Re: Quantum Probability and Decision Theory

2002-12-30 Thread Brent Meeker
On 31-Dec-02, Hal Finney wrote:
 
 One correction, there are no known problems which take exponential
 time but which can be checked in polynomial time. If such a problem
 could be found it would prove that P != NP, one of the greatest
 unsolved problems in computability theory.

What about Hamiltonian circuits or factoring an integer or roots of a
Diophantine equation?

Brent Meeker
Just when I thought things couldn't get worse, I installed
Windows XP.
  --- DaveP




Re: Quantum Probability and Decision Theory

2002-12-30 Thread Hal Finney
Brent Meeker wrote:
 On 31-Dec-02, Hal Finney wrote:
  One correction, there are no known problems which take exponential
  time but which can be checked in polynomial time. If such a problem
  could be found it would prove that P != NP, one of the greatest
  unsolved problems in computability theory.

 What about Hamiltonian circuits or factoring an integer or roots of a
 Diophantine equation?

I don't believe any of those are known to take exponential time.  For all
we know a polynomial time solution may yet be found for all of these.

Hal




Re: Quantum Probability and Decision Theory

2002-12-30 Thread Hal Finney
Hans Moravec writes:

 Hal Finney:
  there are no known problems which take
  exponential time but which can be checked
  in polynomial time.  If such a problem could
  be found it would prove that P != NP ...

 Communications glitch here.  The definition
 of NP is problems that can be solved in
 polynomial time on a nondeterministic
 machine, that is one that can test
 simultaneously all candidate solutions
 (effectively by creating exponentially
 many processes for testing all possible
 combinations of undetermined variables,
 each individual combination taking polynomial
 time to check)

I'm not sure if you are disagreeing with either of my statements above,
that (1) there are no known problems which take exponential time but
which can be checked in polynomial time, or that (2) if such a problem
could be found it would prove that P != NP.

 By the way, it is known that factoring into
 primes is easier than the TSP.  The discovery
 of a classical polynomial time algorithm for
 factoring would cause much less shock than
 one for the TSP (which is NP-complete,
 the hardest class of NP problems.

As I understand the state of play, factoring into primes is not known
to be NP-complete, but the contrary is not known, either.  Factoring is
strongly suspected not to be NP-complete but that has not been proven.
So it is still possible that factoring into primes is just as hard as
the TSP, although it is thought to be unlikely (it would imply that
NP = coNP).

Hal Finney




Re: Quantum Probability and Decision Theory

2002-12-30 Thread Tim May

On Monday, December 30, 2002, at 03:46  AM, Brent Meeker wrote:


On 31-Dec-02, Hal Finney wrote:


One correction, there are no known problems which take exponential
time but which can be checked in polynomial time. If such a problem
could be found it would prove that P != NP, one of the greatest
unsolved problems in computability theory.


What about Hamiltonian circuits or factoring an integer or roots of a
Diophantine equation?


Hal will probably answer. I initially had the same reaction you had 
(except only about Hamiltonian cycles and roots of Diophantines, but 
not factoring, as factoring is not known to be NP-complete).

But upon rereading what Hal wrote, and what I had already drafted a 
nearly complete reply to, I saw that he was making the subtle point 
that there are no known problems which take exponential time.

All of the NP-complete problems (Hamiltonian cycle, Diophantine, etc.) 
currently only have exponential-time solutions. But there is no 
guarantee that a polynomial solution (a deterministic one, that is, not 
the result of a guess or an oracle) will not be found sometime in 
the future.

Proving that there are problems which can only be solved in 
exponential time but which can be checked in polynomial time is subtly 
different from saying that all problems in the class NP-complete admit 
no polynomial time solutions.

(I'm trying to avoid using shorthand like P and NP and Exptime, as a 
lot of confusion enters when shorthand gets misinterpreted.)


--Tim May



Re: Many Worlds Version of Fermi Paradox

2002-12-30 Thread Hal Finney
Tim May writes:
 IF the MWI universe branchings are at all communicatable-with, that is, 
 at least _some_ of those universes would have very, very large amounts 
 of power, computer power, numbers of people, etc. And some of them, if 
 it were possible, would have communicated with us, colonized us, 
 visited us, etc.

 This is a variant of the Fermi Paradox raised to a very high power.

A similar idea has been advanced about time travel.  If it will someday
be possible, why aren't we being visited from the future?

However both proposals could be patched up against this objection by
postulating that we have to build the other end of the communication
link.  Time travel will be possible but you won't be able to travel
back to before you built the time machine.  Inter-world travel will
be possible but the people in both worlds have to cooperate to build a
common quantum system, perhaps implying that they have to begin building
the system before their worlds split.

In this formulation (which I believe arises quite naturally in the general
relativity based approaches to time travel), the only way that visitors
could be here now would be if there were some natural phenomenon which
happened to provide the necessary conditions, or if there were some other
civilization in our local universe who had built the required machinery.

Hal Finney




Re: Quantum Probability and Decision Theory

2002-12-30 Thread Hans Moravec
Hal Finney:

I'm not sure if you are disagreeing with either of my statements above,
that (1) there are no known problems which take exponential time but
which can be checked in polynomial time, or that (2) if such a problem
could be found it would prove that P != NP.


Ah, I see the communications glitch was at my end!

You were being precise: NP-complete is not
known (for sure) to be exponentially hard,
so NP-complete problems are not counterexamples
to statement (1).

But maybe (2) doesn't follow.
There must be problems where checking some
negative cases (in the TSP sense) takes more
than polynomial time (maybe doesn't terminate),
but positive answers can be checked in polynomial
time.  Those would fit the criteria, but not
be in NP. (Trying to make one up ...)


As I understand the state of play, factoring into primes is not known
to be NP-complete, but the contrary is not known, either.  Factoring is
strongly suspected not to be NP-complete but that has not been proven.
So it is still possible that factoring into primes is just as hard as
the TSP, although it is thought to be unlikely (it would imply that
NP = coNP).


Right again, though apparently opinions differ about the unknowns:

http://www.math.okstate.edu/~wrightd/crypt/crypt-intro/node23.html
 "... It is suspected but not yet known that factoring is NP-complete."




Re: Quantum Probability and Decision Theory

2002-12-30 Thread Hans Moravec
http://www.math.okstate.edu/~wrightd/crypt/crypt-intro/node23.html
 "... It is suspected but not yet known that factoring is NP-complete."



Of course, if factoring were to be shown NP-complete
and quantum computers could be built to run Shor's
factoring algorithm in polynomial time, then quantum
computers could solve all NP-complete problems in
polynomial time.  Big advance for quantum computation.





Re: Quantum Probability and Decision Theory

2002-12-30 Thread Brent Meeker
On 31-Dec-02, you wrote:

 Hans Moravec writes:
 
 Hal Finney:
 there are no known problems which take
 exponential time but which can be checked
 in polynomial time.  If such a problem could
 be found it would prove that P != NP ...

OK, you mean that *provably* take exponential time.

...
 
 As I understand the state of play, factoring into primes is not
 known to be NP-complete, but the contrary is not known, either.
 Factoring is strongly suspected not to be NP-complete but that has
 not been proven. So it is still possible that factoring into primes
 is just as hard as the TSP, although it is thought to be unlikely
 (it would imply that NP = coNP).

It seems it has been proven recently to be in P:
http://crypto.cs.mcgill.ca/~stiglic/PRIMES_P_FAQ.html#PRIMES

Brent Meeker
One cannot guess the real difficulties of a problem before
having solved it.
   --- Carl Ludwig Siegel




Re: Quantum Probability and Decision Theory

2002-12-30 Thread Hans Moravec
Brent Meeker:

It seems [factoring] has been proven recently to be in P:
http://crypto.cs.mcgill.ca/~stiglic/PRIMES_P_FAQ.html#PRIMES


No, that's primality testing, which has always been
much easier than factoring.
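
A minimal sketch of the distinction (illustration only; the probabilistic
Miller-Rabin test below stands in for the deterministic AKS polynomial-time
primality test behind the PRIMES-in-P announcement, and trial division
stands in for factoring, for which no polynomial-time classical algorithm
is known):

# Illustrative sketch only.
import random

def is_probable_prime(n, rounds=20):
    # Miller-Rabin: runs in time polynomial in the bit-length of n.
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def trial_division(n):
    # Naive factoring: about sqrt(n) steps, i.e. exponential in the digit count.
    f, factors = 2, []
    while f * f <= n:
        while n % f == 0:
            factors.append(f)
            n //= f
        f += 1
    if n > 1:
        factors.append(n)
    return factors

n = 104729 * 104723                # an ~11-digit composite with two large-ish factors
print(is_probable_prime(n))        # fast answer: False (n is composite)
print(trial_division(n))           # already slower; hopeless for 300-digit numbers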




algorithmic complexity theory

2002-12-30 Thread vznuri

Hi all. I've been poking at computational complexity theory to various
degrees for close to a decade.

The recent comments on the subject are interesting, and it's not
surprising the topic has popped up on this list.

I believe complexity theory is going to shed some deep new light on
physics, and has already done so. One area of intense study is the
satisfiability phase transition, which has many deep analogies to physics
and thermodynamics under very active investigation. I can cite some refs
if there is interest; a small toy illustration is sketched below.
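
Here is that toy illustration (my own sketch, not a reference): random
3-SAT instances flip from almost always satisfiable to almost always
unsatisfiable as the clause-to-variable ratio passes roughly 4.27, and
search cost peaks near the threshold. At the tiny sizes used below the
transition is blurred but still visible.

# Illustrative sketch only.
import random
from itertools import product

def random_3sat(n_vars, n_clauses, rng):
    # Each clause: three distinct variables, each negated with probability 1/2.
    clauses = []
    for _ in range(n_clauses):
        chosen = rng.sample(range(1, n_vars + 1), 3)
        clauses.append(tuple(v if rng.random() < 0.5 else -v for v in chosen))
    return clauses

def satisfiable(n_vars, clauses):
    # Brute force over all 2^n assignments: exponential, but fine for tiny n.
    for assignment in product([False, True], repeat=n_vars):
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

rng = random.Random(0)
n = 12
for ratio in (2.0, 3.0, 4.0, 4.3, 5.0, 6.0):
    m = int(ratio * n)
    sat = sum(satisfiable(n, random_3sat(n, m, rng)) for _ in range(30))
    print("ratio %.1f: %2d/30 satisfiable" % (ratio, sat))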

One of the deepest analogies yet to be explored, I suspect, is that of
entropy. In physics, entropy is a measure of disorder. I suspect that in
complexity theory the hardness of a problem is an analog of entropy: the
more disorder involved in the computational function, the more difficult
it is.

Another thing to point out here: while there is no proof that P != NP,
there are some good results in computational complexity theory separating
other complexity classes. It's a very active area of research.

One of my favorite results, which I've been getting into lately, is the
proof that some problems require exponential-size circuits when only AND
and OR gates are allowed, i.e. monotone circuits (Razborov, 1985, for
which he won the Nevanlinna Prize). If a comparable lower bound could be
proven for an NP-complete problem over general circuits with AND, OR, and
NOT gates, then P != NP.

Hans Moravec (wow, welcome to the list!!) writes:
By the way, it is known that factoring into
primes is easier than the TSP. 

As HF pointed out, factoring has not been proven to be NP-complete,
nor has it been proven to be easier than the NP-complete problems
(it is conjectured to be easier, i.e. not NP-hard), in any sense
as far as I know. This is definitely a very cutting-edge
area of research. Shor's quantum polynomial-time factoring algorithm is
definitely one of the very important breakthroughs in this area.

Let us note some of the other key open questions. It is
not known whether quantum computers can solve NP-complete problems
in P time in general. It is not known how to convert an arbitrary
algorithm into a quantum algorithm most efficiently, although
there are hints of where quantum speedups come from (Grover's
unsorted-database search, Shor's factoring algorithm, etc.).

Re: QM computing, I highly recommend Julian Brown's outstanding book
"The Quest for the Quantum Computer", which I recently finished;
much good food for thought for anyone on this list.

Many of these themes can be found in the revised theory-edge FAQ v2.0
(QM computers, the satisfiability problem, complexity theory, etc.):

http://groups.yahoo.com/group/theory-edge/message/6585/







Re: Quantum Probability and Decision Theory

2002-12-30 Thread Stephen Paul King
Dear Hans, Brent and Friends,

I submit this paper by S. Wolfram on intractability and undecidability
that you all might find of interest:

http://www.stephenwolfram.com/publications/articles/physics/85-undecidability/2/text.html


Please note the part that discusses coordinate reparametrizations
between finitely specified 4-manifolds ... ;-)

Kindest regards,


Stephen

- Original Message -
From: Hans Moravec [EMAIL PROTECTED]
To: Brent Meeker [EMAIL PROTECTED]; Everything-list
[EMAIL PROTECTED]
Cc: Hans Moravec [EMAIL PROTECTED]
Sent: Monday, December 30, 2002 9:39 PM
Subject: Re: Quantum Probability and Decision Theory


 Brent Meeker:
  It seems [factoring] has been proven recently to be in P:
  http://crypto.cs.mcgill.ca/~stiglic/PRIMES_P_FAQ.html#PRIMES

 No, that's primality testing, which has always been
 much easier than factoring.







Re: Quantum Probability and Decision Theory

2002-12-30 Thread Stephen Paul King
Dear Joao,

Interleaving.

- Original Message -
From: Joao Leao [EMAIL PROTECTED]
To: Ben Goertzel [EMAIL PROTECTED]
Cc: Hal Finney [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Monday, December 30, 2002 2:11 PM
Subject: Re: Quantum Probability and Decision Theory


 There go 7 cents out of my 60!...

 The case indeed is that if you build a quantum computer by emulating
 a Turing-Universal Machine you are a priori circumscribing its own
 class of algorithms. That is only natural if that is the largest class of
 computable problems you think is worth considering. But it
 isn't necessarily the only one. This approach surfaces here and there
 in the literature. See for example:

 http://arXiv.org/abs/quant-ph/0205093


[SPK]

Nice paper! I will be adding this one to my homework. Thank you! What we
need is a good general definition, which we can all agree on, of what
exactly a QM computation is.

 Another point worth making is that it seems unlikely that the recourse
 to the infinite superposability of quantum states is going to be of any
 help in this circumstance. It may be more profitable to look to
 entanglement (which incidentally is the true novelty that QC brings
 to the realm of computation) as the road to a trans-Turing class of
 computations.


[SPK]

Entanglement is a somewhat involved notion. See this paper:

http://www.arxiv.org/abs/quant-ph/0201143
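
For a concrete toy picture of what entanglement adds (my own sketch, not
from either paper): the two-qubit Bell state (|00> + |11>)/sqrt(2) cannot
be written as a product of single-qubit states; tracing out one qubit
leaves the other maximally mixed.

# Illustrative sketch only.
import numpy as np

# Amplitudes for |00>, |01>, |10>, |11>: the Bell state (|00> + |11>)/sqrt(2).
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Reduced density matrix of the first qubit (partial trace over the second).
psi = bell.reshape(2, 2)            # psi[a, b] = amplitude of |a>|b>
rho_first = psi @ psi.conj().T

print(rho_first)                              # 0.5 * identity: maximally mixed
print(np.trace(rho_first @ rho_first).real)   # purity 0.5 < 1 => entangled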


 As to your reference to Penrose, Ben, I should probably add that
 his much maligned ideas concerning the possibility of using Quantum
 Gravity as a basis for understanding the psychology of mathematical
 invention are perhaps worth a second look now that we are learning a
 good deal more about quantum information in Black Holes etc...

[SPK]

I am a deep admirer of Penrose. It was his ideas that awoke me to QM
comp as a possible way of modeling the psyche.

Kindest regards,

Stephen





polynomial vs exponential time problems clarification

2002-12-30 Thread vznuri
Hi all. Re: the exponential vs. polynomial time thread.
IMHO HF's comments could be interpreted as roughly correct,
but they were stated in a rather confusing way, I would say, hence
the ensuing confusion. Let's give this another shot.

There are no problems for which an exponential-time **lower bound** has
been proven, except for those that also require exponential space (for
which the exponential-time lower bound is trivial, since a machine needs
at least one step per tape square it uses). (Space is the number of tape
squares used by a TM; time is the number of steps.) This of course is
quite frustrating, even embarrassing, to researchers, and a gaping open
problem in the theory.

The big P != NP question would be settled (with P != NP) by showing that
some problem in NP has an exponential-time lower bound, i.e. that it
cannot be solved any faster than exponential time.

Literally, there are many problems for which it has been proven that they
can be verified in P time and can be solved in exponential time.

In fact every NP-complete problem has this property. What has not been
proven, and is not known, is that exponential time is also a **lower
bound** for any of them. (A small illustration of this verify-vs-solve
gap is sketched below.)
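
Here is the small illustration promised above (my own sketch): for
subset-sum, an NP-complete problem, checking a proposed certificate takes
time polynomial in the input size, while the obvious solver enumerates up
to 2^n subsets.

# Illustrative sketch only.
from itertools import combinations

def verify(numbers, target, indices):
    # Polynomial-time check: distinct positions whose values sum to the target.
    return len(set(indices)) == len(indices) and \
           sum(numbers[i] for i in indices) == target

def solve(numbers, target):
    # Brute-force solver: tries all 2^n index subsets (exponential time).
    for k in range(len(numbers) + 1):
        for indices in combinations(range(len(numbers)), k):
            if sum(numbers[i] for i in indices) == target:
                return list(indices)
    return None

numbers = [3, 34, 4, 12, 5, 2]
target = 9
cert = solve(numbers, target)              # e.g. positions [2, 4] -> 4 + 5
print(cert, verify(numbers, target, cert))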

So it can be very confusing if someone says: there are no problems that
are known to be checkable/verifiable in P time but to "take" exponential
time. The term "take" must be used very carefully; IMHO it should be
avoided as just too ambiguous. Sometimes people mean it as a lower bound
(as HF did below), and sometimes it just means that a solution exists at
that speed (as I wrote above).

Forget NP-complete for a minute, and suppose I have a problem that is
solvable in P time. Does it "take" exponential time? In one sense, yes:
there also exist algorithms that run in exponential time to solve it. But
in another sense, no: exponential time is not a lower bound, since the
problem can already be solved in polynomial time.

Also note that the definition of NP in terms of verification in P time is
stated with respect to a regular deterministic TM, not a nondeterministic
one. The sense in which NP can be defined in terms of nondeterministic
TMs is: it is the set of problems that can be solved by nondeterministic
TMs in P time.

that straightens it all out, right???

There are many different ways to look at the P vs. NP problem; in this
way it is like Gödel's problem, and this can lead to "knowledge" in the
sense that a little knowledge is a dangerous thing...

HF writes
 On 31-Dec-02, Hal Finney wrote:
  One correction, there are no known problems which take exponential
  time but which can be checked in polynomial time. If such a problem
  could be found it would prove that P != NP, one of the greatest
  unsolved problems in computability theory.

 What about Hamiltonian circuits or factoring an integer or roots of a
 Diophantine equation?

I don't believe any of those are known to take exponential time.  For all
we know a polynomial time solution may yet be found for all of these.

HM writes

Communications glitch here.  The definition
of NP is problems that can be solved in
polynomial time on a nondeterministic
machine, that is one that can test
simultaneously all candidate solutions
(effectively by creating exponentially
many processes for testing all possible
combinations of undetermined variables,
each individual combination taking polynomial
time to check)