It seems that the Church-Turing thesis, which states that a universal Turing
machine can compute everything that is intuitively computable, enjoys
near-universal acceptance among computer scientists.

I really wonder why this is so, given that there are simple cases where we
can compute something that an arbitrary Turing machine cannot compute, using
a notion of computation that is not extraordinary at all (and quite relevant
in practice).
For example, suppose you have a universal Turing machine A that uses the
alphabet {1,0} and a universal Turing machine B that uses the alphabet
{-1,0,1}.
Now it is quite clear that machine A cannot directly answer any question
that relates to -1. For example, it cannot directly compute -1*-1=1. Machine
A can only be given an encoded input value and an encoded description of
machine B, and produce an output that is correct only under the right
decoding scheme.
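To make the encoding point concrete, here is a minimal sketch in Python (my
own illustration, using an arbitrary two-bit codebook that I made up):
machine A only ever manipulates binary strings, and its output is only "the
answer to -1*-1" under the chosen decoding.

    # Arbitrary codebook mapping B's alphabet {-1,0,1} to two bits each.
    ENCODE = {-1: "10", 0: "00", 1: "01"}
    DECODE = {bits: sym for sym, bits in ENCODE.items()}

    def encode(symbols):
        # Turn a sequence of ternary symbols into one binary string.
        return "".join(ENCODE[s] for s in symbols)

    def decode(bits):
        # Read the binary string back as ternary symbols, two bits at a time.
        return [DECODE[bits[i:i+2]] for i in range(0, len(bits), 2)]

    # B's computation: -1 * -1 = 1. A only ever sees binary strings.
    x, y = -1, -1
    in_1, in_2 = encode([x]), encode([y])   # what machine A is given: '10', '10'
    out = encode([x * y])                   # stand-in for A's binary output: '01'
    print(in_1, in_2, "->", out)            # 10 10 -> 01
    print(decode(out))                      # [1], but only under this codebook

Without the codebook, '01' is just a string over {1,0}; nothing in machine
A's behaviour by itself tells you that it means 1.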
But to me this already makes it clear that machine A is less computationally
powerful than machine B. Its input and output when emulating B only make
sense with respect to what machine B does if we already know what machine B
does, and if we know how we chose to reflect this in the input to machine A
(and in the interpretation of its output). Otherwise we have no way of even
saying whether it emulates something, or whether it is just doing some
particular computation on the alphabet {1,0}.
I realize that it all comes down to the notion of computation. But why do
most people choose such a weak notion of computation? By what reasonable
standard does machine B not compute something that A doesn't?
Saying that A can compute what B computes is like saying that the word
"orange" can express the same thing as the word "apple", because we can
encode "apple" as "orange". That is true in a very limited sense, but it
seems mad to treat it as the foundation of what it means for words to
express something (and the same goes for computation).
If we accept such trivial notions of computation, why not say that the
program "return input" emulates all Turing machines, because given the right
input it gives the right output (we just hand it the solution as input)?
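Taken to the extreme, here is a sketch of that point (again just my own
illustration): an identity program together with a perverse "encoding" that
does all the work up front.

    def identity(x):
        # The whole "machine": just return the input.
        return x

    def encode_input(machine, tape):
        # A perverse 'encoding' that runs the real computation itself and
        # hands the identity program the finished answer as its input.
        return machine(tape)

    # Under this convention, identity 'emulates' any machine you like:
    answer = identity(encode_input(lambda tape: sum(tape), [1, 2, 3]))
    print(answer)  # 6, but all the work happened in the encoding step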
I get that we can simply use the Church-Turing thesis as the definition of
what computation means. But why is it (mostly) treated as the one and only
correct notion of computation (especially in a computer science context)?
The only explanation I have is that it is dogma. To question it would change
too much and would be too "complicated" and uncomfortable. It would make
computation an irreducibly complex and relative notion, or (heaven forbid)
even an inherently subjective notion (computation from whose perspective?).