On 05 Sep 2012, at 06:14, meekerdb wrote:

On 9/4/2012 7:19 PM, Russell Standish wrote:
On Tue, Sep 04, 2012 at 06:48:58PM -0700, Craig Weinberg wrote:
I have problems with all three of the comp assumptions:

*yes, doctor*: This is really the sleight of hand that props up the entire
thought experiment. If you agree that you are nothing but your brain
function and that your brain function can be replaced by the functioning of non-brain devices, then you have already agreed that human individuality is
a universal commodity.
Calling it a sleight of hand is a bit rough. It is the meat of the
comp assumption, and spelling it out this way makes it very
explicit. Either you agree you can be copied (without feeling a
thing), or you don't. If you do, you must face up to the consequences
of the argument; if you don't, then you do not accept
computationalism, and the consequences of the UDA do not apply to your
worldview.

I suppose I can be copied. But does it follow that I am just the computations in my brain? It seems likely that I also require an outside environment/world with which I interact in order to remain conscious. Bruno passes this off by saying it's just a matter of the level of substitution: perhaps your local environment or even the whole galaxy must be replaced by a digital representation in order to maintain your consciousness unchanged. But this bothers me. Suppose it is the whole galaxy, or the whole observed universe. Does it really mean anything then to say your brain has been replaced ALONG WITH EVERYTHING ELSE? It's just the assertion that everything is computable.

That's a good argument for saying that the level of substitution is not that low. But the reasoning would still go through, and we would be led to a unique computable universe. That is the only way to make a digital physics consistent (as I forget to say sometimes). Then you get a more complex "other mind problem", and something like the David Nyman-Hoyle beam would be needed, and it would need to be separate from the physical reality, making the big physical whole incomplete, etc. Yes, this bothers me too. Needless to say, I tend to believe that if comp is true, the level is much higher.





*Church thesis*: Views computation in isolation, irrespective of resources,
supervenience on object-formed computing elements, etc. This is a purely theoretical account of computation, completely divorced from realism from the start. What is it that does the computing? How and why does data enter
or exit a computation?
It is necessarily an abstract mathematical thesis. The latter two
questions are simply irrelevant.

*Arithmetical Realism*: The idea that truth values are self justifying independently of subjectivity or physics is literally a shot in the dark.
Like yes, doctor, this is really swallowing the cow whole from the
beginning and saying that the internal consistency of arithmetic
constitutes universal supremacy without any real indication of
that.
AR is not just about the internal consistency of mathematics; it is an
ontological commitment about the natural numbers. Whatever primitive
reality is, AR implies that the primitive reality models the natural
numbers.

ISTM that Bruno rejects any reality behind the natural numbers (or other system of computation). He often argues that the natural numbers exist because they satisfy true propositions: there exists a prime number between 1 and 3, therefore 2 exists. This assumes a Platonist view of mathematical objects, which Peter D. Jones has argued against.

? I would say that the contrary is true. It is because the natural numbers exist, and seem to obey laws like addition and multiplication, that true propositions can be made about them. 2 exists, and only 1 and 2 divide 2, so 2 is prime, and thus prime numbers exist. 2 itself exists just because Ex(x = s(s(0))) is true. Indeed, take x = s(s(0)), and the proposition follows from s(s(0)) = s(s(0)).
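To make the logical form explicit, here is a minimal sketch in Lean 4 (core library only). The rendering of "prime" as "greater than 1, with no divisor d such that 1 < d < p" is just one elementary formalization chosen for illustration; the point is only that each existential claim is proved by exhibiting its witness.

  -- "2 exists": Ex(x = s(s(0))) is witnessed by s(s(0)) itself, and the
  -- residual goal s(s(0)) = s(s(0)) is closed by reflexivity.
  example : ∃ x : Nat, x = Nat.succ (Nat.succ 0) :=
    ⟨Nat.succ (Nat.succ 0), rfl⟩

  -- "There is a prime between 1 and 3": the witness is 2, and the bounded
  -- arithmetic facts (2 > 1, 2 < 3, no d with 1 < d < 2 divides 2) are
  -- verified by computation.
  example : ∃ p : Nat, 1 < p ∧ p < 3 ∧ ∀ d, d < p → 1 < d → p % d ≠ 0 :=
    ⟨2, by decide⟩

In both cases the proof consists of nothing more than exhibiting the number and checking an identity or a finite computation.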

Bruno




Brent


In fact, for COMP and the UDA, Turing completeness of primitive reality is sufficient, but Bruno chose the natural numbers as his base reality because they are more familiar to his correspondents.

Wouldn't computers tend to be self-correcting by virtue of the pull toward arithmetic truth within each logic circuit? Where do errors come from?

Again, these two questions seem irrelevant.

Craig





http://iridia.ulb.ac.be/~marchal/



