Dear Brent,

  If there exist an infinite number of observers and similarities in the 1p
content of those observers are a priori possible, it follows that there will
be "regularities", as those are precisely the similarities that observers share.
  The "brain in a vat" thought experiment is an attempt to ask competent
questions as to what is reality. It also allows us to consider possible
relations between the complexity of the 1p content and resources required
to generate (computationally) said content.


On Sat, Jan 11, 2014 at 5:09 PM, meekerdb <meeke...@verizon.net> wrote:

>  On 1/10/2014 11:29 PM, Stephen Paul King wrote:
>
>  Dear Brent,
>
>    "Hmm?  Steven turns into a White Rabbit is not a *logical* contradiction,
> it's a *nomological* one.  If there's a transition from (t1,x1) to
> (t2,x2) it seems the only *logical* contradiction would be x2="Not x1 at
> t1."  "Logical" is a very weak condition; as far as I know it just means
> being consistent=(not every sentence is a theorem)."
>
>  nomological (adjective): relating to or denoting certain principles,
>  such as laws of nature, that are neither logically necessary nor
>  theoretically explicable, but are simply taken as true.
>
>
>    Right! It was a very crude and informal explanation.
>
>
> But it's not an explanation at all if it assumes regularities "such as
> laws of nature" because those are what we're trying to explain.
>
>
>  Things become, hopefully, clearer when one considers the scenario
> where there are many minds that are communicating/interacting while
> evolving. Interaction requires some level of similarity between the
> participants.
>   For example, if I were to experience a White Rabbit, what effects would
> it have to have on the others I interact with so that it would not affect
> their 1p content? I would say that it was a hallucination, maybe... We
> forget that what we experience of the world is not the world itself; it is
> our mind/brain's version of it. We have to take the capacity for
> hallucination into account when we think about what a mind is...
>   Can we not take as "true" what we experience? How can we know that it
> is not some controlled simulation? We need to answer Descartes' question:
> How do I know that I am not just a brain in a vat (or a computation running
> in some UD)?
>
>
> Sure, we can take our experiences as "true" in the sense that we know we
> have the experience.  Then we reason from there and hypothesize models of
> the world to make sense of our experiences.  But I think it makes no sense
> to speculate that I am a brain in a vat - what difference would it make?  My
> model for understanding my experiences could only include that by assuming
> much more about the vat, the stimulus to my brain, who made the
> vat, ... it's just a lot of extra complication with no predictive power.
>
> Brent
>



-- 

Kindest Regards,

Stephen Paul King

Senior Researcher

Mobile: (864) 567-3099

stephe...@provensecure.com

 http://www.provensecure.us/



