On Saturday, August 16, 2014 11:26:08 PM UTC+10, jessem wrote:
> On Sat, Aug 16, 2014 at 12:48 AM, Pierz <pie...@gmail.com> wrote:
> On Saturday, August 16, 2014 2:28:32 PM UTC+10, jessem wrote:
> On Fri, Aug 15, 2014 at 11:09 PM, meekerdb <meek...@verizon.net> wrote:
> On 8/15/2014 5:30 PM, Jesse Mazer wrote:
> On Fri, Aug 15, 2014 at 1:27 AM, Russell Standish <li...@hpcoders.com.au> wrote:
> On Thu, Aug 14, 2014 at 09:41:00PM -0700, meekerdb wrote:
> > On 8/14/2014 8:32 PM, Russell Standish wrote:
> > >On Thu, Aug 14, 2014 at 08:12:30PM -0700, meekerdb wrote:
> > >>That does seem strange, but I don't know that it strikes me as
> > >>*absurd*.  Isn't it clearer that a recording is not a computation?
> > >>And so if consciousness supervened on a recording it would prove
> > >>that consciousness did not require computation?
> > >>
> > >To be precise "supervening on the playback of a recording". Playback
> > >of a recording _is_ a computation too, just a rather simple one.
> > >
> > >In other words:
> > >
> > >#include <stdio.h>
> > >int main()
> > >{
> > >   printf("hello world!\n");
> > >   return 1;
> > >}
> > >
> > >is very much a computer program (and a playback of a recording of the
> > >words "hello world" when run). I could change "hello world" to the
> > >contents of Wikipedia, to illustrate the point more forcibly.
> > OK.  So do you think consciousness supervenes on such a simple
> > computation - one that's functionally identical with a recording? Or
> > does instantiating consciousness require some degree of complexity
> > such that CC comes into play?
> >
> My opinion on whether the recording is conscious or not ain't worth a penny.
> 
> Nevertheless, the definition of computational supervenience requires
> counterfactual correctness in the class of programs being supervened on.
> 
> AFAICT, the main motivation for that is to prevent recordings being conscious.
> 
> I think it is possible to have a different definition of when a computation
> is "instantiated" in the physical world that prevents recordings from being
> conscious, a solution which doesn't actually depend on counterfactuals at
> all. I described it in the post at
> http://www.mail-archive.com/everything-list@googlegroups.com/msg16244.html
> (or https://groups.google.com/d/msg/everything-list/GC6bwqCqsfQ/rFvg1dnKoWMJ
> on google groups). Basically the idea is that in any system following
> mathematical rules, including both abstract Turing machines and the physical
> universe, everything about its mathematical structure can be encoded as a
> (possibly infinite) set of logical propositions. So if you have a Turing
> machine running whose computations over some finite period are supposed to
> correspond to a particular "observer moment", you can take all the
> propositions dealing with the Turing machine's behavior during that period
> (propositions like "on time-increment 107234320 the read/write head moved to
> square 2398311 and changed the digit there from 0 to 1, and changed its
> internal state from M to Q"), and look at the structure of logical relations
> between them (like "propositions A and B together imply proposition C;
> propositions B and C together do not imply A", etc.). Then for any other
> computation or even any physical process, you can see if it's possible to
> find a set of propositions with a completely *isomorphic* logical structure.
> 
> But physical processes don't have *logical* structure.  Theories of physical
> processes do, but I don't think that serves your purpose.
> 
> Propositions about physical processes have a logical structure, don't they? 
> And wouldn't such propositions--if properly defined using variables that 
> appear in whatever the correct fundamental theory turns out to be--have 
> objective truth-values?
> 
> Also, would you say physical processes don't have a mathematical structure? 
> If you would say that, what sort of "structure" would you say they *do* have, 
> given that we have no way of empirically measuring any properties other than 
> ones with mathematical values? Any talk of physical properties beyond 
> mathematical ones gets into the territory of some kind of "thing-in-itself" 
> beyond all human comprehension.
> 
> And even restricting the domain to Turing machines, I don't see what
> proposition A and proposition B are?
> 
> They could be propositions about basic "events" in the course of the 
> computation--state changes of the Turing machine and string on each 
> time-step, like the example I gave "on time-increment 107234320 the 
> read/write head moved to square 2398311 and changed the digit there from 0 to 
> 1, and changed its internal state from M to Q". There would also have to be 
> propositions for the general rules followed by the Turing machine, like "if 
> the read/write head arrives at a square with a 1 and the machine's internal 
> state is P, change the 1 to a 0, change the internal state to S, and advance 
> along the tape by 3 squares".
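
To make those two kinds of propositions concrete, here is a toy sketch of my
own (the machine, states, tape and wording below are invented for
illustration, not taken from Jesse's linked post). It just prints the
transition table as "rule" propositions and the run as one "event"
proposition per time-step; the further step of checking whether some other
process admits a proposition set with an isomorphic implication structure is
left aside.

#include <stdio.h>

int main(void)
{
    /* Hypothetical machine for illustration only: state P flips bits and
       moves right until it reads a blank ('_'), then halts (state H).    */
    char tape[] = "1011_";   /* invented initial tape      */
    int head = 0;            /* read/write head position   */
    char state = 'P';        /* internal state             */

    /* "Rule" propositions: the transition table stated in words. */
    printf("RULE: in state P reading '1', write '0', move right, stay in P\n");
    printf("RULE: in state P reading '0', write '1', move right, stay in P\n");
    printf("RULE: in state P reading '_', halt (go to state H)\n\n");

    /* "Event" propositions: one per time-step, in the same spirit as
       "on time-increment t the read/write head moved to square s and
       changed the digit there from x to y".                           */
    for (int t = 0; state != 'H'; t++) {
        char read = tape[head];
        if (read == '_') {
            printf("EVENT: at step %d, in state P, read '_' at square %d and halted\n",
                   t, head);
            state = 'H';
        } else {
            char wrote = (read == '1') ? '0' : '1';
            printf("EVENT: at step %d, changed square %d from '%c' to '%c' and moved right\n",
                   t, head, read, wrote);
            tape[head] = wrote;
            head++;
        }
    }
    return 0;
}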
> 
> Aren't they just the transition diagram of the Turing machine?  So if the
> Turing machine goes thru the same set of states, that set defines an
> equivalence class of computations.  But what about a different Turing machine
> that computes the same function?  It may not go thru the same states even for
> the same input and output.  In fact there is one such Turing machine that
> just executes the recording.  Right?
> 
> What I'm imagining here is that if there is a true mathematical theory of 
> consciousness of the kind David Chalmers imagines, it would define distinct 
> observer-moments in terms of distinct logical networks, not merely in terms 
> of "functions" defined solely in terms of input-output relations. Obviously I 
> can't prove this, but the advantage is that it would preserve most of the 
> features that make computational theories of mind appealing--see the 
> discussion by Chalmers at http://consc.net/papers/qualia.html of problems 
> that arise when you reject computationalism and imagine that the particular 
> type of matter doing the computation is important for conscious experience 
> (as John Searle supposes), for instance--while at the same time avoiding the 
> types of pitfalls about 'instantiation' pointed out by the movie-graph 
> argument and Maudlin's Olympia. 
> 
> If correct input-output relations were sufficient for consciousness, then 
> lookup tables--which can just be giant libraries of previously recorded 
> computations, showing responses of the computed being (a mind upload, say) to 
> all possible series of inputs, starting from a given initial state--would 
> have to be conscious too. But my hunch is that replaying a recording from a 
> lookup table doesn't count as an "instantiation" of the observer-moment which 
> is being replayed, and doesn't increase its measure.
> 
> Lookup tables are just computations performed in the past and stored for 
> reasons of convenience.
> 
> Yes, that was my point, that I think it's a mistake to define 
> observer-moments solely in terms of computations defined as *functions* which 
> relate certain inputs to certain outputs--I think the actual computational 
> process used in going from input to output would matter in terms of what is 
> experienced. My alternate definition in terms of the logical structure of a 
> computation would take care of that.
> 
>  If a device employed lookup tables correctly for all possible inputs, then 
> it would be conscious (according to comp) because the time at which the 
> computation was carried out is irrelevant. And there is a further point to be 
> made. The device must choose which lookup table to consult. Therefore it must 
> be intelligent (make if-then decisions). And the kicker is that if it's to 
> be truly as responsive in its use of those tables as a device not using them, 
> then it will end up having to do just as much computation finding the right 
> table as it would just to perform the calculation! There is no free lunch 
> here. I assert this confidently on the basis of my intuitions as a 
> programmer, without being able to rigorously prove it, but a short thought 
> experiment should get halfway to proving it. Imagine a lookup table of all 
> possible additions of two numbers up to some number n. First I calculate all 
> the possible results and put them into a large n by n table. Now I'm asked 
> what is the sum of say 10 and 70. So I go across to row 10 and column 70 and 
> read out the number 80. But in doing so, I've had to count to 10 and to 70! 
> So I've added the two numbers together just in finding the correct value to look up! 
> I'm sure the same equivalence could be proven to apply in all analogous 
> situations.
> 
> I think you're being misled by the particular example you chose involving 
> addition; in general there is no principle that says finding the appropriate 
> entry in a lookup table involves a computation just as complicated as the 
> original computation without a lookup table. Suppose instead of addition, the 
> lookup table is based on a Turing test type situation where an intelligent AI 
> is asked to respond to textual input, and the lookup table is created by 
> doing a vast number of runs, all starting from the same initial state but 
> feeding the AI *all* possible strings of characters under a certain length 
> (the vast majority will just be nonsense of course). Then all the possible 
> input strings can be stored alphabetically, and if I interact with the lookup 
> table by typing a series of comments to the AI, it just has to search through 
> the recordings alphabetically to find one where the AI responded to that 
> particular comment (after responding to my previous comments which constitute 
> the earlier parts of the input string); it doesn't need to re-compute the 
> AI's brain processes or anything like that. And ultimately regardless of the 
> type of program, the "input" will be encoded as some string of 1's and 0's, 
> so for *all* lookup tables the possible input strings can be stored in 
> numerical order, analogous to alphabetical order for verbal statements.
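
As a rough sketch of that sorted-table point (the table entries and strings
below are made up): the recorded responses are keyed by the full input string,
sorted once when the table is built, and each later lookup is a binary search
over the keys, regardless of how much computation originally produced each
recorded response.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct entry {
    const char *input;      /* everything typed to the AI so far */
    const char *response;   /* the AI's recorded reply to it     */
};

static int cmp(const void *a, const void *b)
{
    return strcmp(((const struct entry *)a)->input,
                  ((const struct entry *)b)->input);
}

int main(void)
{
    struct entry table[] = {
        { "hello",              "Hi there."           },
        { "hello how are you",  "Fine, thanks."       },
        { "what is 10 plus 70", "80."                 },
        { "zzzz",               "I don't understand." },
    };
    size_t n = sizeof table / sizeof table[0];

    /* Sort once when the table is built (alphabetical/numerical order). */
    qsort(table, n, sizeof table[0], cmp);

    /* Looking up a conversation costs O(log n) comparisons, independent of
       the cost of the original runs that produced the recordings.        */
    struct entry key = { "what is 10 plus 70", NULL };
    struct entry *hit = bsearch(&key, table, n, sizeof table[0], cmp);

    printf("%s\n", hit ? hit->response : "(no recording for that input)");
    return 0;
}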

No, of course a lookup table can help, as I went on to say a few minutes later 
in a different reply when I realized the mistake. But I've explained in my 
longer reply to Liz what I was trying to say here. It depends on what level we 
wish to simulate to. A mere lookup table of outer behaviours such as speech 
acts won't be sufficient for a complete simulation. The more fine grained and 
responsive I wish to make my simulation, the more computation will be required 
to select the correct recordings, and the shorter and shallower the recordings 
will be. But read my reply to Liz. Hopefully I explain myself better there.
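
To put rough numbers on the "shorter and shallower" point, here is a
back-of-envelope sketch with an invented storage budget: as the number of
distinguishable symbols per time-step grows (finer-grained input), the depth
of input history a fixed-size table can cover shrinks quickly.

#include <stdio.h>
#include <math.h>

int main(void)
{
    double budget = 1e18;                 /* hypothetical number of table entries */
    int alphabets[] = { 2, 27, 256 };     /* bits, letters, raw bytes per step     */

    for (int i = 0; i < 3; i++) {
        int k = alphabets[i];
        /* deepest input history coverable: largest d with k^d <= budget */
        double depth = floor(log(budget) / log((double)k));
        printf("%3d symbols per step -> input histories covered only to depth ~%.0f\n",
               k, depth);
    }
    return 0;
}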
> 
> Jesse
