Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-03 Thread Matt Mahoney
--- "Edward W. Porter" <[EMAIL PROTECTED]> wrote: > If bliss without intelligence is the goal of the machines you imaging > running the world, for the cost of supporting one human they could > probably keep at least 100 mice in equal bliss, so if they were driven to > maximize bliss why wouldn't th

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-03 Thread Jiri Jelinek
Matt, Create a numeric "pleasure" variable in your mind, initialize it with a positive number and then keep doubling it for some time. Done? How do you feel? Not a big difference? Oh, keep doubling! ;-)) Regards, Jiri Jelinek On Nov 3, 2007 10:01 PM, Matt Mahoney <[EMAIL PROTECTED]> wrote: > ---
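A literal rendering of Jiri's thought experiment, added only as an illustration (the variable name and loop count are assumptions, not taken from his mail): doubling a numeric "pleasure" value makes the number blow up, and after roughly a thousand doublings a C++ double overflows to infinity, yet nothing about the running process changes in any other way.

    // Minimal sketch of the doubling experiment (illustrative only).
    #include <cstdio>

    int main() {
        double pleasure = 1.0;             // the numeric "pleasure" variable
        for (int i = 0; i < 1100; ++i)     // keep doubling it for some time
            pleasure *= 2.0;               // overflows to infinity after ~1024 doublings
        std::printf("pleasure = %g\n", pleasure);  // prints "inf"; nothing is felt
        return 0;
    }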

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-04 Thread Matt Mahoney
--- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > Matt, > > Create a numeric "pleasure" variable in your mind, initialize it with > a positive number and then keep doubling it for some time. Done? How > do you feel? Not a big difference? Oh, keep doubling! ;-)) The point of autobliss.cpp is to illus
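Matt's autobliss.cpp itself is not reproduced in this archive, so the following is only a guessed-at sketch of the kind of program he describes later in the thread: a tiny agent that adjusts its responses under a reward/punishment signal and prints "ahh" or "ouch" accordingly. Every detail below (the XOR target, the learning rate, the update rule) is an assumption made for illustration, not his actual code.

    // Sketch of an autobliss-style reinforcement learner. NOT Matt Mahoney's
    // actual autobliss.cpp (not included in this archive); all details here
    // are assumptions.
    #include <cstdio>
    #include <cstdlib>

    int main() {
        double p1[4] = {0.5, 0.5, 0.5, 0.5};  // P(output = 1) for each 2-bit input
        const int target[4] = {0, 1, 1, 0};   // function being trained (XOR, as an example)

        for (int t = 0; t < 1000; ++t) {
            int input = std::rand() % 4;                               // random 2-bit input
            int output = (std::rand() / (double)RAND_MAX) < p1[input]; // stochastic response
            if (output == target[input]) {
                std::printf("ahh\n");                                  // reward signal
                p1[input] += output ? 0.1 : -0.1;                      // reinforce this response
            } else {
                std::printf("ouch\n");                                 // punishment signal
                p1[input] += output ? -0.1 : 0.1;                      // suppress this response
            }
            if (p1[input] < 0.0) p1[input] = 0.0;                      // clamp probabilities to [0,1]
            if (p1[input] > 1.0) p1[input] = 1.0;
        }
        return 0;
    }

After enough trials the "ahh" lines dominate regardless of what the strings happen to say, which is the behavioral point argued over later in the thread.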

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-04 Thread Russell Wallace
On 11/4/07, Matt Mahoney <[EMAIL PROTECTED]> wrote: > Let's say your goal is to stimulate your nucleus accumbens. (Everyone has > this goal; they just don't know it). The problem is that you would forgo > food, water, and sleep until you died (we assume, from animal experiments). We have no need

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-05 Thread Jiri Jelinek
Matt, We can compute behavior, but nothing indicates we can compute feelings. Qualia research is needed to figure out new platforms for uploading. Regards, Jiri Jelinek On Nov 4, 2007 1:15 PM, Matt Mahoney <[EMAIL PROTECTED]> wrote: > --- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > > > Matt, > > > >

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-05 Thread Matt Mahoney
--- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > Matt, > > We can compute behavior, but nothing indicates we can compute > feelings. Qualia research needed to figure out new platforms for > uploading. > > Regards, > Jiri Jelinek Of course you realize that qualia is an illusion? You believe that y

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-05 Thread Jiri Jelinek
>Of course you realize that qualia is an illusion? You believe that your environment is real, believe that pain and pleasure are real, "real" is meaningless. Perception depends on sensors and subsequent sensation processing. >believe that you can control your thoughts and actions, I don't. Seems

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-06 Thread Bob Mottram
I've often heard people say things like "qualia are an illusion" or "consciousness is just an illusion", but the concept of an illusion when applied to the mind is not very helpful, since all our thoughts and perceptions could be considered as "illusions" reconstructed from limited sensory data and

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-06 Thread Matt Mahoney
--- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > >Of course you realize that qualia is an illusion? You believe that > your environment is real, believe that pain and pleasure are real, > > "real" is meaningless. Perception depends on sensors and subsequent > sensation processing. Reality depends

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-09 Thread Jiri Jelinek
Matt, > > >believe that you can control your thoughts and actions, > > > > I don't. Seems unlikely. > > > > > and fear death > > > > Some people accept things they cannot (or don't know how to) change > > without getting emotional. > > > > >because if you did not have these beliefs you would not p

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-11 Thread Matt Mahoney
--- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > Matt, > > > >But logically you know that your brain is just a machine, or else AGI > would > > > not be possible. > > > > > > I disagree with your logic because human brain does things AGI does > > > not need to do AND the stuff the AGI needs to do doe

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-12 Thread Jiri Jelinek
On Nov 11, 2007 5:39 PM, Matt Mahoney <[EMAIL PROTECTED]> wrote: > > We just need to control AGIs goal system. > > You can only control the goal system of the first iteration. ..and you can add rules for its creations (e.g. stick with the same goals/rules unless authorized otherwise) > > > But

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-13 Thread Matt Mahoney
--- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > On Nov 11, 2007 5:39 PM, Matt Mahoney <[EMAIL PROTECTED]> wrote: > > > We just need to control AGIs goal system. > > > > You can only control the goal system of the first iteration. > > > ..and you can add rules for its creations (e.g. stick with t

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-13 Thread Richard Loosemore
Matt Mahoney wrote: --- Jiri Jelinek <[EMAIL PROTECTED]> wrote: On Nov 11, 2007 5:39 PM, Matt Mahoney <[EMAIL PROTECTED]> wrote: We just need to control AGIs goal system. You can only control the goal system of the first iteration. ..and you can add rules for its creations (e.g. stick with

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-13 Thread Matt Mahoney
--- Richard Loosemore <[EMAIL PROTECTED]> wrote: > Matt Mahoney wrote: > > --- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > > > >> On Nov 11, 2007 5:39 PM, Matt Mahoney <[EMAIL PROTECTED]> wrote: > We just need to control AGIs goal system. > >>> You can only control the goal system of the firs

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-14 Thread Richard Loosemore
Matt Mahoney wrote: --- Richard Loosemore <[EMAIL PROTECTED]> wrote: Matt Mahoney wrote: --- Jiri Jelinek <[EMAIL PROTECTED]> wrote: On Nov 11, 2007 5:39 PM, Matt Mahoney <[EMAIL PROTECTED]> wrote: We just need to control AGIs goal system. You can only control the goal system of the first

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-17 Thread Dennis Gorelik
Matt, Your algorithm is too complex. What's the point of doing step 1? Step 2 is sufficient. Saturday, November 3, 2007, 8:01:45 PM, you wrote: > So we can dispense with the complex steps of making a detailed copy of your > brain and then have it transition into a degenerate state, and just skip

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-17 Thread Matt Mahoney
--- Richard Loosemore <[EMAIL PROTECTED]> wrote: > Matt Mahoney wrote: > > --- Richard Loosemore <[EMAIL PROTECTED]> wrote: > > > >> Matt Mahoney wrote: > >>> --- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > >>> > On Nov 11, 2007 5:39 PM, Matt Mahoney <[EMAIL PROTECTED]> wrote: > >> We just

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-17 Thread Jiri Jelinek
Matt, >autobliss passes tests for awareness of its inputs and responds as if it has qualia. How is it fundamentally different from human awareness of pain and pleasure, or is it just a matter of degree? If your code has the feelings it reports, then reversing the order of the feeling strings (without
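Jiri's reversal argument made concrete against the sketch given earlier in this summary (again a hypothetical illustration, not the real autobliss.cpp): swap only the two report strings, and every branch, update, and learned behavior stays identical.

    // Hypothetical illustration of the string-reversal argument (not from the
    // real autobliss.cpp): the report strings are just labels with no effect
    // on the reinforcement update, so swapping them cannot turn pain into
    // pleasure unless the strings alone are where the feeling lives.
    #include <cstdio>

    int main() {
        const char* on_reward = "ouch";   // was "ahh" before the swap
        const char* on_punish = "ahh";    // was "ouch" before the swap
        for (int trial = 0; trial < 3; ++trial) {
            bool rewarded = (trial % 2 == 0);   // stands in for (output == target)
            std::printf("%s\n", rewarded ? on_reward : on_punish);
        }
        return 0;
    }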

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-18 Thread Mike Tintner
Jiri: If your code has the feelings it reports, then reversing the order of the feeling strings (without changing the logic) should magically turn its pain into pleasure and vice versa, right? The notions above - common in discussions here - are so badly in error. *Codes don't have emotions. *Compute

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-18 Thread Matt Mahoney
--- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > Matt, > > >autobliss passes tests for awareness of its inputs and responds as if it > has > qualia. How is it fundamentally different from human awareness of pain and > pleasure, or is it just a matter of degree? > > If your code has the feelings it re

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-18 Thread Jiri Jelinek
Matt, >Printing "ahh" or "ouch" is just for show. The important observation is that the program changes its behavior in response to a reinforcement signal in the same way that animals do. Let me remind you that the problem we were originally discussing was about qualia and uploading. Not just abo

RE: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-18 Thread Gary Miller
ldren. -Original Message- From: Matt Mahoney [mailto:[EMAIL PROTECTED] Sent: Sunday, November 18, 2007 5:32 PM To: agi@v2.listbox.com Subject: Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!) --- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > Matt, > > >

Re: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-18 Thread Matt Mahoney
--- Jiri Jelinek <[EMAIL PROTECTED]> wrote: > Matt, > > >Printing "ahh" or "ouch" is just for show. The important observation is > that > the program changes its behavior in response to a reinforcement signal in > the > same way that animals do. > > Let me remind you that the problem we were ori

RE: Introducing Autobliss 1.0 (was RE: [agi] Nirvana? Manyana? Never!)

2007-11-18 Thread Matt Mahoney
--- Gary Miller <[EMAIL PROTECTED]> wrote: > To complicate things further. > > A small percentage of humans perceive pain as pleasure > and prefer it at least in a sexual context or else > fetishes like sadomasochism would not exist. > > And they do in fact experience pain as a greater pleasure