Re: [singularity] critiques of Eliezer's views on AI

2007-07-03 Thread MindInstance
Objective observers care only about the type of a person and whether it's instantiated, not about the fate of its instances (because, frankly, they're not aware of the difference between the type and an instance). But since I know better, I would be sad about dead instances. The point is whether

Re: [singularity] critiques of Eliezer's views on AI

2007-07-03 Thread Stathis Papaioannou
On 04/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote: That definition isn't accurate, because it doesn't match what we intuitively see as 'death'. 'Death' is actually fairly easy to define, compared to "good" or even "truth"; I would define it as the permanent destruction of a large portion of the i

Re: [singularity] critiques of Eliezer's views on AI

2007-07-03 Thread Tom McCabe
That definition isn't accurate, because it doesn't match what we intuitively see as 'death'. 'Death' is actually fairly easy to define, compared to "good" or even "truth"; I would define it as the permanent destruction of a large portion of the information that makes up a sentient being's mind. -

Re: [singularity] critiques of Eliezer's views on AI

2007-07-03 Thread Stathis Papaioannou
On 04/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote: Using that definition, everyone would die at an age of a few months, because the brain's matter is regularly replaced by new organic chemicals. I know that, which is why I asked the question. It's easy enough to give a precise and objective def

RE: [singularity] AI concerns

2007-07-03 Thread Tom McCabe
--- Sergey Novitsky <[EMAIL PROTECTED]> wrote: >> Governments do not have a history of realizing the power of technology before it comes on the market. > But this was not so with nuclear weapons... It was the physicists who first became aware of the power of nukes, and the physicists had t

RE: [singularity] AI concerns

2007-07-03 Thread Sergey Novitsky
Governments do not have a history of realizing the power of technology before it comes on the market. But this was not so with nuclear weapons... And with AGI, it's about something that has the potential to overthrow the world order (or at least the order within a single country). Would not the

Re: [singularity] critiques of Eliezer's views on AI

2007-07-03 Thread Tom McCabe
Using that definition, everyone would die at an age of a few months, because the brain's matter is regularly replaced by new organic chemicals. - Tom --- Stathis Papaioannou <[EMAIL PROTECTED]> wrote: > On 30/06/07, Heartland <[EMAIL PROTECTED]> wrote: > > Objective observers care only abo

RE: [singularity] AI concerns

2007-07-03 Thread Tom McCabe
--- "Sergey A. Novitsky" <[EMAIL PROTECTED]> wrote: > >> > >>Are these questions, statement, opinions, sound > bites or what? It seem a > >>bit of a stew. > Yes. A bit of everything indeed. Thanks for noting > the incoherency. > > >>> * As it already happened with nuclear > weapons, there ma

Re: [singularity] critiques of Eliezer's views on AI

2007-07-03 Thread Stathis Papaioannou
On 30/06/07, Heartland <[EMAIL PROTECTED]> wrote: Objective observers care only about the type of a person and whether it's instantiated, not about the fate of its instances (because, frankly, they're not aware of the difference between the type and an instance). But since I know better, I would

RE: [singularity] AI concerns

2007-07-03 Thread Sergey A. Novitsky
>> Are these questions, statements, opinions, sound bites or what? It seems a bit of a stew. Yes. A bit of everything indeed. Thanks for noting the incoherency. >>> * As it already happened with nuclear weapons, there may be treaties constraining AI development. >> Well we ha