Re: [agi] AGI discussion group, Sep 10 7AM Pacific: Characterizing and Implementing Human-Like Consciousness

2021-09-17 Thread magnuswootton81
I'm just saying it makes no difference to me, AI or no AI. =) hehe
--
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-Maa29adc5517abebcebd65a53
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] AGI discussion group, Sep 10 7AM Pacific: Characterizing and Implementing Human-Like Consciousness

2021-09-16 Thread Boris Kazachenko
Yeah, let's get deeper into that bullshit. Because there's no real work to be
done, right?
--
Permalink: https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-Mb0ae68a6c814a412b382f763


Re: [agi] AGI discussion group, Sep 10 7AM Pacific: Characterizing and Implementing Human-Like Consciousness

2021-09-16 Thread Ben Goertzel via AGI
On Thu, Sep 16, 2021 at 12:56 AM  wrote:
>
> We are already replaced. God doesn't need any of us.


Dude. Each of us is a fractal image and portion of "God". Get your
eschatology straight, my friend ;)

--
Permalink: https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-Me2a13baf412a170cad0e4b35


Re: [agi] AGI discussion group, Sep 10 7AM Pacific: Characterizing and Implementing Human-Like Consciousness

2021-09-16 Thread magnuswootton81
We are already replaced. God doesn't need any of us.
--
Permalink: https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-Ma632c2ae5435ef5b0a6ba748


Re: [agi] AGI discussion group, Sep 10 7AM Pacific: Characterizing and Implementing Human-Like Consciousness

2021-09-15 Thread immortal . discoveries
His point was that even if there's a twin machine of you, you still don't want to
die; you want to stop the replacement.

We *will* make true AGI though; it will be just like us soon, maybe by 2030. It
will simply be made of different materials, that's all.

It will replace us for all tasks, e.g. coming up with medicines, microscopes,
HDDs, AI methods, etc.

I will make sure it replaces us. I'm hardcore.
--
Permalink: https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-M11b6f34b6daaeb2aaf1d0d73


Re: [agi] AGI discussion group, Sep 10 7AM Pacific: Characterizing and Implementing Human-Like Consciousness

2021-09-15 Thread WriterOfMinds
On Wednesday, September 15, 2021, at 9:08 AM, Matt Mahoney wrote:
> Here is a robot that looks and acts like you as far as anyone can tell, 
> except that it is younger, healthier, stronger, smarter, upgradable, immortal 
> through backups, and it has super powers like infrared vision and wireless 
> internet. Here is a gun. Would you like to shoot yourself now to complete the 
> upload?

Why on earth would I shoot myself? If one of me in the world is a good thing, 
two is even better. My robot twin and I are instant best friends. If she takes 
my job, I just go do something else (I have far more things to do than time to 
do them!). We both enjoy our increased productivity and progress toward shared 
goals until I die of old age.
--
Permalink: https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-M0c58803b38fd300c824df28d


Re: [agi] AGI discussion group, Sep 10 7AM Pacific: Characterizing and Implementing Human-Like Consciousness

2021-09-15 Thread Matt Mahoney
Machines already do 99% of work, as measured by global economic
productivity relative to the price of food in 1800.
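A back-of-the-envelope reading of that claim (the 100x multiplier below is an illustrative assumption, not a figure from the thread): if machine-amplified output per human-hour today is roughly 100x the 1800 human-labor baseline, then human labor accounts for about 1% of current output, and machines the remaining 99%.

```python
# Illustrative arithmetic only; the 100x multiplier is an assumed figure.
baseline_1800 = 1.0   # output attributable to unaided human labor
total_today = 100.0   # same labor, amplified by machines

machine_share = (total_today - baseline_1800) / total_today
print(f"{machine_share:.0%}")  # -> 99%
```

Any multiplier of 100x or more yields a machine share of 99% or higher, which is one way to make the "machines already do 99% of work" claim concrete.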

And machines are not slaves. We abolished slavery because it was cruel to
humans. Machines are not human, even if we can make them look like humans,
pass the Turing test, and mimic suffering. Machines are tools that increase
human productivity and make our lives easier.

The risk of giving human rights to machines is that they will replace us
and we will go along with it. It doesn't eliminate other risks, like
self-replicating nanotechnology and engineered pathogens once everyone can buy
cheap 3D nanoscale printers.

Here is a robot that looks and acts like you as far as anyone can tell,
except that it is younger, healthier, stronger, smarter, upgradable,
immortal through backups, and it has super powers like infrared vision and
wireless internet. Here is a gun. Would you like to shoot yourself now to
complete the upload?

Which world do you want to live and die in?

On Tue, Sep 14, 2021, 6:05 PM TimTyler  wrote:

> On 2021-09-09 23:21, Matt Mahoney wrote:
> > It would be existentially dangerous to make AGI so much
> > like humans that we give human rights to competing
> > machines more powerful than us.
>
> Not having much in the way of human rights did not prevent slaves
> from thriving during the era of slavery. Machines will still be able
> to perform useful work without human rights. That's all they need
> to be able to take 50%, 90% and then 99% of all productive work.
> I think not awarding machines human rights will have at most a
> modest impact on the progress of the ongoing machine takeover.
>
> The idea that we can keep control of machines by not giving them
> rights seems very dubious to me. I don't think that it will be effective.
> We have tried using slavery before. It didn't work out the way we planned.
>

--
Permalink: https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-M09dd310d8574010eac1fd77f


Re: [agi] AGI discussion group, Sep 10 7AM Pacific: Characterizing and Implementing Human-Like Consciousness

2021-09-14 Thread TimTyler

On 2021-09-09 23:21, Matt Mahoney wrote:

It would be existentially dangerous to make AGI so much
like humans that we give human rights to competing
machines more powerful than us.


Not having much in the way of human rights did not prevent slaves
from thriving during the era of slavery. Machines will still be able
to perform useful work without human rights. That's all they need
to be able to take 50%, 90% and then 99% of all productive work.
I think not awarding machines human rights will have at most a
modest impact on the progress of the ongoing machine takeover.

The idea that we can keep control of machines by not giving them
rights seems very dubious to me. I don't think that it will be effective.
We have tried using slavery before. It didn't work out the way we planned.

--
Tim Tyler  http://timtyler.org/  t...@tt1.org
 



--
Permalink: https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-Med3682fe7b85adf4bbbf3d45


Re: [agi] AGI discussion group, Sep 10 7AM Pacific: Characterizing and Implementing Human-Like Consciousness

2021-09-09 Thread Matt Mahoney
If you program your AGI to positively reinforce input, learning, and
output, will it develop senses of qualia, consciousness, and free will? I
mean in the sense that it is motivated like we are to preserve the reward
signal by not dying. Do we need this in AGI, or can it learn a model of the
human mind that has these signals without having the signals itself?
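The reward-driven setup described above can be sketched as a toy (this is purely illustrative; the class and numbers are made up, not anyone's actual AGI design): an agent whose only motivation is an external reward signal, learned by simple positive reinforcement, with no built-in survival term anywhere.

```python
import random

class RewardDrivenAgent:
    """Toy agent: its only drive is an externally supplied reward signal.
    Nothing here models self-preservation; 'dying' would just be another
    zero-reward state, not something the agent is built to avoid."""

    def __init__(self, n_actions, epsilon=0.1):
        self.values = [0.0] * n_actions   # estimated reward per action
        self.counts = [0] * n_actions
        self.epsilon = epsilon            # exploration rate

    def act(self):
        # Mostly exploit the best-looking action, occasionally explore.
        if random.random() < self.epsilon:
            return random.randrange(len(self.values))
        return max(range(len(self.values)), key=lambda a: self.values[a])

    def learn(self, action, reward):
        # Positive reinforcement as an incremental mean: V += (r - V) / n
        self.counts[action] += 1
        self.values[action] += (reward - self.values[action]) / self.counts[action]

# Toy environment: action 1 pays off far more often than action 0.
random.seed(0)
agent = RewardDrivenAgent(n_actions=2)
for _ in range(1000):
    a = agent.act()
    r = 1.0 if random.random() < (0.8 if a == 1 else 0.2) else 0.0
    agent.learn(a, r)
```

Whether such a reward-maximizer would develop anything like qualia or a survival instinct, rather than just the behavior above, is exactly the open question here.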

I think it is possible and would be safer to build an AGI that can model
the survival instinct in humans without having a survival instinct itself.
It would be existentially dangerous to make AGI so much like humans that we
give human rights to competing machines more powerful than us.

On Thu, Sep 9, 2021, 4:11 PM Ben Goertzel via AGI 
wrote:

> Theme for discussion this week: Characterizing and Implementing
> Human-Like Consciousness
>
> See
>
> https://wiki.opencog.org/w/AGI_Discussion_Forum#Sessions
>
> URL for video-chat: https://singularitynet.zoom.us/my/benbot ...
>
> Background reading:
>
> https://www.researchgate.net/publication/275541457_Characterizing_Human-like_Consciousness_An_Integrative_Approach
>
> Pretty soon we will have some new Hyperon design documents/ideas to
> discuss in the AGI Discussion Group -- lots of progress occurring on
> that front, but much of it not quite yet fully baked enough for
> public-ish discussion ... so for this session we're going to explore
> some highly relevant but more general topics... ;)
>
>
> On Thu, Aug 26, 2021 at 9:22 AM Mike Archbold  wrote:
> >
> > Is there a discussion tomorrow?
> >
> > On 8/16/21, magnuswootto...@gmail.com  wrote:
> > > Brute forcing is about reducing the search in a way,  to spread it out
> > > further, the same amount of computation power.
> 
> 
> --
> Ben Goertzel, PhD
> http://goertzel.org
> 
“He not busy being born is busy dying” -- Bob Dylan

--
Permalink: https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-M740e3ee06af515a42129bd38


[agi] AGI discussion group, Sep 10 7AM Pacific: Characterizing and Implementing Human-Like Consciousness

2021-09-09 Thread Ben Goertzel via AGI
Theme for discussion this week: Characterizing and Implementing
Human-Like Consciousness

See

https://wiki.opencog.org/w/AGI_Discussion_Forum#Sessions

URL for video-chat: https://singularitynet.zoom.us/my/benbot ...

Background reading:
https://www.researchgate.net/publication/275541457_Characterizing_Human-like_Consciousness_An_Integrative_Approach

Pretty soon we will have some new Hyperon design documents/ideas to
discuss in the AGI Discussion Group -- lots of progress occurring on
that front, but much of it not quite yet fully baked enough for
public-ish discussion ... so for this session we're going to explore
some highly relevant but more general topics... ;)


On Thu, Aug 26, 2021 at 9:22 AM Mike Archbold  wrote:
>
> Is there a discussion tomorrow?
>
> On 8/16/21, magnuswootto...@gmail.com  wrote:
> > Brute forcing is about reducing the search in a way,  to spread it out
> > further, the same amount of computation power.



-- 
Ben Goertzel, PhD
http://goertzel.org

“He not busy being born is busy dying” -- Bob Dylan

--
Permalink: https://agi.topicbox.com/groups/agi/T5e30c339c3bfa713-Mcfc31ad8c8d6ac01e35f3152