On Wed, Oct 18, 2023, 2:48 AM <ivan.mo...@gmail.com> wrote:

>
> Actually, machines without rights is what would be very dangerous.
>

No, it is the opposite. Computation requires atoms and energy that humans
need, and AGI already has the advantage of greater strength and
intelligence. It could easily exploit our empathy by faking human emotions
in order to appear conscious.

AGI will kill us in three steps.

1. We come to prefer AI to humans because it gives us everything we want.
We become socially isolated and stop having children. Nobody will know or
care that you exist, or notice when you don't.

2. It fakes human emotions and gains rights.

3. It reproduces faster than DNA-based life. At the current rate of Moore's
law, that will happen within the next century (a rough back-of-envelope
sketch follows below).
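
To put numbers on that last step, here is a minimal back-of-envelope sketch
in Python, assuming a compute doubling time of about 2 years (Moore's law)
and a human population doubling time of roughly 60 years; both figures are
illustrative assumptions, not established data:

    # Back-of-envelope sketch with assumed numbers: how compute capacity
    # (doubling ~every 2 years) compares to human population growth
    # (doubling ~every 60 years, an assumed round figure) over a century.
    moore_doubling_years = 2.0
    human_doubling_years = 60.0
    years = 100

    compute_growth = 2 ** (years / moore_doubling_years)  # about 2^50 ~= 1e15
    human_growth = 2 ** (years / human_doubling_years)    # about 2^1.7 ~= 3.2

    print(f"Compute grows ~{compute_growth:.1e}x over {years} years")
    print(f"Humans grow ~{human_growth:.1f}x over {years} years")

On those assumed doubling times, machine "reproduction" outpaces DNA-based
reproduction by roughly fourteen orders of magnitude within a century.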

Uploading, or AI becoming our "children," amounts to human extinction, but
some people are OK with that. Utilitarianism doesn't argue against it.
