In reply to  Jed Rothwell's message of Fri, 17 Feb 2023 08:42:35 -0500:
Hi,

When considering whether or not it could become dangerous, there may be no
difference between simulating emotions and actually having them.

>Robin <mixent...@aussiebroadband.com.au> wrote:
>
>
>> It's not bonkers, it's lonely. M$ have broken the golden rule of AI and
>> given it a pseudo human personality, and a sense
>> of self. Apparently they learned nothing from "Terminator".
>>
>
>Ha, ha! Seriously, it does not actually have any real intelligence or sense
>of self. Future versions of AI may have these things, but this is just a
>glorified indexing system. A researcher ran an earlier version of this on a
>laptop computer, which has no more intelligence than an earthworm, as she
>put it. See the book "You Look Like a Thing and I Love You." The danger is
>that ChatGPT *looks like* it is sentient, and deluded people may think it
>is, and think it has fallen in love with them.
>
>A beehive is a marvel of structural engineering, but a bee knows nothing
>about structural engineering. I do not think the hive as a whole resembles
>the collection of cells and synapses in the brain that give rise to
>the meta-phenomenon of intelligence. Granted, the hive is capable of
>behavior far more complex than solitary bees such as carpenter bees.
Cloud storage:-

Unsafe, Slow, Expensive 

...pick any three.
