Re: [agi] Re: deepmind-co-founder-suggests-new-turing-test-ai-chatbots-report-2023-6

2023-07-05 Thread Matt Mahoney
Let's assume the best case: that the various drugs proposed to slow aging by mimicking the metabolism-slowing effects of calorie restriction (resveratrol, metformin, etc.) actually work in humans, have no long-term side effects, and everyone starts taking them as a lifetime regimen from birt

Re: [agi] Re: deepmind-co-founder-suggests-new-turing-test-ai-chatbots-report-2023-6

2023-07-05 Thread Rob Freeman
On Wed, Jul 5, 2023 at 7:05 PM Matt Mahoney wrote: >... > LLMs do have something to say about consciousness. If a machine passes the > Turing test, then it is conscious as far as you can tell. I see no reason to accept the Turing test as a definition of consciousness. Who ever suggested that? E

Re: [agi] Re: deepmind-co-founder-suggests-new-turing-test-ai-chatbots-report-2023-6

2023-07-05 Thread immortal . discoveries
Exponential is nice, it's a line of a sort, steady, black and white. But life isn't all 1+1. AGI will be robotic, it will suddenly exist. It will have freezable memory. It will have a weird yellow dot on its pinky toe, who knows. Life is weird. It isn't a line or a curve so nicely. Humans may not

Re: [agi] Re: deepmind-co-founder-suggests-new-turing-test-ai-chatbots-report-2023-6

2023-07-05 Thread Matt Mahoney
I am still on the Hutter prize committee and just recently helped evaluate a submission. It uses 1 GB of text because that is roughly how much a human can process over a lifetime. We have much larger LLMs, of course. Their knowledge is equivalent to thousands or millions of humans, which makes them much mo
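Matt's figure of 1 GB as a lifetime of language exposure can be sanity-checked with a back-of-envelope calculation. A minimal sketch; the rates used below (words per minute, hours per day of exposure, bytes per word, lifespan) are illustrative assumptions, not figures stated anywhere in the thread:

```python
# Rough sanity check of the "1 GB of text per human lifetime" figure.
# Every rate here is an assumed round number, not data from the thread.

WORDS_PER_MIN = 150   # typical conversational speech rate (assumed)
HOURS_PER_DAY = 2     # hours of language exposure per day (assumed)
BYTES_PER_WORD = 6    # average English word plus a space (assumed)
YEARS = 80            # assumed lifespan

words = WORDS_PER_MIN * 60 * HOURS_PER_DAY * 365 * YEARS
total_bytes = words * BYTES_PER_WORD

print(f"{words:,} words ≈ {total_bytes / 1e9:.1f} GB")
```

Under these assumptions the total comes out at a few gigabytes, i.e. within a small factor of the 1 GB (enwik9-scale) corpus the Hutter prize uses, which is the point of the comparison.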

Re: [agi] Re: deepmind-co-founder-suggests-new-turing-test-ai-chatbots-report-2023-6

2023-07-05 Thread Rob Freeman
On Thu, Jul 6, 2023 at 3:51 AM Matt Mahoney wrote: > > I am still on the Hutter prize committee and just recently helped evaluate a > submission. It uses 1 GB of text because that is how much a human can process > over a lifetime. We have much larger LLMs, of course. Their knowledge is > equiva