Re: It's too late to stop GPT4 now

2023-04-08 Thread Russell Standish
On Sat, Apr 08, 2023 at 03:11:47PM -0400, John Clark wrote:
> On Sat, Apr 8, 2023 at 8:19 AM Russell Standish  wrote:
> 
> 
> > Don't forget it requires a society of hundreds of millions of human-level
> > intelligences to make a GPT-4. And it takes a human-level intelligence
> > some 20 years to be able to make meaningful contributions to something
> > like GPT-4. Progress will therefore continue to be exponential for some
> > time to come. Only when superhuman intelligence is able to design itself
> > will hyperbolic progress begin.
> 
> 
> Although certainly extremely helpful, most areas of science require more
> than just a brilliant theoretician: they need experimental evidence, and so
> new knowledge in those fields will not grow at the same explosive rate as
> computer intelligence does. However, there are two fields that do not
> require experimental evidence and so should grow as rapidly as intelligence
> does: mathematics and software development, including smart software that
> can write even smarter software. And there are mountains of data on physics
> and biology that already exist, and there are almost certainly unknown gems
> hiding in there that nobody has spotted; with new mathematical techniques
> and better software they could be found.
>

Sure - I was trying to proffer some suggestions as to why Ray Kurzweil
suggested 25 years between attaining human-level computational ability
and the singularity. I haven't read his book, just summaries - maybe
someone who has could enlighten us.

BTW - I still think we haven't cracked the problem of open-ended
creativity, which is essential for something like the singularity to
occur, but recent developments have led me to believe it might be
achieved sooner rather than later. Ten years ago, I'd have said the
singularity wouldn't appear before 2070 (I probably did say so, though not
publicly). Now, I've brought that forward to the 2050s.


> 
> > It will also need to better the energy efficiency of human brains, and
> > it is still orders of magnitude away from that.
> 
> 
> Take a look at this video; it talks about Nvidia's new chip. With a data
> center using it, an AI system that had required 35 MW to run will need only
> 5 MW to do the same thing.
> 
> Nvidia's HUGE AI Breakthrough is Bigger Than ChatGPT

That is a sevenfold improvement, not quite one order of magnitude. My
understanding is that about 4-5 orders of magnitude are required
before machines can really take over the world. It will happen, but
at the present rate of exponential progress (classic Moore's law) that
will take 2-3 decades.
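
As a rough back-of-the-envelope sketch of that extrapolation (assuming a
Moore's-law-style doubling in efficiency every ~2 years; the doubling period
and the 4-5 order-of-magnitude target are assumptions, not measured figures):

import math

# Assumed inputs (illustrative only)
gap_orders = 4.5            # orders of magnitude still to close vs. the brain
doubling_years = 2.0        # classic Moore's-law doubling period

# 10**gap_orders == 2**n  =>  n = gap_orders * log2(10)
doublings = gap_orders * math.log2(10)
years = doublings * doubling_years
print(f"doublings needed: {doublings:.1f}")   # ~15
print(f"years at that pace: {years:.0f}")     # ~30

# For comparison, the 35 MW -> 5 MW chip improvement quoted above:
print(f"35 MW -> 5 MW = {math.log10(35/5):.2f} orders of magnitude")  # ~0.85

With 4 rather than 4.5 orders of magnitude the same arithmetic gives roughly
27 years, so "2-3 decades" is about where a straight Moore's-law extrapolation
lands.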

Current AI systems like GPT-4 require the resources of a small town of
several thousand people for training.

GPT-4 has about 800 billion parameters, IIUC. A human brain has over a
trillion synapses, so it's certainly getting close.
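
Just to put that ratio on paper (taking the 800-billion figure at face value
and "over a trillion" as a deliberately low bound for the brain; both numbers
are as stated above, not verified):

gpt4_params = 8e11       # ~800 billion parameters, as stated above (IIUC)
brain_synapses = 1e12    # "over a trillion synapses" -- a low-end figure
print(f"brain/GPT-4 ratio: {brain_synapses / gpt4_params:.2f}x")  # ~1.25x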


> 
> By the way, I think mathematicians and software developers will be the first
> to lose their jobs; perhaps they could be retrained as coal miners.
>

I don't think they'll be the first :). ATM, GPT systems seem to have
an enormous propensity to make shit up, but less skill at making shit
up that is correct. ISTM the creative arts might be the first area to
lose jobs.


> John K Clark    See what's on my new list at  Extropolis

-- 


Dr Russell Standish                      Phone 0425 253119 (mobile)
Principal, High Performance Coders       hpco...@hpcoders.com.au
                                         http://www.hpcoders.com.au




Re: Can Google Bard's Upgrade Outsmart GPT-4?

2023-04-08 Thread spudboy100 via Everything List
I just go by the dread word, "policy," and by whether things work or not, as
in whether they work well or not. The arti-womb is doable, is at hand, and is
an ethical step up for our species. It may even be profitable as well.
No imaginary teams of Reps is what I am concerned about.
As for the Reps, they'd rather lose elections than compromise or change. Thus
they did in 2022, and thus they will in 2024. It's more important to 'feel
moral' than it is to win and then problem-solve, is my estimation. The same
with your Democrats.
I may be an ideological mutation in all this, passing my memes to all in order
to mend the world?


-Original Message-
From: John Clark 
To: spudboy...@aol.com
Cc: everything-list@googlegroups.com 
Sent: Fri, Apr 7, 2023 4:33 pm
Subject: Re: Can Google Bard's Upgrade Outsmart GPT-4?

On Thu, Apr 6, 2023 at 3:49 PM  wrote:


https://www.wired.com/story/ectogenesis-reproductive-health-abortion/
Like the Singularity arriving early, I suspect this will become a thing.



You know Spud, to me your heart doesn't seem to be in this anti-abortion
stance. I suspect you have it because Trump's zombies think it's important and
you feel that if you want to retain your credentials as a certified right-wing
loon you need to take that position too. But I could be wrong; if so you might
want to ask GPT-4 a few questions, such as: who's gonna pay the enormous cost
of operating millions of synthetic wombs when, thanks to right-wing
Republicans, the US, unlike every other democracy in the world, doesn't have
universal healthcare? And who is going to pay to feed, shelter and educate
millions of unwanted children for 18 years? And what are you going to do in 18
years, and every year after that, when millions of unloved and undereducated
illegal aliens (as their biological mothers would have every right to call
them) try to enter the job market? Maybe GPT-4 is smart enough to answer these
questions, but I am not.
 John K Clark    See what's on my new list at  Extropolis



Re: It's too late to stop GPT4 now

2023-04-08 Thread spudboy100 via Everything List
Follow-up from Fox, of all sources! "Moral judgements":
https://www.foxnews.com/tech/ai-chatbot-chatgpt-influence-human-moral-judgments



-Original Message-
From: John Clark 
To: everything-list@googlegroups.com
Sent: Sat, Apr 8, 2023 3:11 pm
Subject: Re: It's too late to stop GPT4 now

On Sat, Apr 8, 2023 at 8:19 AM Russell Standish  wrote:


> Don't forget it requires a society of hundreds of millions of human-level
> intelligences to make a GPT-4. And it takes a human-level intelligence
> some 20 years to be able to make meaningful contributions to something like
> GPT-4. Progress will therefore continue to be exponential for some time to
> come. Only when superhuman intelligence is able to design itself will
> hyperbolic progress begin.


Although certainly extremely helpful, most areas of science require more than
just a brilliant theoretician: they need experimental evidence, and so new
knowledge in those fields will not grow at the same explosive rate as computer
intelligence does. However, there are two fields that do not require
experimental evidence and so should grow as rapidly as intelligence does:
mathematics and software development, including smart software that can write
even smarter software. And there are mountains of data on physics and biology
that already exist, and there are almost certainly unknown gems hiding in
there that nobody has spotted; with new mathematical techniques and better
software they could be found.

> It will also need to better the energy efficiency of human brains, and it is 
> still orders of magnitude away from that.

Take a look at this video; it talks about Nvidia's new chip. With a data
center using it, an AI system that had required 35 MW to run will need only
5 MW to do the same thing.
Nvidia's HUGE AI Breakthrough is Bigger Than ChatGPT

By the way, I think mathematicians and software developers will be the first
to lose their jobs; perhaps they could be retrained as coal miners.

John K Clark    See what's on my new list at  Extropolis


Video- A physicist who gives a head-nod to FTL

2023-04-08 Thread spudboy100 via Everything List


https://youtu.be/9-jIplX6Wjw

Warp factor 3!
Interociters on maximum broadcast!



Re: It's too late to stop GPT4 now

2023-04-08 Thread John Clark
On Sat, Apr 8, 2023 at 8:19 AM Russell Standish 
wrote:


> *Don't forget it requires a society of hundreds of millions of human-level
> intelligences to make a GPT-4. And it takes a human-level intelligence
> some 20 years to be able to make meaningful contributions to something like
> GPT-4. Progress will therefore continue to be exponential for some time
> to come. Only when superhuman intelligence is able to design itself will
> hyperbolic progress begin.*
>

Although certainly extremely helpful, most areas of science require more
than just a brilliant theoretician: they need experimental evidence, and so
new knowledge in those fields will not grow at the same explosive rate as
computer intelligence does. However, there are two fields that do not
require experimental evidence and so should grow as rapidly as intelligence
does: mathematics and software development, including smart software that
can write even smarter software. And there are mountains of data on physics
and biology that already exist, and there are almost certainly unknown gems
hiding in there that nobody has spotted; with new mathematical techniques
and better software they could be found.

*> It will also need to better the energy efficiency of human brains, and
> it is still orders of magnitude away from that.*


Take a look at this video; it talks about Nvidia's new chip. With a data
center using it, an AI system that had required 35 MW to run will need only
5 MW to do the same thing.

Nvidia's HUGE AI Breakthrough is Bigger Than ChatGPT


By the way, I think mathematicians and software developers will be the
first to lose their jobs; perhaps they could be retrained as coal miners.

John K ClarkSee what's on my new list at  Extropolis



Re: It's too late to stop GPT4 now

2023-04-08 Thread Russell Standish
Don't forget it requires a society of hundreds of millions of human-level
intelligences to make a GPT-4.

And it takes a human-level intelligence some 20 years to be able to make
meaningful contributions to something like GPT-4.

Progress will therefore continue to be exponential for some time to
come. Only when superhuman intelligence is able to design itself will
hyperbolic progress begin. It will also need to better the energy
efficiency of human brains, and it is still orders of magnitude away
from that.
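
(To make the exponential-versus-hyperbolic distinction concrete, here is a toy
illustration of my own, not anything from Kurzweil: dx/dt = k*x grows
exponentially and doubles on a fixed schedule forever, while dx/dt = k*x^2 has
the closed-form solution x(t) = x0 / (1 - k*x0*t), which diverges at the
finite time t* = 1/(k*x0), the mathematical cartoon of a singularity.)

k, x0 = 0.1, 1.0
t_star = 1.0 / (k * x0)        # hyperbolic blow-up time: 10.0
dt, t = 0.01, 0.0
x_exp, x_hyp = x0, x0
while t < 9.9:                 # stop just short of t* to avoid overflow
    x_exp += k * x_exp * dt    # exponential: improvement at a fixed relative rate
    x_hyp += k * x_hyp**2 * dt # hyperbolic: better intelligence improves itself faster
    t += dt
print(f"t = {t:.1f}: exponential x ~ {x_exp:.1f}, hyperbolic x ~ {x_hyp:.1f}")
print(f"hyperbolic blow-up at t* = {t_star:.1f}")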

In saying 25 years to the singularity, I was simply taking Kurzweil's
timeline and adding the 5 years he was out by.



On Sat, Apr 08, 2023 at 07:46:05AM -0400, John Clark wrote:
> 
> On Sat, Apr 8, 2023 at 7:31 AM Stathis Papaioannou  wrote:
> 
> 
> > Why such a long gap between gaining human level intelligence and the
> singularity?
> 
> 
> That is a very good question but I don't have a very good answer so I don't
> think there will be a long gap. Fasten your seatbelts, we're in for a bumpy
> ride. 
> 
> John K Clark    See what's on my new list at  Extropolis
> 

-- 


Dr Russell Standish                      Phone 0425 253119 (mobile)
Principal, High Performance Coders       hpco...@hpcoders.com.au
                                         http://www.hpcoders.com.au




Re: It's too late to stop GPT4 now

2023-04-08 Thread John Clark
On Sat, Apr 8, 2023 at 7:31 AM Stathis Papaioannou 
wrote:

*> Why such a long gap between gaining human level intelligence and the
> singularity?*


That is a very good question, but I don't have a very good answer, so I don't
think there will be a long gap. Fasten your seatbelts, we're in for a bumpy
ride.

John K ClarkSee what's on my new list at  Extropolis




Re: It's too late to stop GPT4 now

2023-04-08 Thread John Clark
On Sat, Apr 8, 2023 at 2:39 AM Russell Standish 
wrote:


> *> I would predict that human level intelligence may be matched in
> 2025 with GPT-5, only 5 years later than Ray Kurzweil's prediction,*


Actually Kurzweil predicted that "*computers will be routinely passing the
Turing test by 2029*", so his prediction was too conservative because,
although it's not "routine" quite yet, I would argue that one computer
program passed the Turing Test one month ago and that by 2025 human-level
AI will be ubiquitous. As for GPT-5, some say it's already operational, but
OpenAI is checking it for safety and will not release it to the world until
late this year or early next. OpenAI has only 375 employees, and that number
is small enough to keep a secret for a few months; after all, we now know
that GPT-4 first became operational last August, although the world didn't
find out about it until March.

John K ClarkSee what's on my new list at  Extropolis



Re: It's too late to stop GPT4 now

2023-04-08 Thread Stathis Papaioannou
On Sat, 8 Apr 2023 at 16:39, Russell Standish  wrote:

> What struck me when watching this video is the uncanny similarity of
> this mechanism to Steven Pinker's proposed "mind's big bang",
> which took place in human minds about 40,000 years ago.
>
> It all came down to using language for the disparate modules of the
> human brain to talk to each other, likened to individual chapels
> uniting to form a cathedral.
>
> I would predict that human level intelligence may be matched in 2025
> with GPT-5, only 5 years later than Ray Kurzweil's prediction, which
> might mean the singularity is on course for some time in the 2050s...


Why such a long gap between gaining human level intelligence and the
singularity?

--
Stathis Papaioannou
