Re: It's too late to stop GPT4 now

2023-04-08 Thread Russell Standish
On Sat, Apr 08, 2023 at 03:11:47PM -0400, John Clark wrote:
> On Sat, Apr 8, 2023 at 8:19 AM Russell Standish  wrote:
> 
> 
> > Don't forget it requires a society of hundreds of millions of human
> > level intelligences to make a GPT-4. And it takes a human level
> > intelligence some 20 years to make meaningful contributions to
> > something like GPT-4. Progress will therefore continue to be
> > exponential for some time to come. Only when superhuman intelligence
> > is able to design itself will hyperbolic progress begin.
> 
> 
> Although certainly extremely helpful, most areas of science require more
> than just a brilliant theoretician: they need experimental evidence, so
> new knowledge in those fields will not grow at the same explosive rate as
> computer intelligence does. However, there are two fields that do not
> require experimental evidence and so should grow as rapidly as
> intelligence does: mathematics and software development, including smart
> software that can write even smarter software. And there are mountains of
> existing data on physics and biology, and there are almost certainly
> unknown gems hiding in there that nobody has spotted, but with new
> mathematical techniques and better software they could be found.
>

Sure - I was trying to proffer some suggestions as to why Ray Kurzweil
suggested 25 years between attaining human level computational ability
and the singularity. I haven't read his book, just summaries - maybe
someone who has could enlighten us.

BTW - I still think we haven't cracked the problem of open-ended
creativity, which is essential for something like the singularity to
occur, but recent developments have led me to believe it might be
achieved sooner rather than later. Ten years ago, I'd have said the
singularity wouldn't appear before 2070 (I probably did say so, though
not publicly). Now, I've brought that forward to the 2050s.


> 
> > It will also need to better the energy efficiency of human brains,
> > and it is still orders of magnitude away from that.
> 
> 
> Take a look at this video; it talks about Nvidia's new chip. In a data
> center using it, an AI system that had required 35 MW to run will need
> only 5 MW to do the same thing.
> 
> Nvidia's HUGE AI Breakthrough is Bigger Than ChatGPT

That is a sevenfold improvement, not quite one order of magnitude. My
understanding is that about 4-5 orders of magnitude are required
before machines can really take over the world. It will happen, but
on present exponential progress (classic Moore's law) that will take 2-3
decades.

Current AI systems like GPT-4 require the resources of a small town of
several thousand people for training.

GPT-4 has about 800 billion parameters IIUC. A human brain has over a
trillion synapses, so it's certainly getting close.
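The back-of-envelope arithmetic behind these estimates can be checked in a few lines. This is only a sketch of my own: the ~2-year doubling period and the ~20 W figure for the brain's power draw are assumptions, not numbers anyone gave in this thread.

```python
import math

def years_to_gain(orders_of_magnitude, doubling_period_years=2.0):
    """Years of steady Moore's-law-style doubling needed to improve by 10**n."""
    return math.log2(10 ** orders_of_magnitude) * doubling_period_years

# The 35 MW -> 5 MW improvement is sevenfold, under one order of magnitude:
print(round(math.log10(35 / 5), 2))    # 0.85

# Closing 4-5 orders of magnitude at one doubling every ~2 years:
print(round(years_to_gain(4), 1))      # 26.6 years
print(round(years_to_gain(5), 1))      # 33.2 years

# Gap between a 5 MW data center and a ~20 W brain (assumed figure):
print(round(math.log10(5e6 / 20), 1))  # 5.4 orders of magnitude
```

On those assumptions the numbers hang together: 4-5 orders of magnitude at a classic Moore's-law pace does come out to roughly 2-3 decades, and the data-center-to-brain energy gap lands in the same 4-5 order range.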


> 
> By the way, I think mathematicians and software developers will be the first 
> to
> lose their jobs, perhaps they could be retrained as coal miners.   
>

I don't think they'll be the first :). ATM, GPT systems seem to have
an enormous propensity to make shit up, but less skill in making shit
up that is correct. ISTM the creative arts might be the first to lose
their jobs.


> John K Clark    See what's on my new list at  Extropolis

-- 


Dr Russell Standish    Phone 0425 253119 (mobile)
Principal, High Performance Coders hpco...@hpcoders.com.au
  http://www.hpcoders.com.au


-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/everything-list/20230409000824.GB1379%40zen.


Re: It's too late to stop GPT4 now

2023-04-08 Thread spudboy100 via Everything List
Follow-up from Fox, of all sources! "Moral judgements":
https://www.foxnews.com/tech/ai-chatbot-chatgpt-influence-human-moral-judgments






Re: It's too late to stop GPT4 now

2023-04-08 Thread John Clark
On Sat, Apr 8, 2023 at 8:19 AM Russell Standish 
wrote:


> Don't forget it requires a society of hundreds of millions of human
> level intelligences to make a GPT-4. And it takes a human level
> intelligence some 20 years to make meaningful contributions to
> something like GPT-4. Progress will therefore continue to be
> exponential for some time to come. Only when superhuman intelligence
> is able to design itself will hyperbolic progress begin.

Although certainly extremely helpful, most areas of science require more
than just a brilliant theoretician: they need experimental evidence, so
new knowledge in those fields will not grow at the same explosive rate
as computer intelligence does. However, there are two fields that do not
require experimental evidence and so should grow as rapidly as
intelligence does: mathematics and software development, including smart
software that can write even smarter software. And there are mountains
of existing data on physics and biology, and there are almost certainly
unknown gems hiding in there that nobody has spotted, but with new
mathematical techniques and better software they could be found.

> It will also need to better the energy efficiency of human brains, and
> it is still orders of magnitude away from that.


Take a look at this video; it talks about Nvidia's new chip. In a data
center using it, an AI system that had required 35 MW to run will need
only 5 MW to do the same thing.

Nvidia's HUGE AI Breakthrough is Bigger Than ChatGPT


By the way, I think mathematicians and software developers will be the
first to lose their jobs, perhaps they could be retrained as coal miners.

John K Clark    See what's on my new list at Extropolis




Re: It's too late to stop GPT4 now

2023-04-08 Thread Russell Standish
Don't forget it requires a society of hundreds of millions of human
level intelligences to make a GPT-4.

And it takes a human level intelligence some 20 years in order to make
meaningful contributions to something like GPT-4.

Progress will therefore continue to be exponential for some time to
come. Only when superhuman intelligence is able to design itself will
hyperbolic progress begin. It will also need to better the energy
efficiency of human brains, and it is still orders of magnitude away
from that.

In saying 25 years to singularity, I was simply taking Kurzweil's
timeline, and adding the 5 years he was out by.



On Sat, Apr 08, 2023 at 07:46:05AM -0400, John Clark wrote:
> 
> On Sat, Apr 8, 2023 at 7:31 AM Stathis Papaioannou  wrote:
> 
> 
> > Why such a long gap between gaining human level intelligence and the
> singularity?
> 
> 
> That is a very good question but I don't have a very good answer so I don't
> think there will be a long gap. Fasten your seatbelts, we're in for a bumpy
> ride. 
> 
> John K Clark    See what's on my new list at  Extropolis
> 

-- 


Dr Russell Standish    Phone 0425 253119 (mobile)
Principal, High Performance Coders hpco...@hpcoders.com.au
  http://www.hpcoders.com.au




Re: It's too late to stop GPT4 now

2023-04-08 Thread John Clark
On Sat, Apr 8, 2023 at 7:31 AM Stathis Papaioannou 
wrote:

> Why such a long gap between gaining human level intelligence and the
> singularity?


That is a very good question but I don't have a very good answer so I don't
think there will be a long gap. Fasten your seatbelts, we're in for a bumpy
ride.

John K Clark    See what's on my new list at Extropolis





Re: It's too late to stop GPT4 now

2023-04-08 Thread John Clark
On Sat, Apr 8, 2023 at 2:39 AM Russell Standish 
wrote:


> I would predict that human level intelligence may be matched in
> 2025 with GPT-5, only 5 years later than Ray Kurzweil's prediction,


Actually Kurzweil predicted that "computers will be routinely passing
the Turing test by 2029", so his prediction was too conservative because,
although it's not "routine" quite yet, I would argue that one computer
program passed the Turing Test one month ago and that by 2025 human level
AI will be ubiquitous. As for GPT-5, some say it's already operational but
OpenAI is checking it for safety and they will not be releasing it to the
world until late this year or early next. OpenAI only has 375 employees and
that number is small enough to keep a secret for a few months, after all we
now know that GPT-4 first became operational last August, although the
world didn't find out about it until March.

John K Clark    See what's on my new list at Extropolis




Re: It's too late to stop GPT4 now

2023-04-08 Thread Stathis Papaioannou
On Sat, 8 Apr 2023 at 16:39, Russell Standish  wrote:

> What struck me when watching this video is the uncanny similarity of
> this mechanism to Steven Pinker's proposed "mind's big bang",
> which took place in human minds about 40,000 years ago.
>
> It all came down to using language for the disparate modules of the
> human brain to talk to each other, likened to individual chapels
> uniting to form a cathedral.
>
> I would predict that human level intelligence may be matched in 2025
> with GPT-5, only 5 years later than Ray Kurzweil's prediction, which
> might mean the singularity is on course for some time in the 2050s...


Why such a long gap between gaining human level intelligence and the
singularity?

-- 
Stathis Papaioannou



Re: It's too late to stop GPT4 now

2023-04-07 Thread Russell Standish
What struck me when watching this video is the uncanny similarity of
this mechanism to Steven Pinker's proposed "mind's big bang",
which took place in human minds about 40,000 years ago.

It all came down to using language for the disparate modules of the
human brain to talk to each other, likened to individual chapels
uniting to form a cathedral.

I would predict that human level intelligence may be matched in 2025
with GPT-5, only 5 years later than Ray Kurzweil's prediction, which
might mean the singularity is on course for some time in the 2050s...

Cheers

On Sun, Apr 02, 2023 at 03:35:03PM -0400, John Clark wrote:
> This video is a summary of several technical papers that have come out in the
> last 72 hours, apparently GPT4 can now improve itself without human help by
> self-reflecting on its errors and can even design better hardware for itself. 
> 
> GPT 4 Can Improve Itself by self reflection 
> 
> John K Clark    See what's on my new list at  Extropolis

-- 


Dr Russell Standish    Phone 0425 253119 (mobile)
Principal, High Performance Coders hpco...@hpcoders.com.au
  http://www.hpcoders.com.au




Re: It's too late to stop GPT4 now

2023-04-02 Thread spudboy100 via Everything List
Maybe someday we humans can do the same? 


-Original Message-
From: John Clark 
To: 'Brent Meeker' via Everything List 
Sent: Sun, Apr 2, 2023 3:35 pm
Subject: It's too late to stop GPT4 now

This video is a summary of several technical papers that have come out in the 
last 72 hours, apparently GPT4 can now improve itself without human help by 
self-reflecting on its errors and can even design better hardware for itself. 
GPT 4 Can Improve Itself by self reflection 
John K Clark    See what's on my new list at  Extropolis



Re: It's too late to stop GPT4 now

2023-04-02 Thread Jason Resch
"Let an ultraintelligent machine be defined as a machine that can far
surpass all the intellectual activities of any man however clever. Since
the design of machines is one of these intellectual activities, an
ultraintelligent machine could design even better machines; there would
then unquestionably be an 'intelligence explosion,' and the intelligence of
man would be left far behind... Thus the first ultraintelligent machine is
the last invention that man need ever make, provided that the machine is
docile enough to tell us how to keep it under control. It is curious that
this point is made so seldom outside of science fiction. It is sometimes
worthwhile to take science fiction seriously."
-- I.J. Good

On Sun, Apr 2, 2023, 3:35 PM John Clark  wrote:

> This video is a summary of several technical papers that have come out in
> the last 72 hours, apparently GPT4 can now improve itself without human
> help by self-reflecting on its errors and can even design better hardware
> for itself.
>
> GPT 4 Can Improve Itself by self reflection
> 
>
> John K Clark    See what's on my new list at Extropolis
> 
