My paper on the cost of AI was published in 2017 as a book chapter in
"Philosophy of Mind: Contemporary Perspectives". My main contribution is
using data compression to estimate the information content of our DNA in
equivalent lines of code: about 300 million lines. That represents roughly
half of a person's knowledge; the other half is learned and stored in long
term memory.
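As a sanity check, the arithmetic behind that estimate can be sketched as
follows. This is a back-of-envelope sketch: the 2 bits per base pair and the
compression ratio are assumed round numbers, not measurements.

```python
# Rough arithmetic behind the DNA estimate (assumed figures, not measurements).
base_pairs = 3.2e9          # approximate size of the human genome
raw_bits = 2 * base_pairs   # 4 possible bases = 2 bits each, ~6.4e9 bits
compressed_bits = 1e9       # order-of-magnitude estimate after compression
                            # (repeats and redundancy remove most raw bits)
lines_of_code = 3e8         # the 300M-line equivalent claimed above
bits_per_line = compressed_bits / lines_of_code
print(f"raw: {raw_bits:.1e} bits, compressed: {compressed_bits:.0e} bits")
print(f"~{bits_per_line:.1f} bits per equivalent line of code")
```

The point is only that the compressed genome and the 300M-line figure are
consistent at the order-of-magnitude level.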

Each half is on the order of 10^9 bits, which does not seem like much. That
is probably why Turing predicted in 1950 that a computer with 10^9 bits of
memory, running no faster than the technology of his day (vacuum tubes and
relays), would win the imitation game (pass the Turing test) by 2000. He
presumably assumed a computer could be educated like a child, so 10^9 bits
of training text at 10 Hz over a decade should be sufficient.
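Turing's 10^9-bit figure checks out with simple arithmetic, assuming (my
numbers here, not Turing's) about 10 characters per second and Shannon's
estimate of roughly 1 bit per character of English text:

```python
# Back-of-envelope check on a decade of training text at 10 Hz.
seconds_per_decade = 10 * 365 * 24 * 3600   # ~3.15e8 seconds
rate_hz = 10                                # ~10 characters per second
bits_per_char = 1                           # Shannon's estimate for English
total_bits = seconds_per_decade * rate_hz * bits_per_char
print(f"{total_bits:.1e} bits")             # on the order of 10^9
```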

But automating human labor also requires vision, complex robotics, and
modeling human behavior, including humor, art, music, food, and emotions.
That is AGI. Training vision alone requires a decade of high resolution
video at 137M pixels per eye and 10 Hz, which is over 10^18 bits. We know
from experiments that only recently became possible on supercomputers that
neural networks give the best results across a wide range of AI problems. A
neural network the size of a human brain, with 86 billion neurons and 600
trillion synapses running at 10 Hz, requires 12 petaflops and a petabyte of
RAM.
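The arithmetic behind those figures, with my assumptions labeled (the bits
per pixel, two operations per synapse per cycle, and bytes per synapse are
mine, not from the paper): at one bit per pixel the video lands just under
10^18 bits; at a few bits per pixel it exceeds it.

```python
# Order-of-magnitude checks on the vision and brain-simulation numbers.
seconds_per_decade = 10 * 365 * 24 * 3600        # ~3.15e8 s
pixels = 137e6 * 2                               # both eyes
video_bits = pixels * 10 * seconds_per_decade    # 10 Hz, assume 1 bit/pixel
print(f"video: {video_bits:.1e} bits")           # ~10^18

synapses = 600e12
flops = synapses * 10 * 2                        # 10 Hz, multiply-add each
print(f"compute: {flops:.1e} FLOPS")             # 1.2e16 = 12 petaflops
ram_bytes = synapses * 2                         # assume ~2 bytes per weight
print(f"memory: {ram_bytes:.1e} bytes")          # ~1.2 petabytes
```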

At the current cost of 1 MW of electricity per petaflop, automating 5
billion workers would require 60,000 terawatts of power. Current global
energy production is 15 TW. I think the power requirements can be reduced
through specialization and neuromorphic hardware, but they will still be
well above the 0.7 TW it takes to feed the world's population unless we
find an alternative to computing with transistors.
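The power figure is just the brain-simulation compute scaled up, 12
petaflops per worker at 1 MW per petaflop:

```python
# Power arithmetic for automating 5 billion workers.
petaflops_per_worker = 12
mw_per_petaflop = 1
workers = 5e9
total_watts = petaflops_per_worker * mw_per_petaflop * 1e6 * workers
total_tw = total_watts / 1e12
print(f"{total_tw:.0f} TW")                      # 60000 TW, vs 15 TW produced
print(f"ratio to food energy: {total_tw / 0.7:.0f}x")
```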

300M lines of code will cost $30 billion (at the implied rate of $100 per
line). But really, this is an insignificant fraction of the total cost. We
shouldn't bother with alternatives like evolution or other forms of
reinforcement learning. The problem with evolution is that each life or
death decision transmits only one bit. The biosphere holds 10^37 bits of
DNA. Human evolution took 10^48 DNA copy operations and 10^50 transcription
operations over 10^17 seconds (3 billion years), assuming a cell
replication rate of 10^-6 Hz (once per 11 days). Keep in mind that these
operations use only a billionth as much energy as transistors, or a
ten-thousandth as much as neurons.
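The evolution figures are internally consistent. As a sketch (treating the
biosphere's DNA as a fixed-size pool copied at the stated rate, which is my
simplification):

```python
# Check that the stated evolution numbers multiply out.
biosphere_bits = 1e37        # total DNA in the biosphere
replication_hz = 1e-6        # one cell division per ~11 days (1e6 seconds)
duration_s = 1e17            # ~3 billion years
copy_ops = biosphere_bits * replication_hz * duration_s
print(f"{copy_ops:.0e} DNA copy operations")   # 1e48, as stated
```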

The next decade of human labor will cost $1 quadrillion. Keep in mind what
you want to achieve and what you can achieve. There are millions of job
specializations, and automating just one is hard. But if millions of people
can each do that, then we have AGI.
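The $1 quadrillion figure follows from the $80 trillion per year mentioned
in the quoted message below, assuming modest growth (the growth rate here
is my assumption, not a figure from the paper):

```python
# Decade of labor at $80T/year with assumed ~4% annual growth.
annual_labor = 80e12      # $80 trillion per year paid for human work
growth = 0.04             # assumed annual growth rate
decade_total = sum(annual_labor * (1 + growth) ** y for y in range(10))
print(f"${decade_total / 1e12:.0f} trillion")   # ~$960T, roughly $1 quadrillion
```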

On Fri, Aug 2, 2019, 3:56 PM Secretary of Trades <costi.dumitre...@gmx.com>
wrote:

> Matt do another paper and find a way to refer work that goes without
> publishing papers and books, such as the Senator's.
>
>
> On 02.08.2019 05:53, Matt Mahoney wrote:
> > The obvious application of AGI is automating $80 trillion per year
> > that we have to pay people for work that machines aren't smart enough
> > to do. That means solving hard problems in language, vision, robotics,
> > art, and modeling human behavior. I listed the requirements in more
> > detail in my paper. The solution is going to require decades of global
> > effort. The best that individuals can do is make small steps towards a
> > solution. http://mattmahoney.net/costofai.pdf
> >
>
>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf27122c71ce3b240-Me822563f4e62737fc580cec7
Delivery options: https://agi.topicbox.com/groups/agi/subscription