Matt,

Yours is a remarkable posting, with SO much crammed into just two
paragraphs. Several disciplines would benefit if you were to write this up
in more bite-sized pieces spread over several pages, with explanations
mixed in. Or perhaps I have simply missed a VERY important article?

Steve

On Fri, Jun 15, 2018 at 10:29 AM, Matt Mahoney via AGI <agi@agi.topicbox.com>
wrote:

> On Thu, Jun 14, 2018 at 10:40 PM Steve Richfield via AGI
> <agi@agi.topicbox.com> wrote:
> >
> > In the space of real world "problems", I suspect the distribution of
> difficulty follows the Zipf function, like pretty much everything else does.
> 
> A Zipf distribution is a power law distribution. The reason that power
> law distributions are so common over different domains (for example,
> wealth distribution or population of cities) is the same reason that
> Gaussian normal distributions are common. When you add a large set of
> small random variables, the result is Gaussian by the central limit
> theorem. When you multiply instead of add, you get a lognormal
> distribution, the exponential of a Gaussian, which approximates a power
> law over a wide range of values. It happens whenever small random
> variations are in proportion to the magnitude of the variable.
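The additive-versus-multiplicative contrast above can be checked with a small
simulation. This is a sketch I am adding for illustration; the sample sizes
and the uniform factors are my own choices, not anything from the thread:

```python
import math
import random
import statistics

random.seed(0)
N, K = 100_000, 50  # N samples, each built from K small random terms

# Adding K small random variables: Gaussian by the central limit theorem.
sums = [sum(random.uniform(0.9, 1.1) for _ in range(K)) for _ in range(N)]

# Multiplying the same kind of variables: lognormal, because the log of
# each product is itself a sum of K small terms, hence Gaussian.
prods = [math.prod(random.uniform(0.9, 1.1) for _ in range(K)) for _ in range(N)]
logs = [math.log(p) for p in prods]

print(statistics.mean(sums))    # close to K * E[U] = 50, symmetric
print(statistics.mean(logs))    # close to K * E[ln U], Gaussian again
print(statistics.mean(prods))   # right-skewed: mean exceeds the median
print(statistics.median(prods))
```

The skew of the products (mean above median) is the signature of the
lognormal; taking logs recovers the symmetric Gaussian shape.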
> 
> So yes, the distribution of problem difficulty over broad domains is
> Zipf or power law. Under a power law, each doubling of computing power
> makes roughly the same additional fraction of problems tractable, which
> is why intelligence (as measured by problem-solving ability) is
> proportional to the log of computing power. The value of intelligent
> systems grows linearly while their computing power grows exponentially
> by Moore's law.
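The logarithmic relationship can be illustrated numerically: draw problem
difficulties from a truncated power law with exponent 1 (a log-uniform
sample) and count how many fall within a given compute budget. This is my
own sketch, not code from the thread, and the range [1, 2**20] is an
arbitrary assumption:

```python
import random

random.seed(1)
N = 200_000

# Difficulties with density proportional to 1/d on [1, 2**20]:
# sampling 2**U with U uniform gives exactly this truncated power law.
difficulties = [2 ** random.uniform(0, 20) for _ in range(N)]

# Fraction of problems solvable within compute budget C.
def solvable(C):
    return sum(d <= C for d in difficulties) / N

# Each 32x (2**5) increase in compute adds a roughly constant extra
# fraction of solvable problems: ability grows with log(C).
for exp in (5, 10, 15):
    print(2 ** exp, round(solvable(2 ** exp), 3))   # ~0.25, ~0.5, ~0.75
```

Equal multiplicative steps in compute buy equal additive steps in the
solvable fraction, which is the log relationship in the paragraph above.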
> 
> --
> -- Matt Mahoney, mattmahone...@gmail.com

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T5ada390c367596a4-Mc9b631b2083c956f6cd6d8bf
Delivery options: https://agi.topicbox.com/groups
