On Monday, January 27, 2020, at 5:02 PM, James Bowery wrote:
> Unfortunately, measures inferior to self-extracting archive size, such as 
> "perplexity" or *worse* are now dominating SOTA publications.
I was thinking the same thing just four days ago. Perhaps I had read it somewhere before.
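
To make the distinction concrete, here is a rough back-of-the-envelope sketch (all figures are hypothetical, not measured results) of how a perplexity number converts to bits, and why it leaves out the decompressor size that a self-extracting archive has to account for:

import math

# Hypothetical figures for illustration only (not measured results).
test_chars = 10**9                # size of the test text in characters
bits_per_char = 1.2               # suppose a model achieves 1.2 bpc cross-entropy
perplexity = 2 ** bits_per_char   # per-character perplexity implied by that rate
decompressor_bytes = 200 * 10**6  # suppose the model plus code weigh 200 MB

# Perplexity only reports the payload cost of the predictions.
payload_bytes = bits_per_char * test_chars / 8

# A self-extracting archive must also include the program that
# reproduces the text, so the decompressor size counts too.
archive_bytes = payload_bytes + decompressor_bytes

print(f"perplexity per char       : {perplexity:.3f}")
print(f"payload only (perplexity) : {payload_bytes/1e6:.0f} MB")
print(f"self-extracting archive   : {archive_bytes/1e6:.0f} MB")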

On Monday, January 27, 2020, at 4:03 PM, Matt Mahoney wrote:
> The first main result of my 12 years of testing 1000+ versions of 200 
> compressors is that compression (as a measure of prediction accuracy or 
> intelligence) increases with the log of computing time and the log of memory 
> (and probably the log of code complexity, which I didn't measure). The best 
> way to establish this relationship is to test over as wide a range as 
> possible by removing time and hardware restrictions. The top ranked program 
> (cmix) requires 32 GB of RAM and takes a week, which is about a million times 
> more time and memory than the fastest programs. But it is still a billion 
> times faster and uses 100,000 times less memory than a human brain sized 
> neural network.
> 
Yes, intelligence/evolution grows faster (and hence more powerful) the more data, 
compute, and actuators (e.g. nanobots) it has. It makes better predictions, feeds 
its own outputs back in, and can recursively reach further into the future than an 
approach built on poor-but-fast answers. So if you max out your machine's RAM and 
the longest run time you can tolerate with the simplest idea, you get better 
predictions and better compression. Of course, overly precise predictions can be 
too slow or need too much RAM to be usable at all, and sometimes a quick make-do 
answer to a small problem is all you need. So sometimes you could go deeper but 
shouldn't, sometimes you already know the answer, and sometimes you need real 
thinking, occasionally years of it. But it is still better to stay within your 
resource bounds even when you don't yet have the answer. First the AGI identifies 
the most cost-effective question, then recursively takes cost-effective small 
steps, evolving better answers (and questions) and related knowledge in that 
domain. A minimal sketch of the log relationship Matt describes follows below.
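
To put that log relationship in concrete terms, here is a minimal fitting sketch. The benchmark rows are made up for illustration, not actual LTCB numbers; it just models compressed size as a linear function of log(time) and log(memory):

import numpy as np

# Hypothetical benchmark rows (compressor, seconds, MB of memory, compressed MB).
# These numbers are invented for illustration; they are not LTCB results.
data = [
    ("fast", 60,     16,    250),
    ("mid",  3600,   2048,  180),
    ("slow", 86400,  16384, 140),
    ("cmix", 604800, 32768, 115),
]

t = np.array([row[1] for row in data], dtype=float)  # time in seconds
m = np.array([row[2] for row in data], dtype=float)  # memory in MB
s = np.array([row[3] for row in data], dtype=float)  # compressed size in MB

# Fit s ~ a + b*ln(t) + c*ln(m): compressed size shrinking linearly in the
# logs of time and memory is the claimed relationship.
X = np.column_stack([np.ones_like(t), np.log(t), np.log(m)])
coef, *_ = np.linalg.lstsq(X, s, rcond=None)
a, b, c = coef
print(f"size ~ {a:.1f} + {b:.2f}*ln(time) + {c:.2f}*ln(memory)  (MB)")

With real benchmark rows in place of the fake ones, the fitted b and c would quantify how many megabytes each factor-of-e increase in time or memory buys.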