The no free lunch theorem says that all predictors perform equally well
when averaged over all possible problems, because it assumes all bit
sequences are equally likely: a uniform distribution over the unknown
data. Given that assumption, NFL is provable. It would equally imply that
data compression is impossible, since uniformly random data cannot be
compressed on average.
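
To see why compression fails under the uniform assumption, here is a
minimal counting-argument sketch in Python (my own illustration, not part
of any NFL proof): there are 2^n strings of n bits but only 2^n - 1
strings shorter than n bits, so no lossless compressor can shorten them
all.

    n = 8
    inputs = 2 ** n                          # number of n-bit strings
    shorter = sum(2 ** k for k in range(n))  # strings of length 0..n-1
    print(inputs, shorter)                   # 256 vs 255: pigeonhole
    assert inputs > shorter                  # some input cannot shrink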

This obviously conflicts with Occam's Razor, which is the basis of all
science and machine learning. See
https://en.m.wikipedia.org/wiki/No_free_lunch_theorem for discussion.

Why does Occam's Razor exist? Because you can't have a uniform probability
distribution over an infinite set. Every proper distribution over all
strings must, on the whole, favor short strings over long ones, just as
every distribution over the integers must favor small integers over large
ones. Here is why: for any string with positive probability, the strings
that are longer and less likely form an infinite set, while the other 3
combinations (longer and more likely, shorter and more likely, shorter and
less likely) are each finite. There are only finitely many shorter strings
to begin with, and since probabilities sum to 1, at most finitely many
strings anywhere can be more likely.
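
A small sketch of that counting argument, using a toy prior
p(x) = 2^(-2|x|-1) over binary strings (the prior is my own choice for
illustration; it sums to 1 over all lengths, since the 2^n strings of
length n share total mass 2^(-n-1)):

    from itertools import product

    def p(x):
        # Toy prior over binary strings; sums to 1 across all lengths.
        return 2.0 ** (-2 * len(x) - 1)

    s = "101"
    # Probability falls with length, so only shorter strings can beat
    # p(s); in general at most 1/p(s) strings can be more likely.
    more_likely = ["".join(b) for n in range(len(s))
                   for b in product("01", repeat=n)
                   if p("".join(b)) > p(s)]
    print(len(more_likely))  # 7: lengths 0..2, and 7 < 1/p(s) = 128

The same bound holds for any distribution: only finitely many strings can
carry probability above p(s), so almost all longer strings must be less
likely.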

On Mon, Feb 3, 2020, 12:56 PM <immortal.discover...@gmail.com> wrote:

> So all in all, nothing is 'better' or more 'useful', nothing is ugly or
> pretty in reality; but whatever sorts better and makes patterns emerge
> is better, and gets chosen as the physics of this universe. So yes, more
> patterned systems survive much longer lives. Which is what we mean, when
> comparing 2 machine systems, by calling one Better or Good. Both are
> just machines, but the patterned one overtakes.
