> Deep learning is locked in the paradigm of learning as much as possible. Like 
> studying for a test by learning all the answers, rather than understanding 
> principles which allow you to work out your own answers. That it only learns 
> patterns, and does not have a principle by which new patterns can be created, 
> is why we are trapped needing ever bigger data sets, eternally getting 
> better, but never becoming good enough. More and more agreement, more and 
> more dependency, but never all agreement or all dependency. However much data 
> you learn, it will never be enough, because you can never learn the novelty 
> which comes from a principle by which new patterns (let alone meaning) can be 
> created.
>
> That's what this work is lacking.
>
> -Rob
>

We agree there for sure...

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T581199cf280badd7-M8987efb0c0ba178274ea623c