If I can spawn a finite but unbounded number of parallel processes in "space", I can compute AIXItl, for example. So say the generating "space" is projected down into 3D space + time -- the generating space is then approximated by time, correct? In other words, once you admit "space" as a computation dimension, don't you beg the question?
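
For concreteness, here is a minimal sketch (Python, with made-up numbers, and not anyone's actual proposal) of the difference between a space-only criterion -- program length alone, as in the smallest-executable-archive idea -- and a space-time criterion in the spirit of Levin's Kt, which adds the log of the runtime:

    import math

    # Hypothetical candidate generators of the same data:
    # name -> (program length in bits, runtime in steps).  Numbers are
    # illustrative only.
    candidates = {
        "tiny-but-slow":   (100, 2**100),
        "bigger-but-fast": (180, 2**10),
    }

    def space_only(bits, steps):
        # "Smallest executable archive" style criterion: length alone.
        return bits

    def levin_kt(bits, steps):
        # Space-time criterion: length plus log2(runtime), which
        # penalizes generators that take astronomically long to run.
        return bits + math.log2(steps)

    for name, (bits, steps) in candidates.items():
        print(name, space_only(bits, steps), levin_kt(bits, steps))

Under the space-only score the slow program wins (100 vs 180 bits); under Kt it loses (200 vs 190), which is the crux of the objection below as I read it.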
On Thu, Nov 21, 2019 at 6:06 PM TimTyler <t...@tt1.org> wrote:

> On 2019-11-21 11:46 AM, James Bowery wrote:
> > The point of my conjecture is that there is a very good reason to
> > select "the smallest executable archive of the data" as your
> > information criterion over the other information criteria -- and it
> > has to do with the weakness of "lossy compression" as model selection.
>
> That, along with a number of other entries in the list, is a "space-only"
> criterion. It seems reasonable that runtime duration, as well as program
> complexity, is a factor for most real-world data. As well as being
> generated by a small system, observed data was probably generated in a
> limited time. Space-time metrics are clearly needed. I think we can
> reject any alleged superiority of any space-only metric.
>
> --
> __________
> |im |yler http://timtyler.org/