[agi] Re: I tried Google Ads

2020-01-14 Thread John Rose
Yeah, where's the code at, bruh? Gotta do 3D: https://arxiv.org/abs/1902.04615
--
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/Taaacb2c58a94e597-Mea808c56a65c268d88eb04c3
Delivery options: https://agi.topicbox.com/groups/agi/subscription

[agi] Re: Real AGI Brain

2020-01-14 Thread immortal . discoveries
I have achieved compression of the enwik8 100 MB file into 50 MB so far. Woohoo. The record is about 15 MB. It gets hard near the end, though.
Permalink: https://agi.topicbox.com/groups/agi/T409fc28ec41e6e3a-M11a37ef14e5dc7ae26
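For a sense of where 50 MB sits, a quick baseline check with a general-purpose compressor can be run on any slice of the corpus. This is a generic sketch using Python's zlib, not any benchmark entrant's actual code:

```python
import zlib

def compression_ratio(data: bytes, level: int = 9) -> float:
    """Return compressed size / original size using zlib (DEFLATE)."""
    return len(zlib.compress(data, level)) / len(data)

# Toy sample; real benchmark runs use the full 100 MB enwik8 file
# from the Large Text Compression Benchmark.
sample = b"the quick brown fox jumps over the lazy dog " * 1000
print(f"ratio: {compression_ratio(sample):.3f}")
```

Highly repetitive text compresses far better than real Wikipedia text does, so ratios from toy samples are only a sanity check, not a benchmark number.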

[agi] Re: Real AGI Brain

2020-01-14 Thread immortal . discoveries
Got big plans for the next stage.
Permalink: https://agi.topicbox.com/groups/agi/T409fc28ec41e6e3a-M7c3c69ab8c5b7574e1bb6b0a

[agi] Re: Real AGI Brain

2020-01-14 Thread immortal . discoveries
Currently it's a letter predictor/generator.
Permalink: https://agi.topicbox.com/groups/agi/T409fc28ec41e6e3a-Ma2c4fc21d81ccf7face24156
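A letter predictor/generator of this kind can be sketched as a character n-gram model that counts which letter follows each short context and samples from those counts. This is a generic illustration, not the poster's actual implementation:

```python
import random
from collections import Counter, defaultdict

def train(text: str, order: int = 2) -> dict:
    """Count which letter follows each length-`order` context."""
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        model[text[i:i + order]][text[i + order]] += 1
    return model

def generate(model: dict, seed: str, length: int, rng: random.Random) -> str:
    """Extend `seed` letter by letter, sampling in proportion to counts."""
    out = seed
    order = len(seed)
    for _ in range(length):
        counts = model.get(out[-order:])
        if not counts:  # unseen context: stop generating
            break
        letters, weights = zip(*counts.items())
        out += rng.choices(letters, weights=weights)[0]
    return out

model = train("the cat sat on the mat and the cat ran", order=2)
print(generate(model, "th", 20, random.Random(0)))
```

The same counts that drive generation also give next-letter probabilities, which is what a compressor would feed into an arithmetic coder.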

Re: [agi] Parallel (Space Time) Kolmogorov Complexity?

2020-01-14 Thread James Bowery
Here's a simple modification to the Hutter Prize and the Large Text Compression Benchmark that illustrates my point: split the Wikipedia corpus into separate files, one per Wikipedia article. An entry qualifies only if the set of check
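Assuming the corpus is the usual enwik-style XML dump, where each article sits in a `<page>` element, the per-article split could be sketched like this (a toy regex version; a real split of a 100 MB file would stream rather than load everything into memory):

```python
import re

def split_articles(corpus: str) -> list[str]:
    """Return one string per <page>...</page> element of the dump.
    Hypothetical sketch: assumes non-nested <page> tags, as in the
    MediaWiki export format used by enwik8/enwik9."""
    return re.findall(r"<page>.*?</page>", corpus, flags=re.DOTALL)

dump = "<mediawiki><page>A</page><page>B</page></mediawiki>"
print(split_articles(dump))
```

Each returned article would then be written to its own file, turning the single time-ordered stream into a set of independently compressible units.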

Re: [agi] Re: Real AGI Brain

2020-01-14 Thread Matt Mahoney
On Tue, Jan 14, 2020, 7:02 AM wrote:
> I have achieved compression of the enwik8 100 MB file into 50 MB so far. Woohoo. The record is about 15 MB. It gets hard near the end, though.
That's how I got started in data compression. It took about 4 years for my code to start beating existing compressors and sta

Re: [agi] Parallel (Space Time) Kolmogorov Complexity?

2020-01-14 Thread John Rose
Isn't that technically near-lossless? Leaving some tiny wiggle room?
Permalink: https://agi.topicbox.com/groups/agi/Tc33b8ed7189d2a18-M015a0fcc662cec4713a2c2e4

Re: [agi] Parallel (Space Time) Kolmogorov Complexity?

2020-01-14 Thread James Bowery
It is a respecification of the information to expose the spatial dimension of the corpus, while leaving the time dimension intact.
On Tue, Jan 14, 2020 at 1:51 PM John Rose wrote:
> Isn't that technically near-lossless? Leaving some tiny wiggle room?

[agi] Philosophy Phriday: Fodor's Language of Thought

2020-01-14 Thread keghnfeem
Philosophy Phriday: Fodor's Language of Thought:
https://www.youtube.com/watch?v=RmmZolJ8L6s
And now the poachers.
Concepts and Compositionality: In Search of the Brain’s Language of Thought:
http://static1.squarespace.com/static/54763f79e4b0c4e55ffb000c/t/5e16c95d81ab92511eb96c99/157855164