[agi] Noise and Confirmation Bias

2019-10-06 Thread James Bowery
Lossy compression is confirmation bias.  One man's noise is another man's cyphertext. I finally got this through to the guy whom Hinton, Werbos, McClelland and Rumelhart credit with financing the revival of connectionism in the 1980s. He's totally on board with lossless compression as the model

Re: [agi] The Job market.

2019-10-06 Thread James Bowery
On Sun, Oct 6, 2019 at 11:29 AM James Bowery wrote: > ... > If one takes Randell Mills's "Grand Unified Theory of Classical Physics" > seriously (and I do) he has, even *without* the above ansatz, > calculated an amazing array of physical properties >

Re: [agi] The Job market.

2019-10-06 Thread Ben Goertzel
*** But suppose you enumerated all possible universes and ran the n'th one for n steps. Ours would be n ~ 10^120 steps, or about 400 bits. *** Perhaps so, but on a current standard computer it would take more than 400 bits to code the universe-simulator ;) So the 400 bits figure presumes the comp
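For reference, the arithmetic behind the "about 400 bits" figure is just the base-2 logarithm of the step count (nothing beyond that is implied here):

    \[
      \log_2\!\left(10^{120}\right) \;=\; 120 \cdot \log_2 10 \;\approx\; 120 \times 3.32 \;\approx\; 399 \text{ bits.}
    \]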

Re: [agi] The Job market.

2019-10-06 Thread TimTyler
On 2019-10-06 06:05 AM, Matt Mahoney wrote: > On Sun, Oct 6, 2019, 2:59 AM Ben Goertzel wrote or quoted Matt as writing: > It probably takes a few hundred bits to describe the laws of physics. > > Hmm, that seems very few, just taking a look at the Standard

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread immortal . discoveries
Yes, I read all of it 2 months ago, Matt; it was thrilling to read. @James, I did intend for them both to be combined. Above are 3 visualizations, each with 2 sticks of a certain length. My point was that the size of the data you start with is the same if either stick is the original size...the actual compre

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread Matt Mahoney
I assume you are talking about the Hutter prize for compressing a 100 MB text file. http://prize.hutter1.net/ I suggest reading http://mattmahoney.net/dc/dce.html to understand why your method won't work. Sure, you can find the file embedded in the digits of pi or some other big number, but you ne
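A minimal sketch (mine, not Matt's; the pattern length and the pseudorandom stream standing in for pi are arbitrary choices) of the counting argument his reply points at: the offset at which a random k-bit string first appears in a random digit stream is on the order of 2^k, so encoding that offset costs roughly k bits and nothing is saved.

    import math
    import random

    random.seed(0)
    k = 16  # pattern length in bits
    pattern = ''.join(random.choice('01') for _ in range(k))

    # Stand-in for "the digits of pi": a long pseudorandom bit stream.
    stream = ''
    offset = -1
    while offset < 0:
        stream += ''.join(random.choice('01') for _ in range(1 << 16))
        offset = stream.find(pattern)

    print("pattern bits:", k)
    print("offset found:", offset)
    print("bits to encode the offset:", max(1, math.ceil(math.log2(offset + 1))))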

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread James Bowery
See the rules for Matt's large text compression benchmark. You *always* add the length of the decompression program to the length of the compressed data in order to approximate the Kolmogorov Complexity of the data. Only by bringing the errors of the model into the same units (bits) as the model
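A minimal sketch of the scoring rule James describes (the file names are hypothetical placeholders, not the benchmark's actual artifacts): what is measured is the compressed data plus the self-contained decompression program.

    import os

    compressed_size = os.path.getsize("archive.cmix")      # hypothetical compressed data
    decompressor_size = os.path.getsize("decompressor")    # hypothetical decompression program

    # Approximates the Kolmogorov complexity of the original data:
    total = compressed_size + decompressor_size
    print("benchmark size (bytes):", total)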

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread immortal . discoveries
I was working on the compression prize a month or so ago, for many reasons. I failed every time, but I learned a lot more about trees, bin files, etc. Every time I'd come up with a unique solution that was innovative but was not verified to do the job well enough. One idea was that 9^9^9

Re: [agi] The Job market.

2019-10-06 Thread James Bowery
On Sun, Oct 6, 2019 at 5:07 AM Matt Mahoney wrote: > ... > Yudkowsky and Wolfram seem to think so. I don't know their exact > reasoning, but it probably takes 400 bits to describe the 40 or so free > parameters in string theory > That doesn't make sense unless those parameters each require about
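The back-of-envelope division implicit in James's objection, written out (only the arithmetic; no claim about string theory itself):

    \[
      \frac{400\ \text{bits}}{40\ \text{parameters}} \;=\; 10\ \text{bits per parameter} \;\approx\; 3\ \text{decimal digits of precision each.}
    \]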

Re: [agi] Re: Physical temporal pattern loops

2019-10-06 Thread keghnfeem
From the beginning of time to the end. This is not a temporal pattern loop at all. But the moon goes around the planet many times. In the beginning of Earth there was no life, so no chance of anything comprehending P versus NP. P vs. NP and the Computational Complexity Zoo: https://www.yo

Re: [agi] The Job market.

2019-10-06 Thread James Bowery
I should also mention that directed cyclic NOR networks have been studied as generators of the dimensionless scaling constants by the Alternative Natural Philosophy Association. In other words, there is reason to belie

Re: [agi] The Job market.

2019-10-06 Thread Matt Mahoney
On Sun, Oct 6, 2019, 4:47 AM John Rose wrote: > > Pi doesn't exist, only expressions and approximations of it and you gave > an example. You must expend energy and time to generate a better > approximation, IOW add power. > Do numbers exist? It's a philosophical question. Philosophy is arguing a
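A small illustration of the point John is quoted making (the series is my choice, picked for brevity, not his): each extra digit of an approximation to pi costs additional computation.

    import math

    def leibniz_pi(terms: int) -> float:
        # pi/4 = 1 - 1/3 + 1/5 - ... (slowly convergent, chosen for simplicity).
        return 4 * sum((-1) ** k / (2 * k + 1) for k in range(terms))

    for terms in (10, 1_000, 100_000):
        approx = leibniz_pi(terms)
        print(terms, "terms ->", approx, "error", abs(math.pi - approx))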

Re: [agi] The Job market.

2019-10-06 Thread James Bowery
Directed Cyclic NOR (or NAND) networks suffice as natural Turing machines. On Sun, Oct 6, 2019 at 2:00 AM Ben Goertzel wrote: > Matt, > > > It probably takes a few hundred bits to describe the laws of physics. > > Hmm, that seems very few, just taking a look at the Standard Model and > General R
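A minimal sketch (mine, not from James's message) of the combinational half of that claim: NAND alone builds NOT, AND, and OR, hence any Boolean circuit; NOR works symmetrically, and the "directed cyclic" part supplies the state needed for Turing-machine-like behavior.

    def nand(a: int, b: int) -> int:
        return 1 - (a & b)

    def not_(a: int) -> int:
        return nand(a, a)

    def and_(a: int, b: int) -> int:
        return not_(nand(a, b))

    def or_(a: int, b: int) -> int:
        return nand(not_(a), not_(b))

    # Truth-table check of the derived gates.
    for a in (0, 1):
        for b in (0, 1):
            assert and_(a, b) == (a & b)
            assert or_(a, b) == (a | b)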

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread James Bowery
It's the hard problem of intelligence because it is incomputable and because once you've solved it, you have optimal prediction. What's left after compression is sequential decision theory which, while hard, is at least computable. On Sun, Oct 6, 2019 at 5:33 AM Stefan Reich via AGI wrote: > Wh

Re: [agi] The Job market.

2019-10-06 Thread korrelan
The perceived complexity of any rendering/reality is relative to the conscious perception/resolution of the observer... hence, your discussion is moot... unless you can provide a standard observer resolution constant? :)

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread immortal . discoveries
I recognize an error; I did every word. Here: The The sat duck The sat sat The sat down The sat.

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread immortal . discoveries
Because Unsupervised Learning is the hard part and the most important part? SL and RL are the cherry on the cake. I agree, however, that there's more to AGI than just a compressed network. For example, the task of switching between summarization, translation, entailment, and segmentation tasks requir

Re: [agi] MindForth Programming Journal 2019-10-05

2019-10-06 Thread Stefan Reich via AGI
Why is compression "the hard problem of intelligence"? We have compression. I'd say the hard problem of intelligence is making an AI that builds a boat and sails in it. We do not have that yet. On Sun, 6 Oct 2019 at 06:37, James Bowery wrote: > Chuck chose my question as the first to answer on Slas

Re: [agi] The Job market.

2019-10-06 Thread Matt Mahoney
On Sun, Oct 6, 2019, 2:59 AM Ben Goertzel wrote: > Matt, > > > It probably takes a few hundred bits to describe the laws of physics. > > Hmm, that seems very few, just taking a look at the Standard Model and > General Relativity right now... > Yudkowsky and Wolfram seem to think so. I don't know

Re: [agi] The Job market.

2019-10-06 Thread John Rose
On Saturday, October 05, 2019, at 8:01 PM, Matt Mahoney wrote: > The complexity of an object is the fewest number of symbols needed to > describe it in some language. It has nothing to do with computation time, > energy, or consciousness. It is only the simplicity of a theory that > determines i
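For reference, the standard formalization of the definition Matt is quoted giving is Kolmogorov complexity relative to a universal machine U; note that running time and energy do not appear in it:

    \[
      K_U(x) \;=\; \min\{\, |p| : U(p) = x \,\}
    \]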

Re: [agi] The Job market.

2019-10-06 Thread Ben Goertzel
Matt, > It probably takes a few hundred bits to describe the laws of physics. Hmm, that seems very few, just taking a look at the Standard Model and General Relativity right now... What sort of machine are you assuming is interpreting these bits? If it's some sort of standard Turing machine wit
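The machine-dependence Ben is asking about is bounded by the invariance theorem (a standard result, stated here for context rather than quoted from the thread): for any two universal machines U and V there is a constant c_{U,V}, independent of x, with

    \[
      K_U(x) \;\le\; K_V(x) + c_{U,V},
    \]

so a figure like "a few hundred bits" is only meaningful up to that additive constant.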