[agi] Re: reading "The Hidden Pattern"

2020-06-29 Thread immortal . discoveries
food=money=job; evolve & exploit domain

[agi] reading "The Hidden Pattern"

2020-06-29 Thread immortal . discoveries
Ben's 2006 book is an OK one. It's very long though, mostly all over the place, and nothing much new (for me); the one I'm making will be very short! I'm actually done now, having checked enough of it... Ben says what I say but in slightly different words: "the goal of achieving personal happiness is so imp

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread immortal . discoveries
Seeing that, Matt, where is the razor definition in my post above??? Work with that post. What I mean is the brain sees observations, ex.: "the cat ate food", "the cat was on a porch all night with candles", "the woman". The brain will only store the word "the" once but with a strength of 3 if seen 3 ti
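A minimal sketch of the "store each word once with a strength equal to its count" idea (my illustration, not code from the post; the example sentences are the ones quoted above):

    from collections import Counter

    observations = [
        "the cat ate food",
        "the cat was on a porch all night with candles",
    ]

    # Each unique word is stored once; its "strength" is how many times it was seen.
    strength = Counter(word for obs in observations for word in obs.split())

    print(strength["the"])  # 2 for these two observations; it would be 3 if "the" appeared a third time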

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread immortal . discoveries
Representations, and short representations, all make a brain smaller in size! Hence better prediction accuracy.

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread immortal . discoveries
Uh, Matt, so you're saying the "simple but not too simple" isn't exactly just a way to improve AGI prediction, but something much deeper? Like below?: 1) short program/brain = 2) better prediction; short prediction = 3) less error; short error = 4) big paycheck and big happy. Amazing. Note th
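A hedged way to write down the chain above (my paraphrase using the standard two-part code from minimum description length, not a formula from the thread; M is a candidate model/program and x the observed data):

    M^\ast = \arg\min_M \bigl( L(M) + L(x \mid M) \bigr)

where L(M) is the length of the model and L(x \mid M) is the length of the data encoded under the model's predictions, which shrinks as prediction error shrinks; minimizing the sum is one formal reading of "short program, better prediction, less error".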

Re: [agi] Re: Minsky's Physics Envy

2020-06-29 Thread John Rose
One cannot deny that the concept of soul exists. That is the only soul that I have ever referred to in any related discussion. One may take the position that concepts don't exist, which would be a rather interesting debate. As for some real physical soul, that is what Minsky thought I was refer

Re: [agi] Re: Minsky's Physics Envy

2020-06-29 Thread WriterOfMinds
On Monday, June 29, 2020, at 9:13 AM, Matt Mahoney wrote: > Surely anyone who believes that AGI is possible wouldn't also believe in souls or heaven or ghosts??? Your brain is a computer, right? Belief in souls and whatnot is fully compatible with the belief that AGI is possible, if one avoids ma

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread John Rose
It's waaay more than just a witty saying! Have you read Occam or just that one catchphrase? Science isn't based on catchphrases.

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread Matt Mahoney
On Mon, Jun 29, 2020, 4:24 PM wrote: > But isn't Occam's Razor just basically saying "simple but not too simple"? > And isn't it razoring off search space to look through when looking for a > solution? > No. A razor is a witty saying. Occam said essentially that simple explanations are better.

Re: [agi] Re: Minsky's Physics Envy

2020-06-29 Thread John Rose
I believe human behavior is estimable. You believe human behavior is computable, IOW has a K-complexity. What's hiding in the difference there? Consciousness. In humans, then, belief is consciousness. Makes some sense I guess, but I think consciousness is nondeterministic; you think it's determinist

Re: [agi] Re: Minsky's Physics Envy

2020-06-29 Thread Matt Mahoney
On Mon, Jun 29, 2020, 5:15 PM John Rose wrote: > On Monday, June 29, 2020, at 11:13 AM, Matt Mahoney wrote: > > Surely anyone who believes that AGI is possible wouldn't also believe in > souls or heaven or ghosts??? Your brain is a computer, right? > > > Matt, do you believe the K-complexity of t

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread John Rose
Wow, this is really amazing. When you read Occam's writings from the 1300s and are familiar with the Catholic belief system, you can see the direct correlations between deity, spirituality (you know, ghosts and souls 'n stuff), and all of modern mathematics, AIT, logic, science, etc. WOW

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread John Rose
Tear down that statue! (Occam that is) Just figured I'd try to join the zeitgeist... though I do like Occam (Ockham ?) being from a Catholic upbringing :)

Re: [agi] Re: Minsky's Physics Envy

2020-06-29 Thread John Rose
On Monday, June 29, 2020, at 11:13 AM, Matt Mahoney wrote: > Surely anyone who believes that AGI is possible wouldn't also believe in souls or heaven or ghosts??? Your brain is a computer, right? Matt, do you believe the K-complexity of the Earth exists?  I don't think it does but perhaps you've

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread immortal . discoveries
But isn't Occam's Razor just basically saying "simple but not too simple"? And isn't it razoring off search space to look through when looking for a solution?

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread Matt Mahoney
On Mon, Jun 29, 2020, 4:01 PM wrote: > Is my use of the word Razor correct? > No. See https://en.m.wikipedia.org/wiki/Solomonoff%27s_theory_of_inductive_inference

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread immortal . discoveries
Is my use of the word Razor correct? Occam's Razor suggests "simple but not too simple", and I'm saying there are many Razors; things like updating axon strength (statistics frequency), blending predictions, learning word embeds (cat=dog), recency, pleasure/pain words, etc. all improve prediction by
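A minimal sketch of the "blending predictions" razor mentioned above (my illustration only; the two toy predictors, their weights, and the example text are assumptions, not anything from the thread):

    from collections import Counter, deque

    history = "the cat ate food the cat was on a porch".split()

    freq = Counter(history)                 # axon-strength style frequency statistics
    recent = deque(history[-4:], maxlen=4)  # small recency window

    def blended_score(word, w_freq=0.7, w_recent=0.3):
        # Mix a global frequency estimate with a recency bonus (weights are arbitrary).
        p_freq = freq[word] / sum(freq.values())
        p_recent = recent.count(word) / len(recent)
        return w_freq * p_freq + w_recent * p_recent

    # Rank candidate next words by the blended score.
    print(sorted(set(history), key=blended_score, reverse=True)[:3])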

Re: [agi] Re: Minsky's Physics Envy

2020-06-29 Thread James Bowery
As Keith Henson once told me, "If Randell Mills's SunCell turns out to work, it will be the system programmers messing with us." On Mon, Jun 29, 2020 at 10:14 AM Matt Mahoney wrote: > Surely anyone who believes that AGI is possible wouldn't also believe > in souls or heaven or ghosts??? Your bra

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread Matt Mahoney
Occam's Razor is true because for all possible probability distributions over the infinite set of possible theories described by strings, each theory can only be more likely than a finite set of longer theories. This is true in any language used to describe the theories. By "theory" I mean a descr
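A hedged sketch of the counting argument this appears to rest on (my paraphrase; the symbols P, T_i, and \epsilon are mine, not from the post): if P is any probability distribution over the infinite list of theories T_1, T_2, \ldots, then

    \sum_i P(T_i) = 1 \;\Longrightarrow\; \bigl|\{\, i : P(T_i) \ge \epsilon \,\}\bigr| \le 1/\epsilon \quad \text{for every } \epsilon > 0,

so any theory with nonzero probability is more likely than all but finitely many of the others, and since only finitely many theories are shorter than it, nearly all of the theories it beats are longer ones; nothing in the count depends on which description language is used.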

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread immortal . discoveries
But Matt, if we use a language that is easiest to compute in our observed universe, and penalize larger systems, then we are really just leveraging physics plus a penalty. We already know this from the original Occam's Razor: leverage physics and make the algorithm as small as possible (but no

Re: [agi] Re: Boogaloo news.

2020-06-29 Thread immortal . discoveries
Good eye! 1% of Covid-19 victims are serious/critical. But only maybe 0.5% actually die IF they get it. It's simple: if you're old, stay indoors.

Re: [agi] Re: Minsky's Physics Envy

2020-06-29 Thread immortal . discoveries
I second that. A computer can run anything possible, not only in our universe but in other physics as well! It can run 3D "situations", 2D, anything. It can even run quantum mechanics. A computer is universal; it can run any "machine" that we have, could have, use, or care about! All things, no matt

Re: [agi] Re: Minsky's Physics Envy

2020-06-29 Thread Matt Mahoney
Surely anyone who believes that AGI is possible wouldn't also believe in souls or heaven or ghosts??? Your brain is a computer, right? On Sun, Jun 28, 2020 at 9:46 AM John Rose wrote: > > The only interaction I ever had with Minsky was regarding the existence of > one's soul. My position went al

Re: [agi] Re: Goertzel's "Grounding Occam's Razor in a Formal Theory of Simplicity"

2020-06-29 Thread Matt Mahoney
The problem with Occam's Razor or algorithmic information theory is that simplicity is language dependent. Any object can be described using one bit if the language is complex enough. We should prefer simple languages to avoid this problem, but defining simple languages just leads to a circular def
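For context (my addition, not part of the post): the invariance theorem in algorithmic information theory bounds how much the choice of language can matter. For any two universal description languages L_1 and L_2 there is a constant c_{L_1,L_2}, roughly the length of an interpreter for one written in the other, such that for every object x

    \bigl| K_{L_1}(x) - K_{L_2}(x) \bigr| \le c_{L_1,L_2},

so a contrived language can indeed make one favored object look one bit long, but only by hiding its complexity in that constant, which is the circularity the post points at.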

Re: [agi] Re: Boogaloo news.

2020-06-29 Thread Ian
Steve: no, the 1% is actually related to hospitalization and risk of death, afaik. (Hi, I'm a long-time lurker on this list.) My friend who is immuno-compromised suffered from covid-19 in February and lost about 30% of his already-reduced lung capacity (emphysema, lifelong smoker in his 50s) but survived and was

[agi] Re: Boogaloo news.

2020-06-29 Thread stefan.reich.maker.of.eye via AGI
So that means, with the real (non-tested) number of cases being probably at least 10 or maybe 100 times higher, the virus is actually dangerous only to .01% to .1% of all humans?
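A quick arithmetic check of the scaling being asked about (my illustration; the 1% serious/critical figure and the 10x-100x undercount factors come from the two posts above, the rest is assumed):

    confirmed_serious_rate = 0.01            # 1% of confirmed cases serious/critical
    for undercount in (10, 100):             # true infections assumed 10x-100x the confirmed count
        true_rate = confirmed_serious_rate / undercount
        print(f"{undercount}x undercount -> {true_rate:.2%} of those infected")
    # prints 0.10% and 0.01%; note these are rates among the infected,
    # not among all humans, unless everyone is eventually infected.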