food=money=job / evolve & exploit domain
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T90840a08c888a2d2-Mb5c195726d1bd1a1bbbff561
Delivery options: https://agi.topicbox.com/groups/agi/subscription
Ben's 2006 book is an OK one. It's very long though, mostly all over the place,
and nothing much new (for me); the one I'm making will be very short! I'm
actually done now, checked enough of it...
Ben says what I say but in slightly different words:
"the goal of achieving personal happiness is so imp
Seeing that, Matt, where is the razor definition in my post above??? Work with
that post.
What I mean is the brain sees observations, e.g.:
the cat ate food
the cat was on a porch all night with candles
the woman
The brain will only store the word "the" once, but with a strength of 3 if seen
3 times.
Representations, and short representations especially, make a brain smaller in
size! Hence better prediction accuracy.
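The "store the word once, with a strength" idea above is essentially frequency counting. A minimal sketch, assuming a plain word-count table (the `observations` sentences are taken from the post; everything else is illustrative):

```python
from collections import Counter

# Sketch of "store each word once, with a strength": one table entry
# per word, whose count (strength) grows each time the word is seen.
observations = [
    "the cat ate food",
    "the cat was on a porch all night with candles",
    "the woman",
]

strengths = Counter()
for sentence in observations:
    strengths.update(sentence.split())

print(strengths["the"])  # stored once, with a strength of 3
print(strengths["cat"])  # seen twice, strength 2
```

A real brain/compressor would count contexts rather than bare words, but the storage saving is the same: one entry plus a count, instead of three copies.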
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T37756381803ac879-M1f9c2ca1afba4a3dfb27c4c2
Uh, Matt. So you're saying the "simple but not too simple" isn't exactly just a
way to improve AGI prediction, but something much deeper? Like below?:
1) Short program/brain = 2) better prediction;
short prediction = 3) less error;
short error = 4) big paycheck and big happy.
Amazing. Note th
One cannot deny that the concept of soul exists. That is the only soul that I
have ever referred to in any related discussion. One may take the position that
concepts don't exist, which would be a rather interesting debate.
As far as some real physical soul that is what Minsky thought I was refer
On Monday, June 29, 2020, at 9:13 AM, Matt Mahoney wrote:
> Surely anyone who believes that AGI is possible wouldn't also believe
in souls or heaven or ghosts??? Your brain is a computer, right?
Belief in souls and whatnot is fully compatible with the belief that AGI is
possible, if one avoids ma
It's waaay more than just a witty saying!
Have you read Occam or just that one catchphrase? Science isn't based on
catchphrases.
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T37756381803ac879-M799c7a396111771
On Mon, Jun 29, 2020, 4:24 PM wrote:
> But isn't Occam's Razor just basically saying "simple but not too simple"?
> And isn't it razoring off search space to look through when looking for a
> solution?
>
No. A razor is a witty saying. Occam said essentially that simple
explanations are better.
I believe human behavior is estimable. You believe human behavior is
computable, in other words that it has a K-complexity. What's hiding in the
difference there? Consciousness. In humans, then, belief is consciousness.
Makes some sense I guess, but I think consciousness is nondeterministic; you
think it's deterministic
On Mon, Jun 29, 2020, 5:15 PM John Rose wrote:
> On Monday, June 29, 2020, at 11:13 AM, Matt Mahoney wrote:
>
> Surely anyone who believes that AGI is possible wouldn't also believe in
> souls or heaven or ghosts??? Your brain is a computer, right?
>
>
> Matt, do you believe the K-complexity of t
Wow, this is really amazing. When you read Occam's writings from the 1300s and
are familiar with the Catholic belief system, you can see the direct
correlations between deity, spirituality (you know, ghosts and souls 'n stuff),
and all of modern mathematics, AIT, logic, science, etc. WOW
Tear down that statue! (Occam that is)
Just figured I'd try to join the zeitgeist... though I do like Occam (Ockham?),
being from a Catholic upbringing :)
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T37756381
On Monday, June 29, 2020, at 11:13 AM, Matt Mahoney wrote:
> Surely anyone who believes that AGI is possible wouldn't also believe
in souls or heaven or ghosts??? Your brain is a computer, right?
Matt, do you believe the K-complexity of the Earth exists? I don't think it
does but perhaps you've
But isn't Occam's Razor just basically saying "simple but not too simple"? And
isn't it razoring off search space to look through when looking for a solution?
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T3775
On Mon, Jun 29, 2020, 4:01 PM wrote:
> Is my use of the word Razor correct?
>
No. See
https://en.m.wikipedia.org/wiki/Solomonoff%27s_theory_of_inductive_inference
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/ag
Is my use of the word Razor correct? Occam's Razor suggests "simple but not too
simple", and I'm saying there are many Razors, like updating axon strength
(statistics frequency), blending predictions, learning word embeds (cat=dog),
recency, pleasure/pain words, etc., all of which improve prediction by
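One of the razors listed above, blending predictions, can be sketched as a weighted mixture of two next-word distributions. This is a minimal illustration with fixed, made-up weights; a real mixer (as in context-mixing compressors) would learn the weights:

```python
# Sketch of "blending predictions": mix two next-word probability
# distributions with fixed weights, then renormalize. The weights and
# model names here are illustrative, not from the original post.
def blend(p1, p2, w1=0.7, w2=0.3):
    words = set(p1) | set(p2)
    mixed = {w: w1 * p1.get(w, 0.0) + w2 * p2.get(w, 0.0) for w in words}
    total = sum(mixed.values())  # renormalize in case inputs are partial
    return {w: v / total for w, v in mixed.items()}

frequency_model = {"cat": 0.6, "dog": 0.4}   # long-run statistics
recency_model = {"dog": 0.9, "cat": 0.1}     # what was seen recently
print(blend(frequency_model, recency_model))  # {'cat': 0.45, 'dog': 0.55}
```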
As Keith Henson once told me, "If Randell Mills's SunCell turns out to
work, it will be the system programmers messing with us."
On Mon, Jun 29, 2020 at 10:14 AM Matt Mahoney
wrote:
> Surely anyone who believes that AGI is possible wouldn't also believe
> in souls or heaven or ghosts??? Your bra
Occam's Razor is true because for all possible probability distributions
over the infinite set of possible theories described by strings, each
theory can only be more likely than a finite set of longer theories. This
is true in any language used to describe the theories.
By "theory" I mean a descr
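The counting fact behind this argument can be checked numerically: since all probabilities sum to at most 1, at most 1/ε theories can have probability ≥ ε, for any ε > 0. A sketch using an assumed toy prior P(x) = 2^(-2|x|-1) over binary strings (this particular prior is my choice for illustration, not from the post):

```python
from itertools import product

# Toy prior over binary strings: P(x) = 2**(-2*len(x) - 1).
# The 2**n strings of length n each get 2**(-2n-1), so the total is
# sum over n of 2**(-n-1) = 1.
def prob(x):
    return 2.0 ** (-2 * len(x) - 1)

strings = [""] + ["".join(bits) for n in range(1, 12)
                  for bits in product("01", repeat=n)]

eps = 1 / 64
likely = [x for x in strings if prob(x) >= eps]
# Because the probabilities sum to 1, no more than 1/eps = 64 theories
# can have probability >= eps; all the rest are necessarily less likely.
print(len(likely), "theories with P >=", eps)  # 7 theories with P >= 0.015625
```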
But Matt, if we use a language that is easiest to compute in our observed
universe, and penalize larger systems, then we are really just leveraging
physics and a penalization. We already know this in the original Occam's Razor:
Leverage physics and make the algorithm as small as possible (but no
Good eye!
1% of Covid-19 victims are serious/critical.
But only maybe 0.5% actually die IF they get it.
It's simple: if you're old, stay indoors.
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T84b882684c79db5c-M77d68
I second that. A computer can run anything possible, not only in our universe
but under other physics as well! It can run 3D "situations", 2D, anything. It
can even run quantum mechanics. A computer is universal; it can run any
"machine" that we have, could have, use, or care about! All things, no matt
Surely anyone who believes that AGI is possible wouldn't also believe
in souls or heaven or ghosts??? Your brain is a computer, right?
On Sun, Jun 28, 2020 at 9:46 AM John Rose wrote:
>
> The only interaction I ever had with Minksy was regarding the existence of
> one's soul. My position went al
The problem with Occam's Razor or algorithmic information theory is
that simplicity is language dependent. Any object can be described
using one bit if the language is complex enough. We should prefer
simple languages to avoid this problem, but defining simple languages
just leads to a circular def
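The "one bit" point can be made concrete: a language (here, a codec) that hard-codes one favorite object can describe that object in a single bit, however large it is. A minimal sketch; the object and encoding are hypothetical:

```python
# Illustration of language-dependence of "simplicity": a description
# language rigged to favor one particular object. The choice of object
# is arbitrary; any huge string works.
FAVORITE = "the complete works of Shakespeare " * 1000

def encode(obj: str) -> str:
    if obj == FAVORITE:
        return "0"  # the entire huge object, described in one bit
    # Everything else pays a 1-bit escape plus 8 bits per character.
    return "1" + "".join(f"{ord(c):08b}" for c in obj)

print(len(encode(FAVORITE)))  # 1
print(len(encode("cat")))     # 25
```

The decoding is unambiguous (the first bit says which branch was used), so this is a legitimate prefix-free language in which the favorite object is maximally "simple". That is exactly why description length only becomes meaningful once the language itself is fixed and kept simple.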
steve:
no, the 1% is actually related to hospitalization and risk of death, afaik
(hi, im a long-time lurker on this list)
my friend who is immuno-compromised suffered from covid-19 in February and
lost about 30% of his already-reduced lung capacity (emphysema, lifelong smoker
in his 50s) but survived and was
So that means, with the real (non-tested) number of cases probably being at
least 10 or maybe 100 times higher, the virus is actually dangerous only to
0.01% to 0.1% of all humans?
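The arithmetic behind that estimate, assuming the 1% serious/critical figure quoted earlier applies to *confirmed* cases and the true infection count is 10-100x higher (both numbers are the thread's assumptions, not established data):

```python
# If ~1% of confirmed cases are serious/critical, and true infections
# are 10-100x the confirmed count, the rate among ALL infected drops
# by the same undercount factor.
confirmed_rate = 0.01  # 1% of tested/confirmed cases

for undercount in (10, 100):
    true_rate = confirmed_rate / undercount
    print(f"{undercount}x undercount -> {true_rate:.2%} of all infected")
# 10x undercount -> 0.10% of all infected
# 100x undercount -> 0.01% of all infected
```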
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.