Yeah, I forgot to mention that the latest version of nncp improves
compression by 2%, is 3x faster (2 days per GB), and uses 6 GB of RAM
instead of 23 GB. The gains are mostly from optimization. nncp uses a
transformer neural network running on a GPU.
Again reconfirming the dominance of compute-heavy neural networks over
rule-based …
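For context, the trick behind nncp and similar neural compressors is that any model assigning probabilities to the next symbol can be turned into a compressor by an arithmetic coder, which spends about -log2 p bits per symbol. A minimal sketch of that size bound in Python (the predictor here is a made-up uniform stand-in, not nncp's transformer):

    import math

    # Hypothetical stand-in predictor: maps (history, next symbol) to a
    # probability. In nncp this role is played by a transformer; here it
    # just guesses uniformly over 256 byte values, i.e. no compression.
    def predict(history, symbol):
        return 1 / 256

    def ideal_size_bits(data, predict):
        # An arithmetic coder needs about -log2 p bits per symbol, so the
        # total output size is the model's cross-entropy on the data.
        return sum(-math.log2(predict(data[:i], data[i]))
                   for i in range(len(data)))

    print(ideal_size_bits(b"hello world", predict))  # 88.0 bits = 11 bytes

Better predictions mean fewer bits; the rest (2 days per GB, 6 GB of RAM) is the cost of evaluating the predictor once per symbol.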
On Monday, April 26, 2021, at 7:18 PM, immortal.discoveries wrote:
> Every time I run my own code it feels like playing the casino lottery lol.
> Jackpot lol.
By jackpot I don't mean the money; obviously it is small earnings from the
Hutter Prize website, and still it is not the cash for me that …
Either JR is just joking in every post or he really believes what he says;
neither is a good case... My suggestion: build an AI like GPT-2 / PPM and see
for yourself how one actually works, to get out of this qualia bubble you're
seriously stuck in (clearly you are, yes, I'm telling you, I've been …
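In that build-one-yourself spirit, here is a toy order-2 context model in Python. It is a sketch of the counting idea behind PPM, with Laplace smoothing standing in for PPM's escape mechanism; none of the names come from any real implementation:

    import math
    from collections import defaultdict

    class Order2Model:
        """Toy PPM-style model: count which byte follows each
        2-byte context, and predict from those counts."""
        def __init__(self):
            self.counts = defaultdict(lambda: defaultdict(int))

        def prob(self, ctx, ch):
            c = self.counts[ctx]
            # +1 smoothing is a crude substitute for PPM's escape symbol.
            return (c[ch] + 1) / (sum(c.values()) + 256)

        def update(self, ctx, ch):
            self.counts[ctx][ch] += 1

    def code_length_bits(data, model):
        bits = 0.0
        for i in range(2, len(data)):
            ctx, ch = data[i-2:i], data[i]
            bits += -math.log2(model.prob(ctx, ch))
            model.update(ctx, ch)
        return bits

    data = b"abracadabra " * 50
    print(code_length_bits(data, Order2Model()) / 8, "bytes vs", len(data))

Swap the counting model for a transformer and you have, conceptually, the same pipeline nncp runs, just with a much better predictor.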
On Wednesday, April 28, 2021, at 11:55 AM, immortal.discoveries wrote:
> What matters here is that brains can solve many problems by predicting
> solutions based on the context/problem given
Single brains specialize. Multibrains generalize. That's why they communicate.
Multiparty intelligence on a computer …
On Wednesday, April 28, 2021, at 12:01 PM, Jim Bromer wrote:
> Malleable compression is an interesting way to put it.
Well, we could reframe the concept of compression and redefine it in terms of
consciousness and intelligence.
Assume panpsychism: all compressors have non-zero consciousness. And …
John says you can sometimes compress horribly or not at all, and sometimes ace
it, but you can't ace everything all the time. Here's my answer: given 100
letters of context that are random, yes, a smart brain will fail, because the
data is random; and given 100 letters that are all 'a', e.g. 'aaaa', it will
ace …
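The two extremes are easy to put numbers on: a uniformly random letter costs log2(26) ≈ 4.7 bits under any model, while a letter the model has learned to expect costs almost nothing. A quick check (the 0.99 probability is just an assumed figure for a model that has caught on to the repetition):

    import math

    # 100 uniformly random letters: no predictor can beat p = 1/26 each.
    random_bits = 100 * math.log2(26)        # ≈ 470 bits, incompressible
    # 100 repeated 'a's: a model assigning p = 0.99 to each next 'a'
    # pays under 2 bits in total for the whole run.
    repeated_bits = 100 * -math.log2(0.99)   # ≈ 1.45 bits

    print(round(random_bits), round(repeated_bits, 2))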
On Wednesday, April 28, 2021, at 9:24 AM, Jim Bromer wrote:
> I do not think that "compression" per se is the basis of making AI (which is
> directly related to the topic). However, I do believe that an AGI (or an
> advanced AI) program would be like a compressor.
I'm with you there, Jim, unlike some …
In response to something I said about cross-generalization, John Rose replied "
You can optimally compress some of the data all of the time, you can optimally
compress all of the data some of the time, but you can’t optimally compress all
of the data all of the time. It is what it is bruh."
Gen…
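Bruh or not, the aphorism is just the counting argument: there are more strings of length n than there are strictly shorter strings, so any lossless compressor that shrinks some inputs must expand others. The bound in a few lines of Python:

    # 2^n inputs of length n, but only 2^0 + ... + 2^(n-1) = 2^n - 1
    # shorter strings to map them to: at least one input cannot shrink.
    n = 8
    print(2 ** n, ">", sum(2 ** k for k in range(n)))   # 256 > 255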
On Tuesday, April 27, 2021, at 5:51 PM, Jim Bromer wrote:
> There is something about cross-generalizations and cross-categorizations or
> overlapping insights about concepts (or components of concepts) that make me
> think of a compression method that would not be an optimal compressor for a
> n…