enwik9.txt... a tad better; looking into the "leaped" one now to see why it's so high.
ICE?
thaws=0.875
leaped=0.8157894736842104
rocks=0.776392352452204
flakes=0.7096774193548383
flee=0.6804564907275323
hook=0.671641791044776
icebergs=0.6470588235294116
snow=0.6382428940568478
snows=0.6346153846153848
jum
Here I'll use some of the same words to compare to ICE, and some to RAN as well
(the bottom ones are the top results from ICE). I guess you can't expect
perfection yet; it's only just begun.
ICE?
thaws=1.0
leaped=1.0
flee=0.7912087912087912
snows=0.7142857142857143
flakes=0.7001
rocks=0.6909090909090907
Well, not really, but the main meat of it is novel, I see, and it's dead simple.
--
And I invented that algorithm; nobody even mentions the darn thing on any
plain site I can see *!*
--
And it's not just because they're common words; proof:
but=0.5512
the=0.31739130434782586
--
Same setup, more runs; I'm only using a one-word right-hand context window,
100MB of text (enwik8.txt), and a 50-word limit for evidence. I'm happy it
works nicely already!!
What is similar to RAN?
leaped=1.0002
jumped=0.8367346938775513
flee=0.7582417582417582
sailed=0.7560975609756099
he
I just realized something that [may] be true, as shockingly amazing as how
useful the Lossless Compression contest is: I [might] be able to learn my
relations fast if I only do it once, instead of 100 times throughout the
enwik8/9 dataset. The reason for doing it 100 times or just simply
Humans fear covid seemingly because it is a new problem; blood vessel ageing is
even more of a problem than covid. I wish people would realize the need to be
proactive against blood vessel problems. It's the core of heart attacks,
strokes, clots, brain etc. health, bacterial infection spreading, med
I'm going with a vocab size of 50,000 like GPT-2 had.
How many relational connections should it learn? 50,000*50,000 is a lot of rare
words trying to relate to other rare words. 5K*5K?
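A quick back-of-envelope check of the scale difference (just the arithmetic, using the sizes above):
full = 50_000 * 50_000   # 2,500,000,000 possible word pairs
small = 5_000 * 5_000    # 25,000,000 possible word pairs
print(full // small)     # 100: the 5K*5K table is 100x smaller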
--
"So, for example, if someone lies in wait to rob someone, but no one passes by,
they are STILL guilty of robbery - pretty much because in some OTHER reality
they actually did rob someone."
Actually a brain may think about murdering, but unless it has a good
probability to activate motors to do
and hole and delay matching
--
So much I can do to improve it, too. I'm only using the right-hand-side
first word as evidence, and can improve the too-common/too-rare frequency
evidence, use bi-directional context evidence, use related words, and an
exponential function. And probably more.
---
To use the code, simply adjust these 2 lines:
elif window[-8:] == ' follow ':
If the item has 10 letters, then simply change the 8 to 10.
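A minimal sketch of that adjustment (hypothetical: deriving the width from the word itself, so the 8 never needs hand-editing):
target = ' follow '                   # the search word, including its spaces
window = 'they follow '               # example window from the text scan
if window[-len(target):] == target:   # len(target) replaces the hard-coded 8
    print('match')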
--
WORLD?
mom=1.0
universe=0.6903846153846166
place=0.6797093348239249
globe=0.6363636363636364
mother=0.6182873730043544
building=0.5912547528517097
job=0.5871313672922257
person=0.5865272938443675
clothing=0.5483870967741937
dad=0.5454545454545454
program=0.5440956651718993
Earth=0.5413711583924357
coolest video yet, looks alive haha
--
Oh, code, ya! lol, here:
input2 = open('enwik8.txt', 'r', encoding='latin-1').read()
word1words = []
word2words = []
word1counts = []
word2counts = []
for count2 in range(len(input2) - 12):   # range(1) only checked one position
    window = input2[count2: count2 + 12]
    if window[-8:] == ' mother ':
        # My reconstruction of the truncated inner loop: grab the first
        # word (up to 50 chars) that follows ' mother '.
        word = ''
        for ch in input2[count2 + 12: count2 + 62]:
            if ch == ' ':
                break
            word += ch
        if word in word1words:
            word1counts[word1words.index(word)] += 1
        else:
            word1words.append(word)
            word1counts.append(1)
dad=0.6363636363636364
guns=0.6175298804780877
jump=0.581699346405229
man=0.5253991291727139
father=0.5137880986937594
person=0.5079825834542818
follow=0.5023183925811436
cat=0.5
home=0.4978229317851957
exit=0.44943820224719094
book=0.44847605224963805
dog=0.4256410256410258
when=0.3207547169811323
What is similar to RAN? Here are a few runs of my program posted below (I had
to fix an error).
flee=0.7582417582417582
run=0.6673913043478249
move=0.6478260869565209
program=0.580434782608696
woman=0.578260869565217
mother=0.5521739130434783
book=0.545652173913044
took=0.5108695652173911
exit=0.4831
Yes, the thinking over acting: using a full humanoid robot instead of
thinking/googling 'acting'.
--
what is dominant?
--
Must sleep. It works.
input2 = open('enwik8.txt', 'r', encoding='latin-1').read()
word1words = []
word2words = []
word1counts = []
word2counts = []
for count2 in range(len(input2) - 5):   # range(1) only checked one position
    window = input2[count2: count2 + 5]
    if window == ' win ':
        # Reconstruction of the truncated loop: collect the first word
        # (up to 50 chars) that follows ' win '.
        word = ''
        for ch in input2[count2 + 5: count2 + 55]:
            if ch == ' ':
                break
            word += ch
        if word in word1words:
            word1counts[word1words.index(word)] += 1
        else:
            word1words.append(word)
            word1counts.append(1)
But @Matt, he asked: if spaces are removed, how will it recognize the text?
What if 't's are removed??? Ex. 'the cat ate food at night' ... 'he ca ae food a nigh'
Can you read wha I am rying o say if I ake away all the ou of his senence I jus
wroe
???!?!??!?!!!
-
Also, yes, I see the Foundation Models paper mentions around page 42 that
GPT etc. allow a way to build on GPT to do reasoning; we need GPT. The rest
repeats the same thing, too much text lol; I gave it a try, guys, just now.
To do the reasoning, like: if not A then C, because A leads to B so it can't
Yes, good read; I think many of us already know all this though. BTW, you made
it into a blog post, but really you only said a few things in there that were
your point. *Only saying this because* the paper I read, called Foundation
Models, was way too big; yes, big is good, but first of all it uses comp
oh ya separate this ->
heybroicanseeyouaremrjojotheponyontheranchhahaseethatyesyouareaspaceape
My tree will store representations, ex. word>word>word (using byte pair
encoding segments). I'll use bytes to store a node, ex. g>r>8>X>k>... is "wh
en will the bo ok be read".. so when my AI will se
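Here is a minimal sketch of the tree idea (the segment keys below are just illustrative, not the real byte codes):
class Node:
    def __init__(self):
        self.children = {}   # segment -> child Node
        self.count = 0       # times this path has been seen

def insert(root, segments):
    node = root
    for seg in segments:
        node = node.children.setdefault(seg, Node())
        node.count += 1

root = Node()
insert(root, ['wh', 'en', ' will', ' the', ' bo', 'ok'])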
The predictions above show what WILL likely happen, Magnus; that's why we
like the image. It predicts the future.
--
Say the AI has seen:
walked fast
walked funny
walked slow
Say you turn off Learning and are presented with "walked" now, 3 times, and
must code/score its predictions. The text it runs over is "walked fast, walked
slow, walked funny", to score it. If it predicted letters, it would be f/s
50/50 %
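A toy sketch of that scoring (my reading: counting occurrences of first letters gives f twice, while counting unique first letters gives the 50/50):
from collections import Counter
seen = ['fast', 'funny', 'slow']      # continuations seen after 'walked '
first = Counter(w[0] for w in seen)   # Counter({'f': 2, 's': 1})
for ch, n in first.items():
    print(ch, n / len(seen))          # f 0.67, s 0.33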
See above. Also, I'm not sure what you mean (not enough information): isn't
the line the whole prediction of cost and computation? What *are* you saying,
isn't the line staying true? What should happen in the image above?
--
But my question was only asking if the plotted image above really is correct.
Because if so, at least it has a nice rising line...
--
Oops, I had some new abilities ON, which means the no-preprocessor mode had a
worse score (because it could not use the new abilities without changing
weights, time-consuming lol), making it look like a shuffled preprocessor alone
got it done a lot. New tests:
*Let me compare to above's, maybe no c
60+
--
I shuffled the 80, the 3840, and the rest of the words in english.dic (keeping
them in 3 groups of 1/2/3 bytes) to simply get rid of the related-words
grouping, and got some interesting scores (decompression successful, exact
match). So the grouping seems to help some amount, but without it the preprocessor
I assume you don't mean magic and do understand what GPT "is". As for not
storing/updating weights: well, you need to store memories in a tree at least,
if not a more merging tree like GPT networks (word2vec etc.), because it allows
much faster access. Second, learning/updating weights is the
The thing is that Elon has the motivation and cash to make it happen; that
alone scares people. It is not just talent that scares people; motivation is
what allows learning faster.
Also, he said he already has most of the parts to make it happen, from his car
autopilot system with AI and robotic par
Seeing 'won win succeed' only makes the next word a tad more likely to be
predicted as one of those 3 words just seen; it helps, it does not solidly say
it WILL be 'win', or all 3...
--
(I'll use a tree, I'm only scanning the enwik8.txt because it was a test...)
--
What do yous think?
My current code to do it is below, but so far it only returns the
right-hand-side first words that follow, for 2 words to compare (win and won);
the output is below the working Python code. You can see in the output that
the two words share many of the words. Now I have to
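A hypothetical sketch of the comparison step still to be written (scoring the overlap of the two follower lists, weighted by counts; the example lists and numbers are made up):
def similarity(words1, counts1, words2, counts2):
    # Fraction of follower occurrences the two words share.
    shared = 0
    for w, c in zip(words1, counts1):
        if w in words2:
            shared += min(c, counts2[words2.index(w)])
    total = min(sum(counts1), sum(counts2))
    return shared / total if total else 0.0

print(similarity(['the', 'a', 'big'], [5, 3, 1],
                 ['the', 'a', 'race'], [4, 2, 2]))   # 0.75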
Yes, but I won't be clustering vectors nor using backprop. I will, however,
later store dog-related words under a node, maybe, which uses a cluster to
quickly learn to relate new words.
Is the dictionary I use from cmix (which turns enwik8.txt from 100MB to 61MB)
grouping related words? F
But up to today, the line in the plot shown below is steadily rising. I
thought you said it unfortunately is not???
https://www.flickr.com/photos/jurvetson/51391518506/
--
Elon Musk is scared lol
also 1:52 LOLOLOL haha 😆
https://www.youtube.com/watch?v=AQNBO6WbQo8
it's closer than you think, yous really are not seeing it, so sad
--
OK, so word2vec actually is just comparing 2 vectors to see if 2 words relate;
it is not learning that words near each other are related. So then my way is
novel, and I will be laughing soon; no one has thought of how to do this my
way haha. And the idea I had for predicting things seen nearby a word e
Though he seemed to mean that in 2019 a $1,000 computer is a human brain, then
in 2029 $1 = a human brain computer.. wth. Anyway, 2029 then = AGI, it seems.
Maybe my $1,000 computer just isn't enough to run an AGI; maybe we can't
figure out how to on my computer, but the 2029 computer will be.
I can't do the numbers, but by looking at all 3 images, my blunt naive visual
recognition tells me they all say one thing (except kinda that Hod Lipson one
in black with a way too uppy curve): in 2029 we get the power per same dollar
to run a human brain.. That's good to hear then. Great.
--
https://www.flickr.com/photos/jurvetson/51391518506/
From looking at this image, correct me if wrong: there "will" be a steady
linear increase in computational power for the following years? It even looks
a bit exponential. If it were to go flat-lined, it would still give us some
more heigh
Here, this thread goes with it:
https://www.reddit.com/r/singularity/comments/paszbd/122_years_of_moores_law_tesla_ai_update/
--
Ray's 2009 book be all like:
The accelerating returns of computation has been rising exponentially...
Ray's 2022 book be all like:
GPT3 GPT GPT2 ULMFIT BERT XL_GPT OPENAI XXL JUKEBOX DALL-E GPT CLIP
GPT DEEPMIND GOOGLEBRAIN GPT REFORMER ELCTRA BERT_XL ELMO GPT4 DALL-E2
BLEN
oh
This title will be auto-delivered to your Kindle on *September 6, 2022*.
--
https://www.amazon.ca/Singularity-Nearer-Ray-Kurzweil-ebook/dp/B08Y6FYJVY
--
And what about this, Matt?? >>
https://www.flickr.com/photos/jurvetson/51391518506/
--
How do you know?
https://www.businesswire.com/news/home/20210824005644/en/Cerebras-Systems-Announces-World's-First-Brain-Scale-Artificial-Intelligence-Solution
--
This is the funniest meme for Elon's Tesla bot, the 2 images at the top are
hilarious:
https://www.reddit.com/r/singularity/comments/p8c9re/the_tesla_bot_is_for_the_weak/
--
Hehe
I searched a bit and found 2 links worth sharing.
A fully moving robot woman built for $50,000 in Hong Kong:
https://www.youtube.com/watch?v=Qd3QDTPgkOg
A Quora thread:
https://www.quora.com/How-much-would-it-cost-to-make-a-humanoid-robot-with-an-artificial-general-intelligence-that-could-ke
Also,
a good example of needing the context-shared evidence for related words is the
following: you see, for the first time, a new word used differently; it is not
"dog", it is "turn/close/screw-on", so the context related it immediately to
the word "turn" etc., then at the end you predi
https://roxxcloud.com/cerebras-systems-lays-the-foundation-for-massive-artificial-intelligence/
--
This is perhaps absurd/off/wrong/whatever, but you could just think how Elon
Musk's plan for the Tesla Bot could work like this: you ask it to, like I think
Elon said, "put that nut on that wheel using that wrench"; then it has primed
those features in its brain (nut, wheel, wrench), and so it h
Hehe. Here is my whole plan below in clear form for all readers. Any feedback
is helpful.
BTW Latest code, scores, text completions, and explanation of all the code can
be found here > https://encode.su/threads/3595-Star-Engine-AI-data-compressor
New Goal:
Translation ability will allow for rec
I didn't see enough goodies to want to read further, but what I got so far and
what I believe so far is this: if box A has $1,000 and box B has either nothing
or, in this case, $1,000,000: IF the genie predicted you will take box B, then
taking both boxes is the best solution, since no one can change the boxe
Decompressed successfully, exact match.
--
Keep in mind my program is combining thousands of contexts and does so by
combining percentages!
This means every relation in ITS BRAIN, like dog = cat, dog = man,
book = store, pump = lock, pump = push, push = throw, push =
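A hypothetical sketch of what "combining percentages" could look like (each context contributes one fraction, averaged into a relation strength; the names and numbers are made up):
from collections import defaultdict

evidence = defaultdict(list)      # (word_a, word_b) -> list of fractions

def add_evidence(a, b, pct):
    evidence[(a, b)].append(pct)

def strength(a, b):
    pcts = evidence[(a, b)]
    return sum(pcts) / len(pcts) if pcts else 0.0

add_evidence('dog', 'cat', 0.8)   # one context's evidence
add_evidence('dog', 'cat', 0.6)   # another context's
print(strength('dog', 'cat'))     # 0.7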
I'm jumping ahead and will finish the last job later.
https://towardsdatascience.com/word2vec-from-scratch-with-numpy-8786ddd49e72
So you know word2vec, right?
Well I plan to do this with much less code in a different way than he did
above. Let's see what happens! I'll implement it by itself to
new score: 19,021,769
--
No, we need to include program size. So we need a way to turn the scores, ex.
0.55059, into a compressed data size. Well, predicting 50% a guess for which
bit is next costs you half the storage, maybe. So my score above, 0.55059 for
10MB, seems to suggest I must store 0.45 of the 10MB..
I'm not rea
SO, I'm unsure how to convert that to compression. HOWEVER, I AM able to simply
score the below, just not able to compare it to a compression metric. The below
seems fine; this is how many letters I predicted correctly. Notice the score
rises the more data experience it has:
10MB 0.5505950784146294
1MB
Well, taking the scores above (ex. 0.5505...) (the part failed at, and "have to
store", hence 1.0 - 0.5505...) gives me a number like 45,000,000, which is how
many letters of the full 100MB text I'd have to store then ("my loss" or
"failure", to be clear), and it * 0.4152375 (Shannon and Weave
the 2 images at top are really hilarious, gotta appreciate that...
https://www.reddit.com/r/singularity/comments/p8c9re/the_tesla_bot_is_for_the_weak/
--
https://electrek.co/2021/08/20/tesla-dojo-supercomputer-worlds-new-most-powerful-ai-training-machine/
--
First I should focus on the 'without preprocessor' scores. It looks good, but
why not 80%... something seems off a fair amount.
--
nice music with it
--
PS, they say right above their scores that it did NOT train on the files; it
was pre-trained, hence cheated. That's what zero-shot means here:
https://openai.com/blog/better-language-models/
--
And why isn't it ~80% predicted correctly, like my lossless compression score?
--
Well, I predict 256 letters, so it is not a 50/50 chance for me; it starts real
bad, it is tough for it, but eventually it learns the distribution!
Not using the cmix preprocessor, I got:
10MB 0.5505950784146294
1MB 0.5310066708037171
100KB 0.47092919877903217
10KB 0.46603868078103533
1K
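For reference, chance level over 256 letters (why starting under 50% isn't surprising; 50/50 only applies to a binary choice):
print(1 / 256)   # ~0.0039: expected accuracy of uniform random guessing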
But why is it under 50%? I mean, you could guess randomly and get a 0.5
score.
--
About 10,000, 100,000, 1,000,000, 10,000,000 bytes:
0.44888181407357763
0.433259276532548
0.35976039165384843
0.43851910590941723
There must be an issue somewhere.
Well, I guess it was not sure at first, hence the large score; it doesn't
change rapidly the more samples it uses, see the upper 3. And
Working Python code:
# Python program to get the average of a list
def Average(lst):
    return sum(lst) / len(lst)

# Driver code
lst = [30, 20, 40, 10, 10]
average = Average(lst)
# Printing the average of the list
print("Average of the list =", round(average, 2))
Returns you 22.0
If I'm right, this is
All you'd need to do is make sure you really do have the next letter, really
take the predicted letter's probability, and really do correctly add up all
the predicted probs..??
--
Are you saying BPC = lossless compression? So OpenAI had GPT do lossless
compression, then, to get a BPC score? But I don't think they did lossless
compression...
Doesn't BPC need an arithmetic function? I mean, you have a set of predicted
probabilities for each letter (or word/token, which need t
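For what it's worth, here's BPC as it's usually defined (standard cross-entropy; not necessarily exactly what OpenAI reported): score the model's probability for each actual next character with -log2(p) and average. No arithmetic coder is needed to compute the number; the coder is only what would realize that size as an actual file.
import math

def bpc(probs_of_true_chars):
    # probability the model assigned to each character that actually occurred
    return sum(-math.log2(p) for p in probs_of_true_chars) / len(probs_of_true_chars)

print(bpc([0.5, 0.25, 0.9]))   # ~1.05 bits per character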
Bro, that thing can't even get through your doorway. Elon's looks slim enough
and stark enough, like it has a lot in store for it actually helping us. It's
your new watchdog.
--
I've never seen a logo landing page as expressive of a person as yours, you can
literally see Stefan Reich in the clip.
--
New score: 19,048,113; decompression not tested.
--
Elon Musk presents TESLA bot dojo:
https://www.reddit.com/r/singularity/comments/p7tbeq/tesla_announces_humanoid_robot_tesla_bot/
BD:
I initially thought they were going to swing on the hoops above them.
https://www.youtube.com/watch?v=tF4DML7FIWk
--
You know the problem though: BPC is probably not actually bits per character;
it's probably something like take the average sum of all cases and halve it,
then apply some log2 base to it and add an exp to normalize it across dataset
sizes that vary. Unfortunately...
I swear I've seen it one tim
I can't find, using Google, how BPC (Bits Per Character) evaluation for
prediction works. GPT has a score using this evaluation. It's possible I could
compare my score to GPT's if they use BPC in the same way.
I need an exact explanation of BPC, no math terminology or "stretch the [P] and
add
I don't have a problem generating text completions yet. And if it does predict
repetitions, then something I didn't code yet will be getting bored of the
repetition. Often you catch my current code doing "or the cat the cat the
cat", but then it breaks out, even after 100 times.
Lots of tweaking to do, but I got a score of 19,085,594 for enwik8.txt. This
puts me at a new frontier. Decompression not tested, but usually it works.
--
Ya, this thread is on Alan's chip-implant vaccine sabotage, and NKT replied to
it lol. It's an experimental drug; only take it if you have real big sh*t at
stake, like you are an actress that is raking in 500 million a year and could
lose your life. I didn't take it; I stay at home mostly, and I'
I'm not sure what you're trying to say NKT, I'm here and listening if you have
a way to make AGI or solve death.
--
Darn those little nutcases that remove precious text. Here it is again, back
from the dead!! And I shall save it to my computer, now!:
https://www.socialgrep.com/search?query=ilya%20sutskever
--
"One of my favorite interview questions is to ask candidates to explain
perplexity or the difference between cross entropy and BPC. While almost
everyone is familiar with these metrics, there is no consensus: the candidates'
answers differ wildly from each other, if they answer at all."
BTW, do you know why Alex and Byron have the only scores of 15MB but can't
generate completions? Byron says his is complex and he can't stop it from
repeating itself. Byron does use others' algorithms plus his own, so it's a
freaky thing to start with. Alex, how big is his code in lines? Many
que
I like the part where he just about copy-pastas what I say haha
--
Not black winter but summer shine:
https://www.reddit.com/r/singularity/comments/p5uh1r/openais_chief_scientist_ilya_sutskever_comments/
--
Ya, well, OpenAI requires you to go to their building to be part of the team,
and I'm not doing that. I'm going to stay safe in my home. California could
fall into the ocean, BTW; better I stay away from the darn place. Once they get
more intelligent they can hit me up online. They actually current
(Decompression tested successfully)
My code that scored 19.46MB for the 100MB enwik8.txt, compared to the 15MB
champion zone I must reach, had losslessly compressed the following:
100,000 bytes -> 24,970
1,000,000 bytes -> 224,646
Note only 10+MB matters since I'm using a pre-processor that is compressed
I don't think, at such a stage proposed, that nanobots will still believe in
god, Matt..... That would not fit in with having a world full of alive metal
cyborgs made of super-powered batteries. It's one way or the other.
--
Yes I have 1-4 really important, simple, solid things you can do to GPT
(preferably DALL-E since it is better than any GPT) to make it much more
alive/AGI. And no one talks about these things. I am also learning GPT now in
case my architecture fails.
I'll invite you to my discord E-Merge where
One of my nicknames is genai.
genai vs gazai :)
--
I.e., there are some big government men that have lots of bank power and man
power, and they guide the media to control all, and control all
technologies/future evolution...
But there are sure a lot of intelligent, seasoned AI programmers; they are key
people too. Control media, get money, gain
Need more data
--
Ah, so most of what it trained on is webpage data, simple problems.. and that's
why they show such on OpenAI's presentation, maybe; I think they did that. No
need to read the link... I did it 4 u:
https://www.lesswrong.com/posts/ib9bfyJiz4FLuHDQs/openai-codex-first-impressions
-
Perplexity might be a bit faster speed-wise, though, but not a killer; LC is
linear, ex. I think it may take a few mins or 10 mins for 1GB. So what, models
take hours to train on 1GB.
--