Hmmm...

About this "OpenAI keeping their language model secret" thing...

I mean -- clearly, keeping their language model secret is a pure PR
stunt... Their algorithm is described in an online paper... and their
model was trained on Reddit text... so anyone else with a bunch of $$
(for machine-time and data-preprocessing hacking) can download Reddit
(complete Reddit archives are available as a torrent) and train a
language model similar to or better than OpenAI's...
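
Just to make "train a language model" concrete: the training objective
is nothing more than next-token prediction under a cross-entropy loss.
Below is a rough Python/PyTorch sketch of a single training step... the
vocab size, the random batch of token ids, and the bare embedding +
linear "model" are all placeholders of mine for illustration; a real
replication would plug in a deep transformer as described in their
paper and feed it the actual tokenized corpus.

import torch
import torch.nn as nn

# Placeholder sizes and "model" -- a real run would use a deep
# transformer, not a single embedding + linear layer.
vocab_size, d_model = 1000, 64
embed = nn.Embedding(vocab_size, d_model)
lm_head = nn.Linear(d_model, vocab_size)
optimizer = torch.optim.Adam(
    list(embed.parameters()) + list(lm_head.parameters()), lr=1e-3)

# Fake batch of token ids standing in for tokenized Reddit text.
tokens = torch.randint(0, vocab_size, (8, 129))
inputs, targets = tokens[:, :-1], tokens[:, 1:]   # predict each next token

optimizer.zero_grad()
logits = lm_head(embed(inputs))                   # (batch, seq, vocab)
loss = nn.functional.cross_entropy(
    logits.reshape(-1, vocab_size), targets.reshape(-1))
loss.backward()
optimizer.step()

Scale that loop up over a huge corpus and a huge network and that's
basically the whole recipe; the money goes into the data plumbing and
the GPU-weeks, not into any secret sauce.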

That said, their language model is a moderate improvement on the BERT
model released by Google last year.   This is good AI work.  There is
no understanding of semantics and no grounding of symbols in
experience/world here, but still, it's pretty f**king cool to see what
an awesome job of text generation can be done by these pure
surface-level-pattern-recognition methods....
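
And "surface-level pattern recognition" really is the whole story at
generation time: the model just predicts a distribution over the next
token given the tokens so far, samples one, appends it, and repeats.
Stripped of the transformer, that loop looks like the toy word-level
bigram sampler below (plain Python; the one-line corpus is a stand-in
of mine -- substitute gigabytes of text and a vastly better predictor
and you get GPT-2-style output).

import random
from collections import defaultdict

# Toy corpus -- placeholder; imagine a large dump of text here.
corpus = ("the model predicts the next word and "
          "the next word follows the last word").split()

# "Training": record which words follow which (a bigram table).
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

# "Generation": predict, sample, append, repeat -- the same loop GPT-2
# runs, just with a transformer instead of a lookup table.
word = random.choice(corpus)
output = [word]
for _ in range(20):
    candidates = following.get(word)
    if not candidates:
        break
    word = random.choice(candidates)
    output.append(word)

print(" ".join(output))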

Honestly, a lot of folks in the deep-NN/NLP space (including our own
SingularityNET St. Petersburg team) have been talking about applying
BERT-ish attention networks (with more comprehensive network
architectures) in similar ways... but there are always so many
different things to work on, and OpenAI should be congratulated for
making these particular architecture tweaks and demonstrating them
first... though not for the PR stunt of keeping their model secret...

Although perhaps they should be congratulated for revealing so clearly
the limitations of the "openness" in their name, "OpenAI."   I mean,
we all know there are some cases where keeping something secret may be
the most ethical choice... but the fact that they're willing to take
this step simply for a short-term, one-news-cycle PR boost indicates
that openness may not be such an important value to them after all...

-- 
Ben Goertzel, PhD
http://goertzel.org

"Listen: This world is the lunatic's sphere,  /  Don't always agree
it's real.  /  Even with my feet upon it / And the postman knowing my
door / My address is somewhere else." -- Hafiz
