The OpenCog atomspace was the data structure meant to hold the knowledge
base, but it was never filled with knowledge. We have no idea how it would
have performed when filled with enough data for AGI, how we would go about
filling it, how much effort that would take, or even how big it would have
to be.
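
For anyone who never looked inside it: the atomspace is essentially a typed
hypergraph, where nodes and links are both "atoms", links can connect other
atoms (including other links), and every atom carries a fuzzy truth value.
Here is a minimal Python sketch of the idea; the class and type names are
chosen for illustration and are not the real OpenCog API:

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass(frozen=True)
    class TruthValue:
        strength: float = 1.0    # how true, 0..1
        confidence: float = 0.0  # how much evidence backs it, 0..1

    @dataclass(frozen=True)
    class Atom:
        type: str                      # e.g. "ConceptNode", "InheritanceLink"
        name: str = ""                 # nodes are named; links are not
        out: Tuple["Atom", ...] = ()   # a link's outgoing set of atoms

    class AtomSpace:
        def __init__(self):
            self._tv = {}  # atom -> truth value; identical atoms stored once

        def add(self, atom: Atom, tv: TruthValue = TruthValue()) -> Atom:
            self._tv[atom] = tv
            return atom

    # "A cat is an animal", asserted with 90% strength and low confidence:
    space = AtomSpace()
    cat = space.add(Atom("ConceptNode", "cat"))
    animal = space.add(Atom("ConceptNode", "animal"))
    space.add(Atom("InheritanceLink", out=(cat, animal)), TruthValue(0.9, 0.3))

The open question above is what happens to a store like this at billions of
atoms, not whether it can represent "a cat is an animal".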

That was Cyc's downfall. Lenat had no idea how many rules it takes to
encode common sense, or even the natural language understanding that he
attempted to bolt on as an afterthought. He had a group encode millions of
rules in CycL, which proved unworkable. LLMs now give us some idea that the
true number is in the billions.

The problem is deceptive because you seem to get halfway there with just a
few hundred rules, just as you can cover half of a language model with a
few-hundred-word dictionary and a few hundred grammar rules.
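
The dictionary analogy follows from Zipf's law: if word frequency is
roughly proportional to 1/rank, then the top n words of a V-word vocabulary
cover about H(n)/H(V) of running text, where H is the harmonic number. A
quick sketch, assuming an idealized Zipf distribution over a million-word
vocabulary:

    def harmonic(n: int) -> float:
        return sum(1.0 / r for r in range(1, n + 1))

    VOCAB = 1_000_000  # assumed vocabulary size, for illustration
    total = harmonic(VOCAB)
    for n in (100, 300, 1_000, 10_000, 100_000, VOCAB):
        print(f"top {n:>9,} words cover {harmonic(n) / total:5.1%} of tokens")

Coverage grows only logarithmically, so the first few hundred words buy you
nearly half, and each further "half" costs orders of magnitude more. That
is how a rule count can look tractable at the start and turn out to be in
the billions.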

On Fri, May 3, 2024, 6:01 PM Mike Archbold <jazzbo...@gmail.com> wrote:

> I thought the "atomspace" was the ~knowledge base?
>
> On Fri, May 3, 2024 at 2:54 PM Matt Mahoney <mattmahone...@gmail.com>
> wrote:
>
>> It could be that everyone still on this list has a different idea on how
>> to solve AGI, making any kind of team effort impossible. I recall a few
>> years back that Ben was hiring developers in Ethiopia.
>>
>> I don't know much about Hyperon. I really haven't seen much of anything
>> since the 2009 OpenCog puppy demo video. At the time it was the culmination
>> of work that started with Novamente in 1998. Back when I was still
>> following, Ben was publishing a steady stream of new ideas and designs,
>> which typically has the effect of resetting any progress on any large
>> software project back to the beginning. OpenCog was a hodgepodge of a
>> hand-coded structured natural language parser, a toy neural vision system, and a
>> hybrid fuzzy logic knowledge representation data structure that was
>> supposed to integrate it all together but never did after years of effort.
>> There was never any knowledge base or language learning algorithm.
>>
>> Maybe Hyperon will go better. But I suspect that LLMs on GPU clusters
>> will make it irrelevant.
>>
>> On Wed, May 1, 2024, 2:59 AM Alan Grimes via AGI <agi@agi.topicbox.com>
>> wrote:
>>
>>> .... but not from this list. =|
>>> 
>>> Goertzel explains his need for library programmers for his latest
>>> brainfart. I think his concept has some serious flaws that will be
>>> extremely difficult to patch without already having AGI... Yes, they are
>>> theoretically patchable, but will said patches yield net
>>> benefits?.........
>>> 
>>> But, once again, it must be restated with the greatest emphasis that he
>>> did not consider the people on this list worth discussing these job
>>> opportunities with. It should also be noted that he has demonstrated a
>>> strong preference for third world slave labor over professional
>>> programmers who live in his own neighborhood.
>>> 
>>> https://www.youtube.com/watch?v=CPhiupj9jyQ
>>> 
>>> --
>>> You can't out-crazy a Democrat.
>>> #EggCrisis  #BlackWinter
>>> White is the new Kulak.
>>> Powers are not rights.
>>> 
