On 9/19/21, immortal.discover...@gmail.com
<immortal.discover...@gmail.com> wrote:
> So we have a context of 9 tic-tac-toe squares with 2 of your Xs in a row and
> his Os all over the place; you predict something probable and rewarding: the
> 3rd X to complete the row. GPT would naturally learn this, and Blender would
> basically learn the reward part too.
>
> As for a FORK, this is like having two favorite meals: "Give me some
> fries"... or I could have said "Give me some cake". I predict them at about
> 50% each, based on how rewarding and popular they appear in the data. In
> that case, 50% of the time I choose fries, and the next time cake, because
> fries has now fired its neural energy and been inhibited, changing the
> distribution.
>
> It's OK to pursue logic, but I can't help but point out that this sounds
> exactly like my AI and Transformer AI. In fact, both are the same; only the
> approach to solving the efficiency problem differs. In this case, I don't
> see how yours would be efficient; it seems like GOFAI, no? Isn't it GOFAI?
> This is not something that scales like GPT. AFAIK your logic-based approach
> focuses on a few rules and disregards how many resources it needs (as if
> neither compute nor memory mattered).
>
> *_How can your approach, predicting B for some context A, be efficient like
> GPT? There is a lot to leverage in a given context, and GPT leverages it.
> Or, if you intend to use Transformer+logic, why? The Transformer already
> does all the methods you mentioned for leveraging context._*
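
For concreteness, here is a toy sketch of the "fork" described above: two
options weighted by how rewarding/popular they are, and whichever option fires
is inhibited, shifting the next choice toward the other. This is my own
hypothetical illustration (the option names and the inhibition factor are
assumptions), not anyone's actual implementation:

```python
import random

# Two options with reward/popularity-based weights, roughly 50/50 to start.
weights = {"fries": 1.0, "cake": 1.0}
INHIBITION = 0.3  # assumed damping factor applied after an option fires

def choose(weights):
    options = list(weights)
    pick = random.choices(options, weights=[weights[o] for o in options], k=1)[0]
    weights[pick] *= INHIBITION  # inhibit the option that just fired
    return pick

print([choose(weights) for _ in range(4)])  # e.g. ['fries', 'cake', 'fries', 'cake']
```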

The question in bold above is, in my opinion, the **most important question**
concerning the future of AGI.

GPT-3 is just 1 step away from AGI.

Recently, the Beijing Academy of Artificial Intelligence built a language
model (LM) similar to GPT-3, called Wu Dao 2.0 (悟道), with 10x GPT-3's number
of weights (1.75 trillion parameters).

BERT and GPT-3 are basically Turing-universal computing modules.

I think for GPT-3 to become AGI, it may need:
1) the ability to do multi-step reasoning, e.g. via reinforcement learning
(see the sketch after this list);
2) the ability to make assumptions, which may be tricky to do with neural
networks.
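
As a rough illustration of point 1, multi-step reasoning could be framed as an
RL-style loop around the LM: the model proposes one intermediate step at a
time, each step is fed back into the context, and a scalar reward for the
finished chain is what a policy-gradient update would reinforce. This is a
minimal hypothetical sketch with assumed stand-in functions (`propose_step`,
`judge`), not GPT-3's or anyone's actual API:

```python
import random
from typing import Callable, List, Tuple

def reason(prompt: str,
           propose_step: Callable[[str], str],
           judge: Callable[[str], float],
           max_steps: int = 5) -> Tuple[List[str], float]:
    """Chain several LM calls into one multi-step derivation, then score it."""
    context = prompt
    steps: List[str] = []
    for _ in range(max_steps):
        step = propose_step(context)   # one intermediate reasoning step
        steps.append(step)
        context += "\n" + step         # feed the step back in as new context
        if step.startswith("ANSWER:"):
            break
    reward = judge(context)            # scalar reward for the whole chain;
    return steps, reward               # an RL update would reinforce good chains

# Toy usage with dummy stand-ins for the LM and the reward function:
dummy_lm = lambda ctx: random.choice(["Intermediate step.", "ANSWER: 42"])
dummy_judge = lambda ctx: 1.0 if "ANSWER: 42" in ctx else 0.0
print(reason("What is 6 * 7?", dummy_lm, dummy_judge))
```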

[ more on this... this is just a partial reply ]

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T74958068c4e0a30f-M8bd85b6e7259c55688d267a3
Delivery options: https://agi.topicbox.com/groups/agi/subscription
