On Wed, Jun 29, 2022 at 11:11 PM Boris Kazachenko <cogno...@gmail.com>
wrote:

> On Wednesday, June 29, 2022, at 10:29 AM, Rob Freeman wrote:
>
> You would start with the relational principle those dot products learn, by
> which I mean grouping things according to shared predictions, make it
> instead a foundational principle, and then just generate groupings with
> them.
>
>
> Isn't that what backprop does anyway?
>

They may use a pre-learned relational principle in a sense. Pre-training,
you say?

But they then revert to back-prop again?

That's fine. I guess back-prop would go on to learn hierarchy, just as it
learned the grouped predictions at any "pre-training" level.

But that is not quite the application of it as a foundational principle
that I was talking about. If you actually, actively, substitute things
which share predictions, that's a little more foundational than just
using initial groupings as a basis for more back-prop.

You could do either. I'm suggesting that if you end up getting a
different grammar for each sentence, the second way is more efficient:
just actively substitute things which share predictions, without doing
more back-prop on initial groupings. Active substitution generates
groupings for each sentence as you go. Back-prop, by contrast, tries to
optimize over the whole data-set. If there is no small set of global
optima, if there are instead an infinite number of chaotically
expanding global optimization attractors, that becomes horribly
inefficient.
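
To make that concrete, here is a minimal sketch in Python of what I
mean by generating groupings as you go. Everything in it is a toy
stand-in (the corpus, the follower-overlap test, the 0.5 threshold),
not any real system:

from collections import defaultdict

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat slept on a rug".split(),
]

# Record what each token predicts: here, simply the token that follows it.
predictions = defaultdict(set)
for sent in corpus:
    for left, right in zip(sent, sent[1:]):
        predictions[left].add(right)

def share_predictions(a, b, threshold=0.5):
    # Two tokens "share predictions" if their follower sets overlap enough.
    pa, pb = predictions[a], predictions[b]
    if not pa or not pb:
        return False
    return len(pa & pb) / min(len(pa), len(pb)) >= threshold

def groupings_for(sentence):
    # Generate groupings for this one sentence as you go: each token is
    # grouped with whatever it could substitute for, i.e. whatever shares
    # its predictions. No global optimization pass over the corpus.
    vocab = {w for sent in corpus for w in sent}
    return {w: sorted(v for v in vocab if v != w and share_predictions(w, v))
            for w in sentence}

print(groupings_for("the cat sat on the mat".split()))
# {'the': ['a'], 'cat': ['dog'], 'sat': ['slept'], 'on': [], 'mat': []}

The point of the sketch is that the grouping for a sentence falls out
directly from shared predictions, sentence by sentence. There is no
loss being minimized over the whole data-set, so nothing forces one
global grammar.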

It's a hypothesis anyway. I don't know if anyone has looked at any
hierarchy generated by a transformer, as in the paper I linked, and
checked whether it differs from sentence to sentence.
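
If anyone wanted to check, the test itself is simple. A toy sketch in
Python (the parses below are hand-written stand-ins; real spans would
have to be extracted from the transformer, as in that paper): take the
same word string in two sentences and compare its induced bracketing.

# Stand-in induced parses: (tokens, set of (start, end) constituent spans).
induced = [
    ("the old man the boats".split(), {(0, 2), (2, 5)}),
    ("the old man slept".split(),     {(0, 3)}),
]

def bracketings_of(tokens, brackets, phrase):
    # For each occurrence of the phrase, return its internal bracketing,
    # with spans shifted to be relative to the start of the occurrence.
    n = len(phrase)
    hits = []
    for i in range(len(tokens) - n + 1):
        if tokens[i:i + n] == phrase:
            hits.append({(s - i, e - i) for (s, e) in brackets
                         if i <= s and e <= i + n})
    return hits

phrase = "the old man".split()
shapes = [h for toks, br in induced for h in bracketings_of(toks, br, phrase)]
print(shapes)  # [{(0, 2)}, {(0, 3)}] -- same words, two hierarchies

In this garden-path example "the old man" gets one shape where "man"
acts as the verb and another where it is a plain noun phrase. Finding
that kind of divergence systematically, in trees a transformer
induces, is the check I mean.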

Personally, the fact that these things just churn out more and more
parameters, billions of them, is a hint to me.

But I don't know if anyone has checked.
