Isn't self-attention about helping to interpret the prompt? For example, in 'the dog, it 
was sent to them, the food was high quality', we can see that 'dog' and 'food' fit 
where 'it' and 'them' are. Another way to work out what 'it' and 'them' refer to is to 
look back over, say, the previous 30 rare words in the context - if they all match "dog", 
then 'it' and 'them' are likely in the "dog" category as well. Also, is self-attention 
also the mechanism that does pattern-completion at the end of the prompt? For example: 
cat cat cat cat cat > _?_ What is the next word here? Cat! So it adjusts the output 
predictions to favor 'cat'. Does self-attention in GPT do all of this? If not, which parts does it contribute to?
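
For concreteness, here is roughly the computation I'm picturing - a minimal single-head, causally-masked self-attention sketch in NumPy. This is just a toy illustration under my own assumptions (identity projection matrices, random toy embeddings), not GPT's actual implementation: each position forms a query, compares it against the keys of itself and all earlier positions, and mixes their value vectors by the softmax weights. In the 'cat cat cat cat cat' case the last position ends up attending mostly to the repeated 'cat' tokens, so the representation it passes on is dominated by 'cat', which is what would nudge the prediction toward 'cat'.

    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def self_attention(X, Wq, Wk, Wv):
        """X: (seq_len, d_model) token vectors; Wq/Wk/Wv: learned projections."""
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
        # Causal mask: a position may only attend to itself and earlier positions.
        mask = np.triu(np.ones_like(scores), k=1).astype(bool)
        scores[mask] = -np.inf
        weights = softmax(scores, axis=-1)   # attention weights, each row sums to 1
        return weights @ V, weights          # weighted mix of value vectors

    # Toy demo (hypothetical embeddings): five identical "cat" vectors, then a query position.
    rng = np.random.default_rng(0)
    d = 8
    cat = rng.normal(size=d)
    X = np.vstack([cat] * 5 + [rng.normal(size=d)])  # last row = current position
    Wq = Wk = Wv = np.eye(d)                         # identity projections, for illustration only
    out, w = self_attention(X, Wq, Wk, Wv)
    print(np.round(w[-1], 2))  # how strongly the last position attends to each earlier "cat"
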