There are implicit structures; however, they are too much "embedded" in the whole, and the low-level one is not well "grounded" in the actual other-modality input which it represents, in typical LLMs.
Human language, for human agents, is multimodal, multirange,
mul
Costi: "Todor you are appointed to the Ministry of Removal for merits with
spotting all those MIT agents crawling between Plovdiv and Beijing."
What a BS insult. Do you mean the agents crawling and talking in your
head? :) Who are you, dude? What have you done or created (first in anything or
Costi, I think your reasoning is right regarding the importance of multi-modal
input for generalisation. As for ANNs (as popularly known), they were explicitly
mentioned only in a few slides in the lecture about Narrow AI and why it failed,
with some hope given to Schmidhuber's LSTM. Slides 34-36:
htt
Stefan - a course that was not about people with ADHD and trolling guys. :)
The University's short intro was:
*Annotation:* *The goal of the course is an introduction to the theories of
mind. The course is intended for students who want to work in the future in
the avant-garde field of Universal Artificial Intelligence
Costi, it was precisely about "unsupervised learning", as you could see from the
course program or the intro.
--
Artificial General Intelligence List: AGI
Permalink:
https://agi.topicbox.com/groups/agi/T5417cc95d981211e-Mea7bc10f84541064ffa8f29a
Hello, I guess you have visited MIT's AGI course from 2018 and follow the
AI podcast by Lex Fridman. I suspect that many believe it was the first
university course on AGI. However, the first interdisciplinary AGI course was
presented 8 years earlier by me (it was announced here); Ben Go