When an AGI, of its own volition, applies a behavioural rule set to a
situation - a rule set it developed from experience alone - and then
recognizes its own mistake and is able to make corrections, it will have
demonstrated a notion of "understanding".

In practice, this would be one step short of problem solving. As such, it
would be restricted to problem identification and engagement, in the
sense of demonstrating autonomous situational awareness.

IMO, problem solving pertains mostly to competency and performance
(training based), not to spontaneous awareness.
On 9 Jul 2021 22:03, "Mike Archbold" <jazzbo...@gmail.com> wrote:

> You've got an opinion. We all do!
> 
> I'm doing a survey of opinions about "understanding" for the meetup
> --> https://www.meetup.com/Northwest-Artificial-General-Intelligence-Meetup-Group/
> 
> 2 events are envisioned this summer:
> 
> 1) Survey -- discuss tribal opinions in the AGI community as well as
> published works about what "understanding" means for a machine,
> 
> 2) Critiques and Conclusions -- compare, generalize, hopefully reach
> some conclusions.
> 
> So what is your definition of "understanding"? I have collected about
> a dozen so far and will publish along with the events.
> 
> We are also having an in person event this month for those around
> western Washington:
> 
> https://www.meetup.com/Northwest-Artificial-General-Intelligence-Meetup-Group/events/279258207/
> 
> Thanks Mike Archbold

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-Mb6b1b12fe8558bf1f8ca6896
Delivery options: https://agi.topicbox.com/groups/agi/subscription
