Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-17 Thread immortal . discoveries
;p Yeah, but in the end, poem or not, it is just predicting the past from matching:

_why_ did the turtle cross the road?

the turtle was hungry and saw some tacos steaming hot, so it crossed the road
--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M3defcec99bcdc6c7cfe3d60c
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-16 Thread Mike Archbold
On 7/16/21, doddy  wrote:
> to understand is to explain why something is the way it is.
> is to be able to explain why you did something.
> is to be able to explain why others did things.
>

Thanks. I like this one -- it's like a poem.


>
> On Thu, Jul 15, 2021 at 11:28 AM Brett N Martensen 
> wrote:
>
>> Gadi Singer, VP of Intel Labs, who leads their cognitive computing
>> research has a definition of understanding at 16:00 minutes into his
>> recent
>> video presentation. https://www.youtube.com/watch?v=aqfizAySe0E&t=1538s
>> The video is worth listening to all the way through.

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M422b903ab8c428eb7479a414
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-16 Thread Mike Archbold
On 7/15/21, Brett N Martensen  wrote:
> Gadi Singer, VP of Intel Labs, who leads their cognitive computing research
> has a definition of understanding at 16:00 minutes into his recent video
> presentation. https://www.youtube.com/watch?v=aqfizAySe0E&t=1538s
> The video is worth listening to all the way through.

Thanks. I will use this:

Understanding is about the ability to create a worldview expressed
with rich knowledge representation ... to be able to model, inside the
machine, inside the person, the complex world that is outside. The
other capability of understanding is to acquire and interpret new
information, and to enhance and continuously update this worldview,
because the world around us is very dynamic. The interests are
growing. So part of understanding is the ability to continuously
update this internal view, inside the machine or in us, of what is
happening outside. And the third is the ability to effectively reason
and explain both our existing knowledge and, in real time, new
information coming from the outside.

Gadi Singer
source: https://www.youtube.com/watch?v=aqfizAySe0E
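
Read as a rough specification, those three capabilities could be sketched like
this; the class name and methods below are invented purely for illustration and
are not taken from Singer or Intel:

    class WorldView:
        """1. A rich internal model of the complex world outside."""
        def __init__(self):
            self.facts = {}

        def update(self, observation: dict):
            # 2. Acquire and interpret new information; keep the model current.
            self.facts.update(observation)

        def explain(self, query: str) -> str:
            # 3. Reason over and explain existing knowledge plus new input.
            value = self.facts.get(query)
            return f"{query} is {value}" if value is not None else f"no model of {query}"

    wv = WorldView()
    wv.update({"weather": "raining"})
    print(wv.explain("weather"))   # explains what it currently models
    print(wv.explain("traffic"))   # a gap in the worldview is made explicit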



--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M3b23abf9e0e32967993aa9ab
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-16 Thread doddy
to understand is to explain why something is the way it is.
is to be able to explain why you did something.
is to be able to explain why others did things.


On Thu, Jul 15, 2021 at 11:28 AM Brett N Martensen 
wrote:

> Gadi Singer, VP of Intel Labs, who leads their cognitive computing
> research has a definition of understanding at 16:00 minutes into his recent
> video presentation. https://www.youtube.com/watch?v=aqfizAySe0E&t=1538s
> The video is worth listening to all the way through.

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M1c838df3aecbfb7bbe6849df
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-15 Thread Brett N Martensen
Gadi Singer, VP of Intel Labs, who leads their cognitive computing research,
has a definition of understanding at 16:00 minutes into his recent video
presentation: https://www.youtube.com/watch?v=aqfizAySe0E&t=1538s
The video is worth listening to all the way through.
--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M99504e2f8f6e9a989d27e862
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-12 Thread Quan Tesla
When an AGI, of its own volition and using a behavioural rule set for a
situation - a rule set it developed from experience alone - recognizes its
own mistake and is able to make corrections, it will have demonstrated a
notion of "understanding".

In practice, this would be one step short of problem solving. As such, it
would rather be restricted to problem identification and engagement, in the
sense of demonstrating autonomous situational awareness.

IMO, problem solving pertains mostly to competency and performance
(training-based) and not to spontaneous awareness.
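
A made-up, minimal sketch of that criterion; the rule format and the
observations are invented, and the point is only the recognize-and-correct
loop:

    # Experience so far: every swan the agent has seen was white,
    # so it induced the rule "swan -> white".
    rules = {"swan": "white"}

    def observe(entity, colour):
        predicted = rules.get(entity)
        if predicted is not None and predicted != colour:
            # The agent recognizes its own mistake and corrects the rule itself.
            print(f"mistake: predicted a {predicted} {entity}, saw {colour}; updating rule")
            rules[entity] = colour
        else:
            print(f"consistent: {entity} is {colour}")

    observe("swan", "white")   # matches the learned rule
    observe("swan", "black")   # contradiction -> self-correction
    observe("swan", "black")   # the rule now matches experience
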
On 9 Jul 2021 22:03, "Mike Archbold"  wrote:

> You've got an opinion. We all do!
> 
> I'm doing a survey of opinions about "understanding" for the meetup
> --> https://www.meetup.com/Northwest-Artificial-General-
> Intelligence-Meetup-Group/
> 
> 2 events are envisioned this summer:
> 
> 1)  Survey -- discuss tribal opinions in the AGI community  as well as
> published works about what "understanding" means for a machine,
> 
> 2) Critiques and Conclusions -- compare, generalize, hopefully reach
> some conclusions.
> 
> So what is your definition of "understanding"? I have collected about
> a dozen so far and will publish along with the events.
> 
> We are also having an in person event this month for those around
> western Washington:
> 
> https://www.meetup.com/Northwest-Artificial-General-
> Intelligence-Meetup-Group/events/279258207/
> 
> Thanks Mike Archbold

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-Mb6b1b12fe8558bf1f8ca6896
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-12 Thread Daniel Jue
Mike, I'll save you the trouble; I just typed it up this morning for
my zettelkasten. There may be some typos, and I don't include the
references.
Cheers

Feeling, Thinking, Knowing

By [[louis-arnaud-reid]]

In an article on [[carl-jung]], James Hillman writes that at the end of the
century " there were no clear distinctions among the various components of
the mind which had been grouped, or discarded, in that bag called 'the
affective faculty'. From the time of the Enlightenment in Germany, the soul
was divided into three parts: thinking, willing and feeling.
Fundamentally, this third region of the psyche, like Plato's third class of
men, was inferior".  Furthermore, this bag of feelings was always in
opposition to thinking, or as Moses Mendelssohn said: "We no longer feel
as soon as we think".  Hillman adds that the opposition between thinking
and feeling is still found in the scientistic psychology of head without
heart, and the romantic psychology of heart without head.

'We no longer feel as soon as we think'.  This might be supplemented by
other over-simple generalizations: 'As soon as we think we make
statements':  'To claim to know anything entails (inter alia) being able to
state it clearly': 'Knowledge is expressed in the form "that-p"; what
cannot be so expressed is not knowledge'.

These are, doubtless, oversimplified--though the paradigm 'that-p' is
normally taken for granted in epistemological discussions: moreover, this
is an assumption which has dominated western thought.  On the other hand,
philosophers and plain men constantly give cognitive nuances to
feeling-words and ideas, suggesting 'feelings' and 'intuitions' which have
something of a cognitive-claiming character.  We at all times speak of
cognitive 'feelings' about moral and humanitarian matters, about people,
works of art, political decisions.  Ryle listed different uses of the word
'feeling', some of which are cognitive-claiming.  Jung used 'feeling' in
various cognitive ways, but very pragmatically and often inconsistently.
'Feeling' is used cognitively by philosophers--existentialists,
phenomenologists ... by James, Bradley, Alexander, Whitehead, MacMurray,
Langer ... by Gestalt psychologists -- in a large variety of ways.  In the
twenties (and more recently), I argued for feelings as cognitive.  Very
much the same line was taken in the fifties about 'emotion' -- often too
easily run together with feeling, as if the two were identical.  There was
a sort of rediscovery of the cognitive element intrinsic to emotion -- by
Bedford, J.R. Jones, Kenny, Peters, Mace ... 'Emotions', it was argued,
cannot be understood merely in terms of internal ongoings, but are integral
with cognition and behavior relevant to the situations in which they
arise.  Peters' term 'appraisal' (cognitive appraisal) of the situation
sums up the best of it.  (It is substantially the same as what I was trying
to say, but of feeling, in the twenties.)

Although 'feeling' has been so variously used, and there is a large
literature of emotion, the dominance of the 'that-p' concept of knowledge
has been so powerful and sustained that it has tended to force the
reduction of other sorts of knowing to its own pattern.  This is
particularly questionable in claims to know values -- moral, personal,
aesthetic -- where, though knowledge-that may have its important place, it
is not everything.  Feeling there at least seems to have an essential
cognitive part to play.  I can do no more than suggest now that neglect of
that is not only detrimental in philosophy itself, but has serious moral,
social and educational consequences as well.  I believe the time is more
than ripe for a fresh look at the relations between feeling and knowing.

2

The treatment of feeling by many  'straight' psychologists earlier in the
century was thin. (And in some later textbooks, it is perhaps not
surprising that writers self-blinded by wholly behavioristic approaches
have not even mentioned 'feeling' in their indexes!)  Earlier, feeling was
mainly regarded as a hedonic tone-- sometimes as the tone of emotion --
with a range from the positively pleasurable to the (negatively)
unpleasurable.  Whether there could be 'neutral' feeling in between, was a
matter of controversy.  Feeling was sharply distinguished from the
character of the concrete mental states of which it was the character or
quality.  And it was said to be intransitive, and definitely
non-cognitive.  Feeling (some held) does not even 'know' facts of inner
occurrences: feeling is 'the way you feel'.

I remarked on the habit of running the words 'feeling' and 'emotion'
together, as though distinctions don't matter here.  They do.  But since
this paper is on feeling, not emotion, I cannot say anything adequate now
about emotion and its relation to and distinction from feeling.  I can only
affirm, without supporting argument, that actual emotions are largely
episodic, whilst feeling, I shall suggest, is underlyingly present 

Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-11 Thread immortal . discoveries
(Well, RL/reflexes are in a sense learnt patterns, really. And yeah, I consider
RL sensory; motor is just motors, totally controlled by the senses, with no RL
for them to work with.)
--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M3cbffa04e5ed0fb51477c63c
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-11 Thread immortal . discoveries
Predicting truth is one thing, like GPT-2. Predicting hope is another thing, RL.
--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M2e85d130176ba31d16665c73
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-11 Thread immortal . discoveries
Text/image/sound prediction is guided by reward, so if you make it favor the
word 'cheese', it will tend to predict 'cheese' a lot more. It's just a simple,
permanent adjustment to the weights. It makes you say something for no reason,
just instinct - to solve global warming using fries, sleep, and women (all the
things you love), because they are 100% 'awesome' triggers. So next time you
think you see a carrot, you'll partly think it is a French fry, because that
neuron is already pre-active all the time (well, in cycles), so you expect
fries, not carrots!
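
A toy sketch of that "simple adjustment to the weights" idea, assuming nothing
about any real model - just a softmax over made-up logits with a permanent bias
added to the favoured word:

    import math

    # Toy next-word scores (logits) for a tiny invented vocabulary.
    logits = {"carrot": 2.0, "cheese": 1.0, "fries": 0.5, "sleep": 0.2}

    # A one-time, permanent reward adjustment that favours 'cheese'.
    reward_bias = {"cheese": 1.5}

    def softmax(scores):
        exps = {w: math.exp(s) for w, s in scores.items()}
        total = sum(exps.values())
        return {w: e / total for w, e in exps.items()}

    before = softmax(logits)
    after = softmax({w: s + reward_bias.get(w, 0.0) for w, s in logits.items()})

    print("before:", {w: round(p, 2) for w, p in before.items()})
    print("after: ", {w: round(p, 2) for w, p in after.items()})
    # 'cheese' now takes a larger share of the probability mass,
    # so the model tends to predict 'cheese' a lot more.
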
--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M507b42dfc675a1a9779194e7
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-11 Thread Mike Archbold
On 7/9/21, Daniel Jue  wrote:

>
> In a 1977 paper by Louis Arnaud Reid in the Proceedings of the Aristotelian
> Society #77 "Thinking, Feeling, Knowing" doi 10.1093/aristotelian/77.1.165
> , the author gives a (IMO) great argument about how "feelings" (not
> necessarily those marked by noticeable hedonic tones) are an inseparable
> part of our "knowing" something, both before and after we proclaim to know
> that something.
>
>


I will have to get this paper. Whenever I think too much about
understanding, I come up against the problem that understanding is (at
least in part) a *feeling*. Thanks!

>
>
> On Fri, Jul 9, 2021 at 2:03 PM Mike Archbold  wrote:
>
>> You've got an opinion. We all do!
>>
>> I'm doing a survey of opinions about "understanding" for the meetup
>> -->
>> https://www.meetup.com/Northwest-Artificial-General-Intelligence-Meetup-Group/
>>
>> 2 events are envisioned this summer:
>>
>> 1)  Survey -- discuss tribal opinions in the AGI community  as well as
>> published works about what "understanding" means for a machine,
>>
>> 2) Critiques and Conclusions -- compare, generalize, hopefully reach
>> some conclusions.
>>
>> So what is your definition of "understanding"? I have collected about
>> a dozen so far and will publish along with the events.
>>
>> We are also having an in person event this month for those around
>> western Washington:
>>
>> https://www.meetup.com/Northwest-Artificial-General-Intelligence-Meetup-Group/events/279258207/
>>
>> Thanks Mike Archbold
> 
> 
> --
> Daniel Jue
> Cognami LLC
> 240-515-7802
> www.cognami.ai

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-Me1b9dbbfa57091e9cba7804b
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-10 Thread cognomad
Understanding is recognition. Which is a lossless component of compression from 
comparing input to template.
That's all there is to it, and it applies to countless other words that mean 
the same thing. 
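
One possible toy reading of "a lossless component of compression from comparing
input to template"; the strings and the encoding scheme here are invented for
illustration:

    def encode(inp, template):
        # Keep only the positions where the input differs from the template;
        # everything that matches is "recognized" and referred back to it.
        diffs = [(i, ch) for i, ch in enumerate(inp)
                 if i >= len(template) or template[i] != ch]
        return len(inp), diffs

    def decode(encoded, template):
        length, diffs = encoded
        out = list(template[:length].ljust(length))
        for i, ch in diffs:
            out[i] = ch
        return "".join(out)

    template = "the turtle crossed the road"
    inp      = "the turtle crossed the lawn"
    code = encode(inp, template)
    assert decode(code, template) == inp   # lossless
    print(f"stored {len(code[1])} differences instead of {len(inp)} characters")
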
--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-Mb2458ed6f05f306b029ebe73
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-10 Thread Mike Archbold
Thanks everybody. I will be publishing my survey w/ the event. Peter
-- I remembered your work on understanding and the link Thank you.

On 7/10/21, Roman Kofman  wrote:
> Understanding is a skill of the machine. Machine X understands Y if it
> can determine the completeness and consistency of Y according to its
> internal application-domain model. The application area of an AGI is the
> real world. In simple words, understanding is the ability to determine
> whether something can be.
>
> How can a machine understand?
>
> The machine generates a real-time representation (model) of the
> current situation (CSR).
>
> It determines whether all the values of the CSR elements correspond to
> each other.
>
> If they do, then the machine understands the situation and acts in
> accordance with it.
>
> If not, then it does not understand the situation, identifies the
> non-conformities and acts to eliminate them.
>
> On Sat, Jul 10, 2021 at 4:12 PM  wrote:
>
>> Understanding is the editing distance between data points and the
>> distance
>> of objects on a traveling map,
>> done by a conscious mind.

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-Mc52adaf91a317cd96cfa6d39
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-10 Thread Roman Kofman
Understanding is a skill of the machine. Machine X understands Y if it
can determine the completeness and consistency of Y according to its
internal application-domain model. The application area of an AGI is the
real world. In simple words, understanding is the ability to determine
whether something can be.

How can a machine understand?

The machine generates a real-time representation (model) of the
current situation (CSR).

It determines whether all the values of the CSR elements correspond to
each other.

If they do, then the machine understands the situation and acts in
accordance with it.

If not, then it does not understand the situation, identifies the
non-conformities and acts to eliminate them.
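
A toy rendering of that procedure; the CSR elements and the "correspondence"
checks below are invented for illustration, not taken from the post:

    # The current-situation representation (CSR) is just a dict of element
    # values; each constraint says which combinations of values "correspond".
    def find_nonconformities(csr, constraints):
        return [name for name, ok in constraints.items() if not ok(csr)]

    constraints = {
        "wet ground needs a water source":
            lambda s: not s["ground_wet"] or s["raining"] or s["sprinkler_on"],
        "open umbrellas imply rain":
            lambda s: not s["umbrellas_open"] or s["raining"],
    }

    csr = {"ground_wet": True, "raining": False,
           "sprinkler_on": False, "umbrellas_open": False}

    problems = find_nonconformities(csr, constraints)
    if not problems:
        print("situation understood; act in accordance with it")
    else:
        print("not understood; investigate:", problems)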

On Sat, Jul 10, 2021 at 4:12 PM  wrote:

> Understanding is the editing distance between data points and the distance
> of objects on a traveling map,
> done by a conscious mind.

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M8116d61331dc3b716ee9c296
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-10 Thread keghnfeem
Understanding is the editing distance between data points and the distance of 
objects on a traveling map,
done by a conscious mind.
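
For what it's worth, a minimal sketch of the edit-distance half of that
definition (standard Levenshtein distance between two "data points"; the
example strings are invented):

    def edit_distance(a: str, b: str) -> int:
        # Standard Levenshtein distance via dynamic programming.
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            curr = [i]
            for j, cb in enumerate(b, 1):
                curr.append(min(prev[j] + 1,                 # deletion
                                curr[j - 1] + 1,             # insertion
                                prev[j - 1] + (ca != cb)))   # substitution
            prev = curr
        return prev[-1]

    print(edit_distance("kitten", "sitting"))   # 3
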
--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M7ff50e3ab45f1b051d480f6e
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-10 Thread immortal . discoveries
On Saturday, July 10, 2021, at 2:25 AM, Daniel Jue wrote:
> In an analogy, commonly seen in optical illusions, a layperson cannot 
> understand nor explain why we see black dots on a grid illusion.  Certain 
> pattern matchers which were triggered in the optical processing region are 
> not able to be brought into question or simulation.  There are areas of our 
> own being which we are helplessly unable to understand via articulation.  By 
> this I argue that an AGI could (should?) be average-human-level in 
> understanding without the need of "understanding" its full being.
Most illusions are just you predicting. The table looks skinnier when it is
vertical, even though it is the same size in a paint editor after you rotate
it, because in real life you have seen things look smaller the farther away
they are. Gears move on their own because you saw them rotate in real life,
especially if you move your eyes across the image and trigger the motion
neurons. A girl near a car will make the car look attractive in commercials.

There are reflexes that model patterns, to help you survive. They too are 
patterns.
--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M23b001e2ea6c4f83716d7bc7
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-10 Thread immortal . discoveries
On Friday, July 09, 2021, at 8:18 PM, Peter Voss wrote:
> https://chatbotslife.com/understanding-understanding-9dcc15759b5b 
> 
I don't know if you know this and simply gave many views of the same thing, but
just in case not: all of the listed points in that link are the same thing.
Knowing English, or facts, or a person's habits, is just memories of what comes
at the end of some context (slept in the > ?). Saying it in your own words is
hole-and-delay matching adaptiveness, plus translation matching, as I explained
earlier. The other point, summarizing it, is too, but it uses the most common
patterns and the ones that relate, basically, and only one of each thing, so as
not to repeat the same thing. You can think of summarization as 100 words that
match a 10-word memory: the system can see, by delayed matching etc., that the
memory is there in the 100 words, and once it sees one part of the memory it
knows that part is already said and done (so it does not repeat the same things
in the summarization).
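
A rough sketch of that 100-words-match-a-10-word-memory idea; the passage, the
stored memories, and the matching rule are all invented, and "delayed matching"
is approximated as in-order matching with gaps allowed:

    def contains_in_order(words, memory):
        # True if the memory's words appear in the passage in order,
        # possibly with other words in between (gaps = "delays").
        it = iter(words)
        return all(any(m == w for w in it) for m in memory)

    passage = ("the turtle was hungry and saw some tacos steaming hot "
               "so the turtle crossed the road to eat them").split()
    memories = [
        "the turtle crossed the road".split(),
        "the turtle was hungry".split(),
        "the cat sat on the mat".split(),
    ]

    # Emit each matched memory once: a crude summary of the longer passage.
    summary = [" ".join(m) for m in memories if contains_in_order(passage, m)]
    print(summary)
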
--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M010c5e968e8de5f8bf4012ef
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-10 Thread Matt Mahoney
You understand a sequence of symbols if you can predict or compress them.
If I wanted to test if you understand Chinese, I would show you some
Chinese text and test how many characters you could guess next.
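
A toy version of that test, assuming nothing beyond a simple order-1
next-character model; the training and test strings are invented:

    from collections import Counter, defaultdict

    def train(text, order=1):
        counts = defaultdict(Counter)
        for i in range(len(text) - order):
            counts[text[i:i + order]][text[i + order]] += 1
        return counts

    def guess_accuracy(model, text, order=1):
        # How often the model's top guess for the next character is right;
        # better guessing here is the same thing as better compression.
        hits = total = 0
        for i in range(len(text) - order):
            ctx, actual = text[i:i + order], text[i + order]
            if ctx in model:
                total += 1
                hits += model[ctx].most_common(1)[0][0] == actual
        return hits / total if total else 0.0

    model = train("the turtle crossed the road because the turtle was hungry")
    print(guess_accuracy(model, "the turtle saw the road"))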

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-Mdb0388f75bc452bfafca5273
Delivery options: https://agi.topicbox.com/groups/agi/subscription


Re: [agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-10 Thread Daniel Jue
Thanks for the question Mike.

My opinion on understanding, which is still evolving:

Understanding is one aspect of a conscious being, closest to knowing, but
also including thinking, feeling, intuition and emotion.  Some claim these
are inseparable, or that separating them limits the types of knowledge that
can be represented, and I tend to gravitate in that direction.

We humans can know things we can't articulate, and we can be articulate
about things that are not really what we believe internally.
"Understanding" as a test for AGI might be a gentleman's agreement upon a
certain level of articulation being reached for the idea at hand.  We see
the first inklings of this in the XAI endeavors.  However, some if not all
understanding comes before the language needed to express it, and I think
this applies to both humans and AGI; therefore a more rudimentary framing
of understanding could be knowledge which gives the capability to inform an
action.  Still though, my opinion is that understanding, or realization of
knowing, is deeply intermingled with an idea of feeling.  Understanding may
be inseparable from other types of cognition.

In any case, not taking advantage of that capability to inform an action
can cause non-ideal situations for the individual, such as how I knew it
would rain today but did not think to roll up my car windows.
It could also be evident in pathologies where we say things we don't
believe, or believe things we wouldn't say: both of those situations give a
sense that feelings, possibly repressed, may be involved.

In our AGI system, the most fundamental type of understanding is identical
to knowledge which could be traced back to learning of patterns by a type
of Hebbian function.  In common parlance, we think of understanding as a
type of knowledge orders of magnitude higher than the level of sensory
spacetime co-occurrence; however, the Hebbian learning analogues continue at
recursively higher levels on the pattern matchers themselves. This is
something still under development. One of the many challenges we're facing
here is the level at which we allow higher level patterns to be brought
into the simulation space for manipulations (thinking about it's own
understanding of the pattern).  The lower levels of pattern recognition are
not of the same types as the higher level patterns, and the difference
between them delineates the conscious from the subconscious.  We also have
not incorporated the concepts of hedonic feeling; however, there are some
simple proxies for intuition based in familiarity with a pattern.
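
A minimal illustration of the kind of Hebbian co-occurrence learning described
above (not the system being described; the units, learning rate, and
observations are invented): a weight between two units grows when they fire
together, so frequently co-occurring features end up bound into a pattern.

    import itertools

    units = ["fur", "purr", "whiskers", "bark"]
    weights = {pair: 0.0 for pair in itertools.combinations(units, 2)}
    lr = 0.1

    # Each observation lists which units fired together.
    observations = [
        {"fur", "purr", "whiskers"},
        {"fur", "purr"},
        {"fur", "whiskers", "purr"},
        {"fur", "bark"},
    ]

    for obs in observations:
        for a, b in weights:
            pre, post = float(a in obs), float(b in obs)
            weights[(a, b)] += lr * pre * post   # Hebb: strengthen when both active

    for pair, w in sorted(weights.items(), key=lambda kv: -kv[1]):
        print(pair, round(w, 2))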

In an analogy, commonly seen in optical illusions, a layperson cannot
understand nor explain why we see black dots on a grid illusion.  Certain
pattern matchers which were triggered in the optical processing region are
not able to be brought into question or simulation.  There are areas of our
own being which we are helplessly unable to understand via articulation.
By this I argue that an AGI could (should?) be average-human-level in
understanding without the need of "understanding" its full being.

In a 1977 paper by Louis Arnaud Reid in the Proceedings of the Aristotelian
Society #77 "Thinking, Feeling, Knowing" doi 10.1093/aristotelian/77.1.165
, the author gives a (IMO) great argument about how "feelings" (not
necessarily those marked by noticeable hedonic tones) are an inseparable
part of our "knowing" something, both before and after we proclaim to know
that something.




On Fri, Jul 9, 2021 at 2:03 PM Mike Archbold  wrote:

> You've got an opinion. We all do!
> 
> I'm doing a survey of opinions about "understanding" for the meetup
> -->
> https://www.meetup.com/Northwest-Artificial-General-Intelligence-Meetup-Group/
> 
> 2 events are envisioned this summer:
> 
> 1)  Survey -- discuss tribal opinions in the AGI community  as well as
> published works about what "understanding" means for a machine,
> 
> 2) Critiques and Conclusions -- compare, generalize, hopefully reach
> some conclusions.
> 
> So what is your definition of "understanding"? I have collected about
> a dozen so far and will publish along with the events.
> 
> We are also having an in person event this month for those around
> western Washington:
> 
> https://www.meetup.com/Northwest-Artificial-General-Intelligence-Meetup-Group/events/279258207/
> 
> Thanks Mike Archbold


-- 
Daniel Jue
Cognami LLC
240-515-7802
www.cognami.ai

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M7b79adfd1946b918a132547a
Delivery options: https://agi.topicbox.com/groups/agi/subscription


[agi] How does a machine "understand"? What is your definition of "understanding" for an AGI?

2021-07-09 Thread Mike Archbold
You've got an opinion. We all do!

I'm doing a survey of opinions about "understanding" for the meetup
--> 
https://www.meetup.com/Northwest-Artificial-General-Intelligence-Meetup-Group/

2 events are envisioned this summer:

1)  Survey -- discuss tribal opinions in the AGI community  as well as
published works about what "understanding" means for a machine,

2) Critiques and Conclusions -- compare, generalize, hopefully reach
some conclusions.

So what is your definition of "understanding"? I have collected about
a dozen so far and will publish along with the events.

We are also having an in person event this month for those around
western Washington:

https://www.meetup.com/Northwest-Artificial-General-Intelligence-Meetup-Group/events/279258207/


Thanks Mike Archbold

--
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf91e2eafa2515120-M68edf74ea9385a327fc8281b
Delivery options: https://agi.topicbox.com/groups/agi/subscription