***
All of the written knowledge on the internet was either typed in or
spoken at some point. It still makes up less than 1% of the human
knowledge that an AGI would need to model the economy, to know what
you want without having to explicitly ask for it. You don't have a
robot that will clean your house because it wouldn't know whether a
magazine on the floor belongs on the table or in the trash. In the
time it takes you to tell it, you could have picked it up yourself. It
doesn't matter how smart it is; what matters is how fast you can
communicate the 10^7 bits of human knowledge in your brain that nobody
knows except you.
***

You are assuming that willful, focused "communication" is needed to get
the info in your head across to an AGI

But an AGI, having read what you've posted online and watched videos
of you doing your thing in real life, may be able to infer the relevant
knowledge in your head using its powers of abductive reasoning...

A lot of human learning already happens this way rather than via
explicit instruction -- but individual humans don't have the ability to observe
billions of people in parallel to feed their inference engines

So the explicit instruction of AGIs you're alluding to may never need
to happen

ben

On Sun, Feb 10, 2019 at 3:58 AM Linas Vepstas <linasveps...@gmail.com> wrote:
>
>
>
> On Sat, Feb 9, 2019 at 4:22 AM Ben Goertzel <b...@goertzel.org> wrote:
>>
>>
>> We are now playing with hybridizing these symbolic-ish grammar
>> induction methods with neural net language models, basically using the
>> predictive models produced by models in the BERT lineage (but more
>> sophisticated than vanilla BERT) in place of simple mutual information
>> values to produce more broadly-context-sensitive parse choices in
>> Linas's MST parser...
>
>
> This last sentence suggests that the near-total confusion about MST persists 
> in the team. I keep telling them to collect the statistics, and then discard 
> the MST parse **immediately**. Trying to "improve" MST is a total waste of 
> time.
>
> Seriously: Instead, try skipping the MST step entirely.  Just do not even do 
> it, AT ALL. Rip it out. It is NOT a step that the algorithm even needs.  I'll 
> bet you that if you skip the MST step completely, the quality of your results 
> will be more-or-less unchanged.  The results might even get better!
>
> If your results don't change when you skip MST, or if they get better, then 
> that should be a clear indicator that trying to "improve" MST is a waste of 
> time!
>
> -- Linas
>
> --
> cassette tapes - analog TV - film cameras - you
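
For readers following along who haven't seen the step under debate: the "MST
parse" Linas refers to picks the set of word-word links that maximizes total
pairwise mutual information while forming a tree over the sentence. Here is a
rough sketch of that idea; the sentence, the MI scores, and the `mst_parse`
function name are all invented for illustration, and this greedy Prim-style
version glosses over the details of the actual OpenCog implementation:

```python
# Sketch: maximum-spanning-tree parse from pairwise mutual-information
# scores. MI values below are made up for illustration only.

def mst_parse(words, mi):
    """Grow a maximum spanning tree over the words, Prim-style:
    repeatedly attach the unconnected word with the highest-MI link
    to any already-connected word. `mi` maps frozenset({w1, w2}) to a
    mutual-information score (missing pairs default to 0.0)."""
    connected = {words[0]}
    links = []
    while len(connected) < len(words):
        # Pick the best edge crossing from the connected set outward.
        best = max(
            ((a, b, mi.get(frozenset((a, b)), 0.0))
             for a in connected for b in words if b not in connected),
            key=lambda t: t[2])
        links.append((best[0], best[1]))
        connected.add(best[1])
    return links

sentence = ["the", "cat", "sat"]
mi_scores = {frozenset(("the", "cat")): 3.2,
             frozenset(("cat", "sat")): 2.7,
             frozenset(("the", "sat")): 0.4}
print(mst_parse(sentence, mi_scores))
# The hybrid approach mentioned above would replace mi_scores with
# scores derived from a BERT-lineage predictive model.
```

Linas's suggestion, as I read it, amounts to using the collected pair
statistics directly and never materializing this tree at all.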



-- 
Ben Goertzel, PhD
http://goertzel.org

"The dewdrop world / Is the dewdrop world / And yet, and yet …" --
Kobayashi Issa

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ta6fce6a7b640886a-Mb23a5923c0b8f89dc5bb5791
Delivery options: https://agi.topicbox.com/groups/agi/subscription
