Re: [agi] Yours truly, the world's brokest researcher, looks for a bit of credit

2019-03-07 Thread Mike Archbold
How many of you AGI researchers are driving old cars? You know, you have to add fluids at every fuel stop (oil and coolant leaking) and depend on the radio to drown out rattles and clunks? On 3/7/19, Stefan Reich via AGI wrote: > Not from you guys necessarily... :o) But I thought I'd let you know

Re: [agi] Yours truly, the world's brokest researcher, looks for a bit of credit

2019-03-07 Thread Robert Levy
It's very easy to show that "AGI should not be designed for NL". Just ask yourself the following questions: 1. How many species demonstrate impressive leverage of intentional behaviors? (My answer would be: all of them, though some more than others) 2. How many species have language? (My answer:

Re: [agi] Yours truly, the world's brokest researcher, looks for a bit of credit

2019-03-07 Thread Matt Mahoney
Actually the "I ate pizza with {a fork|pepperoni|Bob}" example in your slides is mine. But you can credit Doug Lenat with "The police arrested the demonstrators because they {feared|advocated} violence". NLP is not AGI but it is an important component of it. It's a good place to start. But you rea
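[Editorial note: the {a fork|pepperoni|Bob} and {feared|advocated} sentences above are classic attachment and coreference ambiguity probes. Below is a minimal sketch of how those two examples could be encoded as test cases; the AmbiguityCase structure and the expected readings are illustrative assumptions, not taken from any poster's system.]

    from dataclasses import dataclass
    from typing import Dict

    @dataclass
    class AmbiguityCase:
        template: str            # sentence with an {x} slot for the varying word
        fillers: Dict[str, str]  # filler word -> expected reading

    CASES = [
        AmbiguityCase(
            # Prepositional-attachment ambiguity: "with X" attaches differently per filler.
            template="I ate pizza with {x}.",
            fillers={
                "a fork": "instrument used for eating",
                "pepperoni": "topping on the pizza",
                "Bob": "companion at the meal",
            },
        ),
        AmbiguityCase(
            # Coreference (Winograd-style) ambiguity: the referent of "they" flips with the verb.
            template="The police arrested the demonstrators because they {x} violence.",
            fillers={
                "feared": "'they' = the police",
                "advocated": "'they' = the demonstrators",
            },
        ),
    ]

    if __name__ == "__main__":
        # Print each filled-in sentence alongside the reading a competent NLP system should pick.
        for case in CASES:
            for word, reading in case.fillers.items():
                print(case.template.format(x=word), "->", reading)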

Re: [agi] An Experiment

2019-03-07 Thread Steve Richfield
Boris, I would like to introduce your AGI to a magician friend of mine. Steve On Thu, Mar 7, 2019, 12:05 Boris Kazachenko wrote: > "But why would you think that AGI would not hallucinate?" > > Your "AGI" may hallucinate, because it is designed to feed on that > incoherent second-hand natural-

Re: [agi] Yours truly, the world's brokest researcher, looks for a bit of credit

2019-03-07 Thread Boris Kazachenko
I would be more than happy to pay: https://github.com/boris-kz/CogAlg/blob/master/CONTRIBUTING.md, but I don't think you are working on AGI. No one here does; this is an NLP chatbot crowd. Anyone who thinks that AGI should be designed for NL data as a primary input is profoundly confused. On Thu,

Re: [agi] An Experiment

2019-03-07 Thread Boris Kazachenko
"But why would you think that AGI would not hallucinate?" Your "AGI" may hallucinate, because it is designed to feed on that incoherent second-hand natural-language data. Mine won't, it is designed to be integral and self-sufficient. It will believe what it sees, not what a bunch of nuts on the ne

[agi] Seeing through another's eyes

2019-03-07 Thread keghnfeem
Seeing through another's eyes: https://www.sciencedaily.com/releases/2019/02/19022704.htm

[agi] Yours truly, the world's brokest researcher, looks for a bit of credit

2019-03-07 Thread Stefan Reich via AGI
Not from you guys necessarily... :o) But I thought I'd let you know. Pitch: https://www.meetup.com/Artificial-Intelligence-Meetup/messages/boards/thread/52050719 Let's see if it can be done... funny how some hurdles always seem to appear when you're about to finish something good. Something about