Most importantly, approximate immortality. We have only a few years before
death, so hurry and help me with Transformers. Most human machines hate
pain/death instinctively and would clearly love eternity; Darwinian survival
prunes off the machines that don't.
--
We'll have larger/better homes, dinners, games, safety, bodies, etc., and new
RL rewards like carbon dioxide, touching walls, etc.
--
The singularity doesn't increase the qualia of life.
--
The internal RL nodes update/spread reward dye to update the pop-up agenda
questions. Its working-memory activations / dyed nodes are activated and
attended to as context on the mind.
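A minimal sketch of what I mean (toy graph; the dye levels, decay, and node
names are all made up):

    # Hypothetical sketch: reward "dye" spreading over a node graph,
    # with the most-dyed nodes popping up as working-memory context.
    from collections import defaultdict

    edges = {                      # toy associative graph
        "food": ["eat", "cook"],
        "eat":  ["kitchen"],
        "cook": ["kitchen"],
    }

    dye = defaultdict(float)

    def spread_reward(node, reward, decay=0.5, depth=2):
        """Deposit dye on a node and leak a decayed share to neighbors."""
        dye[node] += reward
        if depth > 0:
            for nb in edges.get(node, []):
                spread_reward(nb, reward * decay, decay, depth - 1)

    spread_reward("food", 1.0)     # an RL reward event dyes the graph

    # The most-dyed nodes are "attended to" as context on the mind.
    context = sorted(dye, key=dye.get, reverse=True)[:3]
    print(context)                 # e.g. ['food', 'eat', 'kitchen']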
--
I see a few of you back here can't understand this:
https://agi.topicbox.com/groups/agi/T2e5182d7ce6527f7
Yeah, we love stuff and 'see it', qualia, but head to toe and all we can touch
(or be touched by) is machine, like an iPhone or a rock. And a rock is no
different, but I fight for life. Remember I
I have only had someone code me my GPT-2 replica; my design is yet to be
implemented.
--
Why don't you give your AGI its own website so that people can interact
with it?
On Fri, Dec 27, 2019 at 11:06 AM wrote:
> Storing, Forgetting, and Recall. Attention does all. Which sense or where
> to pay attention to. How much attention. To recognize it. To choose the
> Next Feature by looking at story words and which to ignore (forget).
--
Storing, Forgetting, and Recall: attention does it all. Which sense, or where
on it, to pay attention to. How much attention. To recognize it. To choose the
Next Feature by looking at story words and which to ignore (forget). To adjust
the Next Feature. The nodes that store/recall are attentive / pop up as
Let me know if you want more; if no one says so, I'll assume this hasn't done
much good.
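Here's a minimal sketch of how I picture that choosing/forgetting step (toy
vectors; all names and numbers are made up):

    # Hypothetical sketch: one attention pass choosing the Next Feature
    # by weighting story words and "forgetting" (ignoring) the rest.
    import numpy as np

    rng = np.random.default_rng(0)
    vocab = ["the", "cat", "sat", "on", "mat"]
    emb = {w: rng.normal(size=8) for w in vocab}   # toy word vectors

    story = ["the", "cat", "sat"]
    query = emb["sat"]                              # current focus

    scores = np.array([emb[w] @ query for w in story])
    weights = np.exp(scores) / np.exp(scores).sum() # softmax attention

    # Words with tiny weight are ignored -- that's the "forgetting".
    attended = [w for w, a in zip(story, weights) if a > 0.1]

    # The Next Feature is adjusted toward the attention-weighted mixture.
    next_feature = sum(a * emb[w] for w, a in zip(story, weights))
    print(attended, next_feature.shape)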
--
Think of a GPT-2 prompt only one word long, e.g. 'scratching'. It could be
seen as 'scratch' + 'ing', so let's say we had only 'scratch'. The word
predicted next is one that followed it in the data, and the more frequent one
is more likely to be chosen. This word 'scratch' can also match related words
like 'itch' and 'scrap'
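A toy sketch of that frequency idea (tiny made-up corpus; the related-words
list is hardcoded here where a real system would use learned similarity):

    # Toy sketch of frequency-based next-word prediction with a
    # hand-made "related words" fallback.
    from collections import Counter, defaultdict

    corpus = "i scratch my arm i scratch my head you itch your arm".split()

    nxt = defaultdict(Counter)               # word -> counts of followers
    for a, b in zip(corpus, corpus[1:]):
        nxt[a][b] += 1

    related = {"scratch": ["itch", "scrap"]} # hypothetical similarity list

    def predict(word):
        counts = Counter(nxt[word])
        for r in related.get(word, []):      # related words also vote,
            counts.update(nxt[r])            # matching the itch/scrap idea
        return counts.most_common(1)[0][0] if counts else None

    print(predict("scratch"))                # 'my', the most frequent follower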
Attention types: deciding which sense to pay attention to, e.g. a buzzing
noise or pop-up goals; where and how wide to window on that sense; and which
Next Word to predict, using more attention.
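A rough sketch of the first two attention types (the salience numbers and
window logic are invented for illustration):

    # Hypothetical sketch: pick the most salient sense, then place a
    # window on it around the salience peak.
    import numpy as np

    senses = {
        "hearing": np.array([0.1, 0.2, 0.9, 0.3, 0.1]),  # buzzing spikes
        "vision":  np.array([0.2, 0.2, 0.2, 0.2, 0.2]),
    }

    # Which sense to pay attention to: the one with the strongest signal.
    sense = max(senses, key=lambda s: senses[s].max())

    # Where / how wide to window on that sense: around the peak.
    signal = senses[sense]
    peak = int(signal.argmax())
    width = 1
    lo, hi = max(0, peak - width), min(len(signal), peak + width + 1)
    print(sense, signal[lo:hi])   # 'hearing', the slice around the buzz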
--
The hierarchy can self-organize to lower its node-count cost (error) by
rearranging the connections that exist, learning Byte Pair Encoding
(segmentation) on the fly too, not just translation or sequence-building on
the fly. You don't have to look at all areas/layers of the hierarchy.
--
I think you can skip the heterarchy; maybe the hierarchy nodes simply get
activated, e.g. the node 'cat', which in parallel leaks energy to its nearby
contexts, e.g. 'the cat ate', 'the cat ran', 'our cat went', and these handles
leak energy to nearby contexts: 'the dog ate', 'the dog ran', 'some dog went'
https://www.youtube.com/watch?v=Mah0Bxyu-UI&t=2s
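A crude sketch of that energy leaking (the graph and leak weights are invented
for illustration):

    # Hypothetical sketch: activating 'cat' leaks energy to its contexts,
    # which in turn leak energy to the parallel 'dog' contexts.
    leaks = {
        "cat": [("the cat ate", 0.5), ("the cat ran", 0.5),
                ("our cat went", 0.5)],
        "the cat ate":  [("the dog ate", 0.3)],
        "the cat ran":  [("the dog ran", 0.3)],
        "our cat went": [("some dog went", 0.3)],
    }

    frontier = {"cat": 1.0}                  # activate the 'cat' node
    energy = dict(frontier)

    for _ in range(2):                       # two hops of parallel leaking
        nxt = {}
        for node, e in frontier.items():
            for nb, w in leaks.get(node, []):
                nxt[nb] = nxt.get(nb, 0.0) + e * w
                energy[nb] = energy.get(nb, 0.0) + e * w
        frontier = nxt

    print(sorted(energy.items(), key=lambda kv: -kv[1]))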
--