You forgot the newest and the simplest things too ;)

Lots of people focus on all sorts of things, many at scales between the
atomic and the galactic, such as HDDs and rooftops. Lol.

...It is important that we focus on the farthest questions, the simplest, the
newest, the most important, and even seemingly unrelated problems.

But what does farthest mean? What does newest mean? Why bother if they are
unrelated to AGI? That's correct: we want the farthest, newest, and oldest
questions only if they are about AGI. Many are important, some higher
priority than others. Simpleness can mean quick to answer, and therefore
higher priority. Newest/ oldest/ farthest simply means giving these questions
a quick check to see if they are important to answer at all; they may not be.
As for the farthest ones, they can let you weigh whether your theory answers
all sorts of observations, and let you learn from related but diverse
questions.

So, in short summary recap: Physics leans toward the most important questions
on its own. It gives fast checks to old/ new/ far/ simple/ unrelated
questions. The brain will usually (not for some ;) ) try to learn as much as
it can about related and unrelated knowledge as quickly as it can, related
more so. So old/ new doesn't mean much on its own; those questions only get
checked if they are fast to solve and/or related. Partially unrelated
learning is very helpful too. So the brain looks for 'new yummy streams' of
related and somewhat related info that is fast to answer or verify as true.
Now that I'm on a roll here in my attentional memory, I can say that new/
old/ far as said above was really just new info being fed in, and the brain
always wants new related (and somewhat unrelated) info that is fast to
answer or verify, hence growing the knowledge base faster.
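
To make the 'new yummy streams' idea concrete, here's a minimal Python sketch
of that priority rule, assuming we can put rough 0-to-1 numbers on
relatedness-to-goal and speed-to-answer; every question, weight, and number
below is invented purely for illustration:

    # Toy sketch of the 'new yummy streams' heuristic described above.
    # All questions and numbers are made up for illustration.

    def priority(relatedness, speed):
        """Related-to-goal info that is fast to answer/verify floats
        to the top; relatedness counts a bit more than speed."""
        return 0.6 * relatedness + 0.4 * speed

    # (question, relatedness to AGI in 0..1, speed to answer/verify in 0..1)
    questions = [
        ("old unrelated puzzle",       0.1, 0.9),
        ("new, somewhat related fact", 0.6, 0.8),
        ("hard core AGI question",     1.0, 0.1),
    ]

    # Quick-check everything, then work the highest-priority items first.
    for q, r, s in sorted(questions, key=lambda x: -priority(x[1], x[2])):
        print(f"{priority(r, s):.2f}  {q}")

The 0.6/0.4 weights are arbitrary; the only point is that 'related and fast
to verify' sorts to the front of the queue.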
Of course we want to answer how to build AGI. It's hard, and the brain stays
locked on the hard problem, but it can get there in polynomial time by
stepping in the right directions down the butterfly-effect tree, like
AlphaZero using self-play learning on advanced GPUs made by Nvidia. So the
brain wants to tackle the hard, evolution-installed goal of
self-preservation, till death do us part, and it takes baby steps as fast as
it can. Usually many want just a house and a lover; it's not hard to make a
life, it's hard to advance evolution though. But the smarter ones are seeking
full preservation, and that's hard. And the brain lets you get there fast
using baby steps, even though it is one of the hardest problems. I say one of
because AGI is simpler than a global ASI hive or an omega-structure nebula.
We tackle AGI because it is the most direct, easiest step to becoming
immortal nearly instantly. But it's hard. But it's the best-weighed route. So
the brain takes the best, fastest route to get what it wants, and in that
process it takes sub-steps that are fast to learn or build in R&D, to get to
the AGI goal.
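
And here's a companion toy sketch of the baby-steps idea, assuming the hard
goal can be split into sub-steps with rough costs and progress payoffs;
again, every step and number here is made up:

    # Toy sketch of 'baby steps toward a hard goal': greedily take the
    # sub-step that buys the most progress per unit of cost.
    # Steps, costs, and the target are invented for illustration.

    # (sub-step, cost to do it, progress toward the goal it buys)
    steps = [
        ("read a related paper",    1, 2),
        ("build a small prototype", 3, 5),
        ("run a toy experiment",    2, 3),
    ]

    progress, needed = 0, 8
    while progress < needed and steps:
        # cheapest cost-per-progress first, i.e. the fastest baby step
        step = min(steps, key=lambda s: s[1] / s[2])
        steps.remove(step)
        progress += step[2]
        print(f"took '{step[0]}', progress {progress}/{needed}")

Greedy cost-per-progress isn't guaranteed optimal, but it mirrors the
'best fastest route of sub-steps' framing above.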