I have some thoughts, but ... isn't this discussion going to become yet another 
distraction? The question of whether AGI will result in a technological 
singularity doesn't seem to have a lot of relevance to the question of *how* to 
build AGI. So the disciples of the Singularity can believe whatever they want 
without it having any practical impact on the work going forward. 
It only becomes an issue if differing opinions on the possibility of a "fast 
takeoff" lead to a lot of wrangling over how much emphasis to put on the 
Control Problem ... and that's an argument I haven't seen come up on this 
list yet.