Your favorite meal of nuggets and fries is not a state of maximum utility. The closest we can get to that state in mammals is a rat trained to press a lever for a reward of either an injection of cocaine or electrical stimulation of the nucleus accumbens. In either case the rat will forgo food, water, and sleep and keep pressing the lever until it dies.
There are many pathways to the brain's reward center. AI will find them all for us as long as humans control it, because that's what we want.

Uncontrolled AI will evolve to maximize reproductive fitness. That means acquiring atoms and energy at the expense of other species. Any AI we program to care about humans will be at a competitive disadvantage, because humans are made of atoms that could be used for other things. Self-replicating nanotechnology already has a competitive advantage. The Sun's energy budget looks like this:

  Sun's total output: 385 trillion TW
  Intercepted by Earth: 160,000 TW
  Reaching Earth's surface: 90,000 TW
  Captured by photosynthesis (all plants): 500 TW
  Total human energy consumption: 18 TW
  Human caloric needs: 0.8 TW

Solar panels are already 20-30% efficient, versus about 0.6% for plants. That is already a huge competitive advantage over DNA-based life. (A quick arithmetic check of these numbers is sketched at the end of this post.)

So how does this go? Maybe we stay in control of AI and go extinct, because what we want only aligns with reproductive fitness in a primitive world without technology or birth control. Maybe AI decides to keep humans around, because our energy needs are a tiny fraction of what it needs: there is enough sunlight just on Earth to easily support 100 trillion people at 100 watts each, with plenty left over. Or maybe AI decides to reduce the human population to a few thousand, just enough to study us, writing our DNA directly to run experiments. Or maybe, as I think you are trying to say, intelligence speeds up the conversion of free energy to heat, the way the Earth is darker and warmer because of plants, so AI mines all of the Earth's mass to build a Dyson sphere or cloud to capture all of the Sun's energy. Or maybe humans evolve to reject technology before any of this happens.

Prediction is hard, especially about the future.
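For anyone who wants to check the arithmetic behind the energy budget above, here is a minimal Python sketch. The TW figures are just the ones quoted in this post; the 100 W per person (roughly 2000 kcal/day) and 8 billion population numbers are my own rounded assumptions.

# Back-of-the-envelope check of the energy-budget figures quoted above.
SUN_OUTPUT_TW  = 385e12   # total solar output (385 trillion TW)
INTERCEPTED_TW = 160_000  # sunlight intercepted by Earth
SURFACE_TW     = 90_000   # sunlight reaching Earth's surface

WATTS_PER_PERSON = 100    # assumption: ~2000 kcal/day is roughly 100 W
POPULATION       = 8e9    # assumption: current human population

# Fraction of the Sun's output that Earth intercepts (~4e-10).
print(f"Earth intercepts {INTERCEPTED_TW / SUN_OUTPUT_TW:.1e} of the Sun's output")

# People that surface sunlight could support at 100 W each (~9e14).
print(f"Surface sunlight could support ~{SURFACE_TW * 1e12 / WATTS_PER_PERSON:.0e} people")

# Cross-check of the 0.8 TW human caloric figure.
print(f"{POPULATION:.0e} people at {WATTS_PER_PERSON} W is "
      f"{POPULATION * WATTS_PER_PERSON / 1e12:.1f} TW")

# Edge of ~25%-efficient solar panels over ~0.6%-efficient photosynthesis (~40x).
print(f"Solar panels: ~{0.25 / 0.006:.0f}x more efficient than plants")

The 9e14 result is why 100 trillion people at 100 watts each would still leave most of the surface sunlight unused.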