No no no, the proposal above was that even when you opt for the lossy 
compressor and decompress to get "2+2=" back (with the answer missing), this is 
like a lot of things in life: employees die or go missing, neurons die, but we 
just fill in the cracks anyway. And on the global scale, Earth would fill in or 
re-make the whole file at some point later in time anyway.
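
Here is a toy sketch of that "fill in the cracks" idea; the function names, 
regexes, and arithmetic pattern are all invented for illustration, not 
anyone's actual proposal:

```python
import re
import operator

OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

def lossy_compress(text: str) -> str:
    # Drop the answers: they are predictable, so don't store them.
    return re.sub(r'(\d+[+*-]\d+)=\d+', r'\1=', text)

def fill_in_the_cracks(text: str) -> str:
    # "Decompress" by re-deriving each missing answer from what is left.
    def solve(m):
        a, op, b = m.group(1), m.group(2), m.group(3)
        return f"{a}{op}{b}={OPS[op](int(a), int(b))}"
    return re.sub(r'(\d+)([+*-])(\d+)=(?!\d)', solve, text)

doc = "we know 2+2=4 and 3*5=15"
small = lossy_compress(doc)            # "we know 2+2= and 3*5="
assert fill_in_the_cracks(small) == doc
```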
Of course, for the Hutter Prize you need an instant, self-contained solution 
that regenerates the original data file exactly. So lossless compression is 
the correct way to think about the Hutter Prize, yes: it is local and fast, 
and therefore testable.
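
A minimal sketch of that lossless criterion, using Python's zlib as a stand-in 
compressor and made-up data (the real prize data is a Wikipedia dump):

```python
import zlib

# Stand-in for the data file; invented here for illustration.
original = b"2+2=4 and 3*5=15 " * 1000

compressed = zlib.compress(original, 9)
restored = zlib.decompress(compressed)

# The lossless criterion: the output must match the input byte for byte.
assert restored == original
print(f"{len(original)} -> {len(compressed)} bytes, round-trip exact")
```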
But wait, isn't global context faster? Only near the end of the Singularity. 
The more words/medics/etc. you have, the better, i.e. faster, you can repair 
something. So in the future a human who lost organs (data) can be repaired 
faster if we use a lot of doctors instead of one doctor. It is the same with 
decompression: you can have a lossless file while the data is also redundantly 
spread around town, or you can leave pieces missing and the data is still 
around town; in that case all the doctors are ready and quick....
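
Here is a toy simulation of that redundancy point, with made-up parameters 
(100 pieces, each copy keeping about 60% of them): the more partial copies 
scattered around town, the more of the file survives somewhere:

```python
import random

random.seed(0)
CHUNKS = 100        # the file, split into 100 pieces
KEEP_PROB = 0.6     # each copy around town retains ~60% of the pieces

def pieces_recoverable(num_copies: int) -> int:
    # A piece is recoverable if at least one copy still holds it.
    survived = set()
    for _ in range(num_copies):
        survived |= {i for i in range(CHUNKS) if random.random() < KEEP_PROB}
    return len(survived)

for copies in (1, 2, 4, 8):
    print(f"{copies} copies -> {pieces_recoverable(copies)}/{CHUNKS} recoverable")
```

More copies also means more places to fetch pieces from in parallel, which is 
the many-doctors speedup.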