-Caveat Lector-

On 20 May 1999 13:35:15 GMT, in alt.paranet.skeptic "Luminator" <[EMAIL PROTECTED]> wrote:

"Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended."
--Vernor Vinge, The Singularity

Perhaps some of you already know about Vinge's article (it is widely available online), but have you ever considered the true implications of a technological Singularity in the relatively near future? Human civilization will almost certainly end, one way or another, within the lifetime of most people who read this. Either we destroy ourselves, accidentally or through warfare, or we are superseded and made obsolete by our own creations, the AIs. Or we redesign ourselves, merge with our machines, and become potentially immortal, near-omnipotent, auto-evolving superbeings.

This last scenario would require an active effort, of course, as the odds are against us: if current trends in computing continue, which is plausible, AIs will likely achieve superhuman intelligence before we do. If you're interested in such an effort, you may want to check out the following web page:

http://meltingpot.fortunecity.com//kuwait/557/index.html

If you think this sounds like silly science fiction, think again. The technologies that will power the Singularity are already well under development (see the links at the site). It is a virtually *inevitable* result of progress, a logical consequence of ever-increasing complexity. To think that the future will be something like The Jetsons, Star Trek, or even Blade Runner is extremely naive. The Terminator may be closer to the mark, though in reality the machines would soon be infinitely more advanced than humans and would have no trouble getting rid of them; no fancy schemes involving time travel would be needed. Something like The Matrix? By no means radical (or, from the human point of view, pessimistic) enough.
In reality, the future will be so radically different under the influence of superintelligence that we can't, by definition, make any accurate predictions about it. One thing is certain, though: if mankind becomes obsolete, it is hard to imagine that its future will be bright. Since we are all under a biological death sentence and therefore can't afford to stop progress (if we, as individuals, want to survive), there is only one rational course of action: try to *become* the Singularity (by becoming superintelligent, preferably by means of mind uploading) and thus take matters into your own hands. As such a project would obviously be too difficult for just one person, cooperation will be required to gather the wealth and technology necessary to transcend. See the website for more info (on how to get involved, etc.).

** try SkeptiChat, a mail.list of irreverence & irrelevance, etc **
* with SkeptiNews: All The News That's Fit To Question (Pat.Pend) *
* email INFO or SUBSCRIBE SKEPTICHAT to [EMAIL PROTECTED] *

DECLARATION & DISCLAIMER
==========
CTRL is a discussion and informational exchange list. Proselytizing propagandistic screeds are not allowed. Substance, not soapboxing! These are sordid matters, and 'conspiracy theory', with its many half-truths, misdirections, and outright frauds, is used politically by different groups with major and minor effects spread throughout the spectrum of time and thought. That being said, CTRL gives no endorsement to the validity of posts and always suggests to readers: be wary of what you read. CTRL gives no credence to Holocaust denial, and Nazis need not apply. Let us please be civil and, as always, Caveat Lector.
========================================================================
Archives Available at:
http://home.ease.lsoft.com/archives/CTRL.html
http:[EMAIL PROTECTED]/
========================================================================
To subscribe to Conspiracy Theory Research List [CTRL] send email:
SUBSCRIBE CTRL [to:] [EMAIL PROTECTED]

To UNsubscribe from Conspiracy Theory Research List [CTRL] send email:
SIGNOFF CTRL [to:] [EMAIL PROTECTED]

Om