There is a possibility that those in power worldwide might decide to create two classes of citizens: those chosen to participate in the Singularity and those left behind.

Humanity has a certain dog-in-the-manger, elitist tendency that comes out at the worst of times.

The dividing line might be economic, social, religious, or welfare-state mediated.

So, I'm suggesting as a counterargument that a developing AI might work in concert with
those able and willing to pay, with acceptable racial, social, religious and political mores,
to rationalize the cost and ration the benefits of implementation.

In turn, some leading-edge technology might be forbidden by law and available only
within a black-market economy accessible to the chosen elite.

The singularity might not be evenly distributed...?

Take the stem cell scenario and the global drug laws as examples.
One can make spin-doctored laws that have secret second agendas.

A surreal argument, but these sorts of events have happened in history before.

Morris

On 10/4/06, Joshua Fox <[EMAIL PROTECTED]> wrote:
Could I offer Singularity-list readers this intellectual challenge: Give an argument supporting the thesis "Any sort of Singularity is very unlikely to occur in this century."
 
Even if you don't actually believe the point, consider it a debate-club-style challenge. If there is already something on the web somewhere, could you please point me to it.

I've been eager for this piece ever since I learned of the Singularity concept. I know of the "objections" chapter in Kurzweil's The Singularity Is Near, the relevant parts of Vinge's seminal essay, as well as the ideas of Lanier, Huebner, and a few others, but in all the millions of words out there I can't remember seeing a well-reasoned article with the above claim as its major thesis. (Note, I'm looking for "why the Singularity won't happen" rather than "why the Singularity is a bad idea" or "why technology is not accelerating".)


Joshua

This list is sponsored by AGIRI: http://www.agiri.org/email To unsubscribe or change your options, please go to: http://v2.listbox.com/member/[EMAIL PROTECTED]

