Oops, can't do math either.  That should be:

1/H = hG/(c^5 T^2) = 1.55e-122 (I forgot the minus sign)
H = 6.45e121
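For anyone who wants to check the arithmetic, here is a quick sketch in Python
(same constants as in the message below, SI units throughout):

```python
# Dimensional-analysis estimate: 1/H = h*G / (c^5 * T^2), which is
# dimensionless (kg m^2/s * m^3/(kg s^2) over m^5/s^5 * s^2 cancels out).
hbar = 1.054e-34   # reduced Planck constant, kg m^2 / s
c    = 3.00e8      # speed of light, m / s
G    = 6.673e-11   # gravitational constant, m^3 / (kg s^2)
T    = 4.32e17     # age of the universe, s (13.7 billion years)

inv_H = hbar * G / (c**5 * T**2)
print(inv_H)      # ~1.55e-122
print(1 / inv_H)  # ~6.45e121
```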
 
-- Matt Mahoney, [EMAIL PROTECTED]

----- Original Message ----
From: Matt Mahoney <[EMAIL PROTECTED]>
To: [email protected]
Sent: Thursday, October 26, 2006 1:53:47 PM
Subject: Re: [singularity] Convincing non-techie skeptics that the Singularity 
isn't total bunk

I found more on Freitas' SQ
http://en.wikipedia.org/wiki/Sentience_Quotient

The ratio of the highest and lowest values, 10^120, depends only on Planck's 
constant h, the speed of light c, the gravitational constant G, and the age of 
the universe, T (which is related to the size and mass of the universe by c and 
G).  This number is also the quantum mechanical limit on the entropy of the 
universe, or the largest memory you could build, about 10^120 bits.  Let me 
call this number H.  A more precise calculation shows

h = 1.054e-34 kg m^2/s  (actually h-bar)
c = 3.00e8 m/s
G = 6.673e-11 m^3/(kg s^2)
T = 4.32e17 s (13.7 billion years)
H = hG/(c^5 T^2) = 1.55e122 (unitless)

although I am probably neglecting some small but important constants due to my 
crude attempt at physics.  I derived H by nothing more than cancelling out 
units.

If this memory filled the universe (and it would have to), then each bit would 
occupy about the space of a proton or neutron.  This is quite a coincidence, 
since h, G,  c, and T do not depend on the physical properties of any 
particles.  The actual number of baryons (protons and neutrons and possibly 
their antiparticles) in the universe is about H^(2/3) ~ 10^80.  If the universe 
were mashed flat, it would form a sheet of neutrons one particle thick.
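A quick order-of-magnitude check of that H^(2/3) figure, using the value of H
computed above (this is back-of-the-envelope physics, so agreement within an
order of magnitude is all one can ask for):

```python
import math

H = 1.55e122                 # the dimensionless number derived above
baryons = H ** (2.0 / 3.0)   # estimated baryon count
print(math.log10(baryons))   # ~81.5, in the ballpark of the quoted 10^80
```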

Another possible coincidence is that H could be related to the fine structure 
constant alpha = 1/137.0359997... by H ~ e^(2/alpha) ~ 10^119.  If this could 
be confirmed, it would be significant because alpha is known to about 9 
significant digits.  Alpha is unitless and depends on h, c, and the unit 
quantum electric charge.
http://en.wikipedia.org/wiki/Fine_structure_constant
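Reading the exponent as e^(2/alpha) (the only reading that lands near 10^119),
the coincidence is easy to check numerically:

```python
import math

alpha = 1 / 137.0359997      # fine structure constant, dimensionless
val = math.exp(2 / alpha)    # e^(2/alpha)
print(math.log10(val))       # ~119.03, i.e. roughly 10^119
```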
 
-- Matt Mahoney, [EMAIL PROTECTED]

----- Original Message ----
From: Kaj Sotala <[EMAIL PROTECTED]>
To: [email protected]
Sent: Thursday, October 26, 2006 9:46:55 AM
Subject: Re: [singularity] Convincing non-techie skeptics that the Singularity 
isn't total bunk

On 9/24/06, Ben Goertzel <[EMAIL PROTECTED]> wrote:
> Anyway, I am curious if anyone would like to share experiences they've
> had trying to get Singularitarian concepts across to ordinary (but
> let's assume college-educated) Joes out there.  Successful experiences
> are valued but also unsuccessful ones.  I'm specifically interested in

Personally, I've noticed that opposition to the idea of a
Singularity falls into two main camps:

1) Sure, we might get human-equivalent hardware in the near future,
but we're still nowhere near having the software for true AI.

2) We might get a Singularity within our lifetimes, but it's just as
likely to be a rather soft takeoff and thus not really *that* big of
an issue - life-changing, sure, but not substantially different from
the development of technology so far.

The difficulty with arguing against point 1 is that, well, I don't
know all that much that'd support me in arguing against it. I've had
some limited success with quoting Kurzweil's "brain scanning
resolution is constantly getting better" graph and pointing out that
we'll become capable of doing a brute-force simulation at some point, but
as for anything more elegant, not much luck.

Moore's Law seems to work somewhat against point 2, but people often
question how long we can assume it to hold.

> approaches, metaphors, focii and so forth that have actually proved
> successful at waking non-nerd, non-SF-maniac human beings up to the
> idea that this idea of a coming Singularity is not **completely**
> absurd...

Myself, I've recently taken a liking to the Venus flytrap metaphor I
stole from Robert Freitas' Xenopsychology. To quote my in-the-works
introductory essay to the Singularity (yes, it seems to be
in-the-works indefinitely - short spurts of progress, after which I
can't be bothered to touch it for months at a time):

"In his 1984 paper Xenopsychology [3], Robert Freitas introduces the
concept of Sentience Quotient for determining a mind's intellect. It
is based on the size of the brain's neurons and their
information-processing capability. The dumbest possible brain would
have a single neuron massing as much as the entire universe and
require a time equal to the age of the universe to process one bit,
giving it an SQ of -70. The smartest possible brain allowed by the
laws of physics, on the other hand, would have an SQ of +50. While
this only reflects pure processing capability and doesn't take into
account the software running on the brains, it's still a useful rough
guideline.

So what does this have to do with artificial intelligences? Well,
Freitas estimates Venus flytraps to have an SQ of +1, while most
plants have an SQ of around -2. The SQ for humans is estimated at +13.
Freitas estimates that buildable electronic sentiences could have an
SQ of +23 - making the difference between us and advanced AIs
<i>nearly as large as that between humans and Venus flytraps</i>. It
should be obvious that, compared to such a mind, even the smartest
humans would stand no chance against the AI's intellect - any more
than a Venus flytrap, however much a genius among plants, could
develop a working plan for taking over all of humanity."
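Freitas defines SQ as the base-10 logarithm of a brain's information-processing
rate (in bits/s) divided by its mass (in kg). A minimal sketch of that
definition - the ~10^14 bits/s rate and 1.5 kg mass for a human brain are
rough illustrative assumptions of mine, not Freitas' exact inputs:

```python
import math

def sentience_quotient(bits_per_second, mass_kg):
    """SQ = log10(I / M): information rate per unit mass, per Freitas (1984)."""
    return math.log10(bits_per_second / mass_kg)

# Rough illustrative figures for a human brain (assumed, not measured):
print(sentience_quotient(1e14, 1.5))  # ~13.8, near Freitas' human estimate of +13
```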

http://www.saunalahti.fi/~tspro1/Esitys/009.png has the same idea
compressed into a catchy presentation slide (some of the text is in
Finnish, but you ought to get the gist of it anyway).

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/[EMAIL PROTECTED]


