he point is on later. Also, in the absence of knowledge,
hardware buys you very little.
So here's your mole of switches. What are you going to do with it?
--
On Sun, Jul 01, 2007 at 10:08:53AM -0700, Peter Voss wrote:
> Just for the record: To put it mildly, not everyone is 'Absolutely' sure
> that AGI can't be implemented on Bill's computer.
What is Bill's computer? A 3 PFlop Blue Gene/P? A Core 2 Duo box from Dell?
> In fact, some of us are pretty
fine-grained sea of gates. Even current approaches are a 3d torus of nodes
running a microkernel OS, soon with FPGAs & Co.
--
> ...", you may as well
> redefine it to "go sit on the beach and read a book".
Some of them will do that. The others will go out, and spread over
all accessible spacetime.
--
stuff
there is. I would put the Urban Challenge right at the
cutting edge, but I'm no AI expert.
> already tested in the lab*. A nuclear physics reaction
> is scalable; you can do the reaction with one particle
> before you try it out on a whole mass of them. An AGI
Of course it's scalable
On Sun, Jul 01, 2007 at 07:01:07PM +1000, Stathis Papaioannou wrote:
> What sort of technical information, exactly, is still secret after 50 years?
The precise blueprint for a working device. Not a crude gun assembly, but
the full implosion-assembly monty. The HE lens geometries, the timing,
the me
On Sun, Jul 01, 2007 at 12:45:20AM -0700, Tom McCabe wrote:
> Do you have any actual evidence for this? History has
> shown that numbers made up on the spot with no
> experimental verification whatsoever don't work well.
You need 10^17 bits and 10^23 ops to more or less accurately
represent and t
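A quick back-of-the-envelope sketch (Python) of what figures on that scale would mean in practice. The machine parameters below are illustrative assumptions of mine, not numbers from the post, and I'm reading the 10^23 ops as a total operation count, which the truncated message doesn't actually specify:

# Rough scale check for the quoted figures: 10^17 bits of state,
# 10^23 operations. Machine specs below are assumed, petaflop-class.
bits_needed = 1e17
ops_needed = 1e23

machine_flops = 3e15              # ~3 PFLOPS, roughly the Blue Gene/P class mentioned upthread
seconds_per_year = 3600 * 24 * 365

memory_pb = bits_needed / 8 / 1e15                      # bits -> petabytes
runtime_years = ops_needed / machine_flops / seconds_per_year

print(f"storage: ~{memory_pb:.1f} PB")
print(f"runtime: ~{runtime_years:.1f} years at {machine_flops:.0e} FLOPS")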
On Sun, Jul 01, 2007 at 12:47:27AM -0700, Tom McCabe wrote:
> Because an AGI is an entirely different kind of thing
> from evolution. AGI doesn't have to care what
If it's being created in an evolutionary context and
is competing with its like, it is precisely evolution,
only with a giant fitness
On Sun, Jul 01, 2007 at 11:35:21AM +1000, Stathis Papaioannou wrote:
> Why do you assume that "win at any cost" is the default around which
> you need to work?
Because these are the rules of the game for a few GYrs. Why do you
assume that these have changed?
On Sat, Jun 30, 2007 at 10:11:20PM -0400, Alan Grimes wrote:
> =\
> For the last several years, the limiting factor has absolutely not been
> hardware.
How many years? How many OPS, and how much aggregate network and memory bandwidth?
What is your evidence for your claim?
> How much hardware do you claim y
how you win the game, but that you win the game.
Anyone who doesn't understand that is not vague; he's dead, long-term.
--
> ...rs, stopping it would be like trying to
Add another couple decades.
> stop file sharing and software piracy.
Everyone knows how to write a P2P application. Nobody knows how to
build an AI. If it's a large-scale effort, the knowledge can be controlled
for a long time.
--
> ...t their head in a meat-slicer
> do so and then simply walk over to the machine which is supposedly
> housing their "consciousness", and flick the power switch... :)
You forgot the dying and the dewar part.
> End of debate. ;)
--
ation
there. I can't tell whether this is identity being a boolean, or substratism
(a belief that one specific medium is magickal and sacred).
--
nology that is available (i.e. not
something which produces a circus freak).
--
On Tue, Jun 26, 2007 at 07:14:25PM +, Niels-Jeroen Vandamme wrote:
> ? Until the planet is overcrowded with your cyberclones.
Planet, shmanet. There's GLYrs of real estate right up there.
--
...'ll be getting considerable retrograde amnesia
in cryonics patients (assuming the suspension can
eventually be made reversible) anyway. Nothing can be
done about it.
> solution, which is obviously a substantial change in
> brain-state.
--
> ...ore than normally living a couple of decades is an original dying,
> which most people don't count as dying. So why call it death now?
--
this still halves your assets (trying to steal would not be a good
idea) each time you make a clone.
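A trivial sketch (Python) of the dilution that implies; the starting figure is arbitrary, chosen only to make the halving visible:

# Each forking splits the current assets evenly between the two resulting
# copies, so after n rounds of cloning each copy holds assets / 2**n.
assets = 1_000_000.0    # arbitrary starting stake
for n in range(6):
    print(f"after {n} clonings: {assets / 2**n:>12,.0f} per copy")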
--
ally, this incident should not have occurred, since the transporter
neither creates nor destroys matter, and Riker's matter stream should only
have had enough within it to produce one whole person.
--
> activity has
> stopped and the physical self and virtual self don't
> diverge. People do have brain activity even while
> unconscious.
But not relevant activity. If you don't know it's
there, it's not relevant.
--
On Mon, Jun 25, 2007 at 08:08:17PM -0400, Jey Kottalam wrote:
> On 6/25/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> >
> >You can only transfer
> >consciousness if you kill the original.
>
> What is the justification for this claim?
Because the copies diverge, unless subject to
synchronization b
>...te. I hope I'm just misunderstanding things because that
Easy: just terminate one prong of the fork before some 100 ms of subjective
time passes. You won't feel a thing.
>sounds horrible.
--
Lunar escape velocity is 2.38 km/s,
and it's less if you want to transfer to Earth orbit, or to the Earth's surface
by aerobraking.
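For what it's worth, the 2.38 km/s figure drops straight out of v_esc = sqrt(2GM/R); a quick sanity check in Python, using textbook constants:

from math import sqrt

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_MOON = 7.342e22    # lunar mass, kg
R_MOON = 1.7374e6    # mean lunar radius, m

v_esc = sqrt(2 * G * M_MOON / R_MOON)   # escape velocity from the lunar surface
print(f"lunar escape velocity: {v_esc / 1000:.2f} km/s")   # ~2.38 km/s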
--
On Fri, Jun 15, 2007 at 03:05:44PM -0300, Lúcio de Souza Coelho wrote:
> It seems that you are trying to equal all rare elements to gold -
No, I'm trying to point out you're extrapolating future demands
from the demands of a primitive society, us. This is guaranteed
to produce wildly inaccurate r
rake a ceramics-coated guided-descent
vehicle well enough that you can airdrop your kiloton of titanium
profiles to GPS coordinates anywhere, however only in packages
of tons each.
You can make a km-sized rock impact Earth, if you have eno
On Sun, Jun 17, 2007 at 07:40:23PM +0100, Mike Tintner wrote:
> One of the interesting 2nd thoughts this provoked in me is the idea: what
> would it be like if you could wake each day with a new body - a totally
What would it be like to wake up embodied as a circumstellar node cloud? Uh,
it wou
On Sun, Jun 17, 2007 at 02:37:22PM -0300, Lúcio de Souza Coelho wrote:
> Given the ever-distributed nature of processing power, I would suspect
> that a superAGI would have no physical form, in the sense that it
Why is it that people always use the singular? A rather strange
collective glitch in the
On Fri, Jun 15, 2007 at 09:59:13PM -0300, Lúcio de Souza Coelho wrote:
> Besides, virtual worlds will have their own resource problems in the
> real world - caused by increasing memory and processing demands.
It takes a tiny fraction of the resources needed for a human mind to
render an absolutely co
On Thu, Jun 14, 2007 at 11:05:23PM -0300, Lúcio de Souza Coelho wrote:
> >Check your energetics. Asteroid mining is promising for space-based
> >construction. Otherwise you'd better at least have controllable fusion
> >rockets.
It is quite useful to utilize space resources where they are,
becau
On Tue, Jun 12, 2007 at 11:20:13PM -0400, Alan Grimes wrote:
> > I'm mostly interested in online hardware which can be easily 0wn3d
> > for bootstrap purposes.
>
> d00d, if you (or I) knew how to program, your hardware needs would be
> embarrassingly pathetic. =P
My programming skills are a bit rusty
On Tue, Jun 12, 2007 at 03:16:30PM +0100, Russell Wallace wrote:
>Not really; TV suffices as vector for parasite memes (have seen
>empirical data to show this, though don't have reference to hand), at
There is some more diversity if you're getting your video feeds
from the grassroots, but
> much more likely, for example, to be impatient
> with, than worried about, AI's rate of progress.
I don't see how that figures.
> Speculations about future AI MINUS future society (which seems to be the
> rule here) are pointless.
You sho
On Tue, Jun 12, 2007 at 06:08:07AM -0700, Chuck Esterbrook wrote:
> Don't you think that as the number of computer users increases there
> will be more interesting people writing code, solving problems,
Not proportionally, unfortunately. The gamer population would grow
more than proportionally, though.
>
On Tue, Jun 12, 2007 at 01:47:04PM +0100, Mike Tintner wrote:
> Er, the point surely is - one billion computer USERS. If true, that is a v.
> big deal.
Why? They're not scientists; not even programmers. Every texting
kid with a smartphone is a computer user.
I'm mostly interested in online hardware which can be easily 0wn3d for bootstrap purposes.
>booming markets in Brazil, Russia, India and China -- the so-called
>[3]BRIC countries. Considering that the personal computer is roughly
>30 years old, that's a fairly substantial acceleration of the growth
>curve. (Thanks to [4]Slashdot for the tip.)
--
as that iterated interactions
between very asymmetrical players have no measurable payoffs for the
bigger player. Because of this, the biosphere only gives, and the humans
only take. With bigger players than us, we only get a chance to see what
the receiving end of habitat destruction looks like. It
> intelligent
> machine, so why are you judging them?
Why the animal chauvinism, sheep? You've never met an intelligent human,
so why are you judging them? Oh, wait...
--
et out to design a human-equivalent AI, you're
considered a crackpot on principle, so you can relax.
--
> ...ojects do represent a lot of
> progress, we're not there yet.
Of course not -- it will take decades or so for a mouse.
Hopefully, this will generate plenty of spinoffs for AI.
--
rary when he
was still alive.
Something similar applies to the fact that an appropriately structured
piece of matter can equal or rival a human-shaped piece of
matter as far as information processing is concerned.
There's no point trying to predict a time frame, though.
--
> others?
Blue Gene/L is de rigueur, since it's a nice 3d torus. I don't expect
dedicated hardware for neural emulation for a long while, since the
current mainstream has a much better ROI.
I'm not sure whether Roadrunner is suitable for neural emulation, though.
Time will tell.
le of "simulation of human mind"
> sounds like a falacy of undue amplification...
There are three projects on large scale neuronal emulation, Blue Brain
being just one of them.
And the scanning technology has been making giant advances in the
last few years -- mm^3 via TEM is qu
Atari and said, 'Hey, we've got this amazing thing,
even built with some of your parts, and what do you think about
funding us? Or we'll give it to you. We just want to do it. Pay our
salary, we'll come work for you.' And they said, 'No.' So then we
o with a lot
of computational resources for AI, don't assume the
rest of us can't.
--
> communicate with their 6 neighbors, so you can map
> the simulation onto a hierarchical network where most of the communication is
> local.
Do you think Google wires their nodes in a 3d torus topology?
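For concreteness, here is the kind of mapping the quoted argument assumes: on an X by Y by Z torus, each node exchanges data only with its six wraparound neighbors, so a volume decomposed that way keeps nearly all traffic local. A minimal Python sketch; the 8 x 8 x 8 dimensions are an arbitrary example, not Blue Gene's actual partition size:

def torus_neighbors(x, y, z, dims):
    """Six nearest neighbors of node (x, y, z) on a 3D torus with wraparound."""
    X, Y, Z = dims
    return [
        ((x - 1) % X, y, z), ((x + 1) % X, y, z),
        (x, (y - 1) % Y, z), (x, (y + 1) % Y, z),
        (x, y, (z - 1) % Z), (x, y, (z + 1) % Z),
    ]

# Example: a corner node of an 8 x 8 x 8 torus wraps around to the far faces.
print(torus_neighbors(0, 0, 7, (8, 8, 8)))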
--
> http://en.wikipedia.org/wiki/Project_02
I recommend you read up on commodity cluster supercomputing (Beowulf).
Not every cluster is a supercomputer.
--
nterconnects
they sure aren't.
> human brain, but of course that is not what they are trying to do. Why would
> they want to copy human motivations?
--
>think it may be possible that we are required to reach a certain level
>of progress before such beings would make themselves known.
Even gods have roaches and kudzu. *These* won't wait to make themselves known.
--
> ...ot the Turing test, then what is it? What test do you propose?
>
> Without a definition, we should stop calling it AGI and focus on the problems
> for which machines are still inferior to humans, such as language or vision.
--
> brains will contain overlapping minds. A
> mailing list where I can explore these themes would be welcome.
--
On Wed, Mar 28, 2007 at 11:37:42AM -0700, Matt Mahoney wrote:
> In a simulation, you don't need to compute it.
A highly selective simulation implies a Simulator, not
just a (meta)natural process. Now this is firmly religion country.
--
> ...d on any
A proton is a damn complex system. I don't see how you could equate it with
one mere bit.
> particle properties).
--
> ...t AGI because it
> can't fall in love", or something to that effect.
Why can't an artificial system fall in love and make babies? If it's
evolutionary, it had damn well better.
--
On Wed, Mar 28, 2007 at 02:54:17PM +1000, Stathis Papaioannou wrote:
>Isn't this essentially the same as the Fermi argument against ET
>intelligence?
It is precisely that. Kurzweil likes to rebrand things under his own
name. A rather irritating trait, that.
--
On Tue, Mar 27, 2007 at 06:50:59PM -0700, Matt Mahoney wrote:
> Of course it could be that a singularity has already happened, and what you
> perceive as the universe is actually a simulation within the resulting
> superintelligence.
Is this a falsifiable theory?
--
...'s plenty
of real estate. In fact, if the dark energy-driven runaway inflation
is true, a lot of what we see is out of touch forever.
--
ecades. I expected nice things from the ALife people, but they
never got enough momentum.
> Unfortunately, this is a deep potential well we are in. It will take
> more than one renegade researcher to dig us out of it.
The hardware is
> ...l acclaim density (using a measure theory of your choice) as
> the injection point is varied randomly in an infinite set of universes.
--
orporeal punishment, or do you settle for posthypnotic
suggestion?
> I do. If you do too then we have no disagreement.
>
> If you disagree then I must think it odd that you try so often to persuade
> me that your arguments are true.
I must admit I frequently have trouble finding arguments
appen.
Put bluntly, some theories are worse than others, as resource allocation
algorithms.
>multiverse theories, and such speculation may guide future scientific
>work. After all, the verifiability/ falsifiability principle is
>*itself* metaphysics by its own c
Infinite spacetime curvature singularities
go away in a number of TOE candidates.
>generate the mathematical Plenitude.
How can you prove that the Moon is not made from green gorgonzola,
when we're not looking?
--
>computations, those which have observers will, as you suggest,
>self-select.
Why would they self-select? I never understood where the information
about which frames of the trajectory to pick would come from, if not from
some place external. The visible universe contains a lot of bits. The