Ben,
Mapping RRA to Hegel's space isn't trivial, but here goes...
On 11/19/08, Ben Goertzel [EMAIL PROTECTED] wrote:
I have nothing against Hegel; I think he was a great philosopher. His
Logic is really fantastic reading. And, having grown up surrounded by
Marxist wannabe-revolutionaries
--- On Tue, 11/18/08, Mark Waser [EMAIL PROTECTED] wrote:
add-rule kill-file Matt Mahoney
Mark, whatever happened to that friendliness-religion you caught a few months
ago?
Anyway, with regard to grounding, internal feedback, and volition, autobliss
already has two of these three properties,
On Tue, Nov 18, 2008 at 11:23 PM, Matt Mahoney [EMAIL PROTECTED] wrote:
Steve, what is the purpose of your political litmus test? If you are trying
to assemble a team of seed-AI programmers with the correct ethics, forget
it. Seed AI is a myth.
http://www.mattmahoney.net/agi2.html (section 2).
--- On Wed, 11/19/08, Daniel Yokomizo [EMAIL PROTECTED] wrote:
On Tue, Nov 18, 2008 at 11:23 PM, Matt Mahoney
[EMAIL PROTECTED] wrote:
Seed AI is a myth.
http://www.mattmahoney.net/agi2.html (section 2).
(I'm assuming you meant section 5.1,
Recursive Self-Improvement)
That too, but
BTW, for those who are newbies to this list, Matt's argument attempting to
refute RSI was extensively discussed on this list a few months ago.
In my view, I refuted his argument pretty clearly, although he does not
agree.
His mathematics is correct, but seemed to me irrelevant to real-life RSI
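For readers new to that old thread, a toy model (purely illustrative, not anyone's actual argument; all numbers are made up) makes the shape of the disagreement concrete: self-improvement whose per-generation gains compound, versus self-improvement where each rewrite buys less than the last.

```python
# Toy model of iterated self-improvement: each generation's "capability"
# determines what it can do for its successor. Two regimes often debated:
# multiplicative gains (capability compounds) vs. diminishing returns
# (each rewrite helps less). The update rules below are placeholders.

def run(generations, improve):
    capability = 1.0
    history = [capability]
    for _ in range(generations):
        capability = improve(capability)
        history.append(capability)
    return history

compounding = run(10, lambda c: c * 1.1)      # gains proportional to skill
diminishing = run(10, lambda c: c + 1.0 / c)  # each rewrite buys less
```

Whether real RSI looks like either curve is exactly what was in dispute; the sketch only separates the two claims.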
On Wed, Nov 19, 2008 at 1:21 PM, Matt Mahoney [EMAIL PROTECTED] wrote:
--- On Wed, 11/19/08, Daniel Yokomizo [EMAIL PROTECTED] wrote:
On Tue, Nov 18, 2008 at 11:23 PM, Matt Mahoney
[EMAIL PROTECTED] wrote:
Seed AI is a myth.
http://www.mattmahoney.net/agi2.html (section 2).
(I'm assuming
Ben:
On 11/18/08, Ben Goertzel [EMAIL PROTECTED] wrote:
This sounds an awful lot like the Hegelian dialectical method...
Your point being?
We are all stuck in Hegel's Hell whether we like it or not. Reverse Reductio
ad Absurdum is just a tool to help guide us through it.
There seems to be
Back to reality for a moment...
I have greatly increased the IQs of some pretty bright people since I
started doing this in 2001 (the details are way off topic here, so contact
me off-line for more if you are interested), and now, others are also doing
this. I think that these people give us a
--- On Wed, 11/19/08, Daniel Yokomizo [EMAIL PROTECTED] wrote:
I just want to be clear: you agree that an agent is able to create a
better version of itself, not just in terms of a badly defined measure
like IQ but also in terms of resource utilization.
Yes, even bacteria can do this.
Do
To all,
I am considering putting up a web site to filter the crazies as follows,
and would appreciate all comments, suggestions, etc.
Everyone visiting the site would get different questions, in different
orders, etc. Many questions would have more than one correct answer, and in
many cases,
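A minimal sketch of that randomization scheme, assuming a per-visitor random draw and a set of accepted answers per question (the question bank and scoring rule here are hypothetical placeholders, not Steve's actual design):

```python
import random

# Each visitor draws a different sample of questions, in a different
# order, and a question may accept more than one answer.

QUESTION_BANK = {
    "q1": {"text": "2 + 2 = ?", "accepted": {"4", "four"}},
    "q2": {"text": "Capital of France?", "accepted": {"paris"}},
    "q3": {"text": "A prime below 5?", "accepted": {"2", "3"}},
}

def draw_quiz(rng, n=2):
    """Pick n distinct question ids in a per-visitor random order."""
    return rng.sample(sorted(QUESTION_BANK), n)

def grade(answers):
    """answers: {question_id: visitor_answer}; any accepted answer counts."""
    return sum(
        1 for qid, ans in answers.items()
        if ans.strip().lower() in QUESTION_BANK[qid]["accepted"]
    )
```

Seeding the generator per visitor would give each of them a reproducible but distinct quiz.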
Hi Steve
I am not an expert, so correct me if I am wrong. As I see it, everyday
logical arguments (and rationality?) are based on standard classical logic
(or something very similar). Yet I am (sadly) not aware of a convincing
argument that this logic is the right one to accept. You
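One standard illustration of that point: the law of excluded middle, a classical tautology, fails in Łukasiewicz three-valued logic (truth values 0, 1/2, 1, with negation p ↦ 1 − p and disjunction as max). A small check of this, offered only as an example of a non-classical alternative:

```python
from fractions import Fraction

# Łukasiewicz three-valued connectives.
def neg(p):
    return 1 - p

def disj(p, q):
    return max(p, q)

def excluded_middle(p):
    """Evaluate p OR NOT-p at truth value p."""
    return disj(p, neg(p))

values = [Fraction(0), Fraction(1, 2), Fraction(1)]
# A tautology would evaluate to 1 at every truth value; here it does not:
# at p = 1/2 the formula evaluates to 1/2.
is_tautology = all(excluded_middle(p) == 1 for p in values)
```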
2008/11/18 Steve Richfield [EMAIL PROTECTED]:
I am considering putting up a web site to filter the crazies as follows,
and would appreciate all comments, suggestions, etc.
This all sounds peachy in principle, but I expect it would exclude
virtually everyone except perhaps a few of the most
On Tue, Nov 18, 2008 at 8:38 PM, Bob Mottram [EMAIL PROTECTED] wrote:
I think most people have at least a few beliefs which cannot be strictly
justified rationally
You would think that. :)
Trent
3. A statement in their own words that they hereby disavow allegiance
to any non-human god or alien entity, and that they will NOT follow the
directives of any government led by people who would obviously fail this
test. This statement would be included on the license.
Hmmm... don't I fail
Steve Richfield wrote:
To all,
I am considering putting up a web site to filter the crazies as
follows, and would appreciate all comments, suggestions, etc.
Everyone visiting the site would get different questions, in different
orders, etc. Many questions would have more than one correct
On Tue, Nov 18, 2008 at 1:22 PM, Richard Loosemore wrote:
I see how this would work: crazy people never tell lies, so you'd be able
to nail 'em when they gave the wrong answers.
Yup. That's how they pass lie detector tests as well.
They sincerely believe the garbage they spread around.
Martin,
On 11/18/08, martin biehl [EMAIL PROTECTED] wrote:
I don't know what reverse reductio ad absurdum is, so it may not be a
precise counterexample, but I think you get my point.
HERE is the crux of my argument: other forms of logic fall short of being
adequate to run a world with.
This sounds an awful lot like the Hegelian dialectical method...
ben g
On Tue, Nov 18, 2008 at 5:29 PM, Steve Richfield
[EMAIL PROTECTED] wrote:
Martin,
On 11/18/08, martin biehl [EMAIL PROTECTED] wrote:
I don't know what reverse reductio ad absurdum is, so it may not be a
precise
Bob,
On 11/18/08, Bob Mottram [EMAIL PROTECTED] wrote:
2008/11/18 Steve Richfield [EMAIL PROTECTED]:
I am considering putting up a web site to filter the crazies as
follows,
and would appreciate all comments, suggestions, etc.
This all sounds peachy in principle, but I expect it would
Ben,
On 11/18/08, Ben Goertzel [EMAIL PROTECTED] wrote:
3. A statement in their own words that they hereby disavow allegiance
to any non-human god or alien entity, and that they will NOT follow the
directives of any government led by people who would obviously fail this
test. This
Richard and Bill,
On 11/18/08, BillK [EMAIL PROTECTED] wrote:
On Tue, Nov 18, 2008 at 1:22 PM, Richard Loosemore wrote:
I see how this would work: crazy people never tell lies, so you'd be
able
to nail 'em when they gave the wrong answers.
Yup. That's how they pass lie detector tests as
Could we please stick to discussion of AGI?
-Ben
From: Steve Richfield [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, 19 November 2008 10:39 AM
To: agi@v2.listbox.com
Subject: Re: [agi] My prospective plan to neutralize AGI and other dangerous
technologies...
Richard and Bill,
On 11
Richfield [EMAIL PROTECTED] wrote:
From: Steve Richfield [EMAIL PROTECTED]
Subject: Re: [agi] My prospective plan to neutralize AGI and other dangerous
technologies...
To: agi@v2.listbox.com
Date: Tuesday, November 18, 2008, 6:39 PM
Richard and Bill,
On 11/18/08, BillK [EMAIL PROTECTED] wrote:
On Tue
Sent: Tuesday, November 18, 2008 8:23 PM
Subject: **SPAM** Re: [agi] My prospective plan to neutralize AGI and other
dangerous technologies...
Steve, what is the purpose of your political litmus test? If you are
trying to assemble a team of seed-AI programmers with the correct ethics,
forget
has, that he would question that goal.
Thanks everyone for your comments.
Steve Richfield
--- On Tue, 11/18/08, Steve Richfield [EMAIL PROTECTED] wrote:
From: Steve Richfield [EMAIL PROTECTED]
Subject: Re: [agi] My prospective plan to neutralize AGI and other
dangerous