others
in the field (discounting cranks). For better or worse, you need to
be a J. Hawkins or similar. Such is the world we live in.
Cheers,
J. Andrew Rogers
---
singularity
Archives: http://www.listbox.com/member/archive/11983/=now
RSS Feed: http://www.listbox.com/member/archive/rss/11983/
specifically, how the science is going to
inform this process given the existing body of theoretical work, I
would have no problem with the notion. My objections were pragmatic.
Cheers,
J. Andrew Rogers
On Apr 6, 2008, at 4:46 PM, Richard Loosemore wrote:
J. Andrew Rogers wrote:
The fact that the vast majority of AGI theory is pulled out of /dev/
ass notwithstanding, your above characterization would appear to
reflect your limitations which you have chosen to project onto the
broader
k like an attempt to dress a
doughnut up as a wedding cake.
Sure, but what does this have to do with the topic at hand? The
problem is that investors lack any ability to discern a doughnut from
a wedding cake.
J. Andrew Rogers
not answer these basic questions in a satisfactory manner, which
may or may not reflect what they "know".
J. Andrew Rogers
that comports with reality. Over the years
I have slowly come to believe that the long track record of failure in
AI is a minor contributor to the relative dearth of funding for bold
AI ventures -- the problem has never been a lack of people willing to
take a risk per se.
J. Andrew Rogers
"obvious" when no concrete example exists.
Successfully doing this is far, far more difficult than I suspect most
people who have not tried believe.
J. Andrew Rogers
statement leads me to believe you have little experience
with funding speculative technology ventures of the scale being
discussed here. The dynamic is considerably, and rightly, more
complicated than this. A truly compelling concept and a dollar will
buy you a cup
some sort that
is separate from the actual core capabilities.
J. Andrew Rogers
individuals, and most have traditionally been
written by small teams.
J. Andrew Rogers
On Feb 2, 2008, at 12:49 PM, Eric B. Ramsay wrote:
I noticed that the members of the list have completely ignored this
pronouncement by A.T. Murray. Is there a reason for this (for
example is this person considered fringe or worse)?
Bingo.
J. Andrew Rogers
-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com
pattern for you. If
you ever get the feeling people ignore you, this might have something
to do with it. Google first, post later.
J. Andrew Rogers
On Oct 23, 2006, at 7:00 PM, Anna Taylor wrote:
On 10/23/06, J. Andrew Rogers <[EMAIL PROTECTED]> wrote:
So you could say that the economics of responding to the mere threat
of war is adequate to drive all the research the military does.
Yes I agree but why is the threat of war alwa
On Oct 23, 2006, at 6:43 PM, Gregory Johnson wrote:
I most certainly am not a proponent of the military industrial
complex as opposed to the Japanese and German business models, but it
is my sense that that is not where the world is headed at the moment.
Huh?
J. Andrew Rogers
there will always be
plenty of reason to do the R&D without ever firing a shot. While I
am doubtful that the military R&D programs will directly yield AGI,
they do fund a lot of interesting blue sky research.
J. Andrew Rogers