Matt,

Perhaps you are right.

But one problem is that within the next five to ten years, big Google-like
computing complexes will be powerful enough to do AGI, and they will have a
large advantage in AGI search: the physical closeness of their machines lets
them provide the massive interconnection that powerful AGI needs far more
efficiently than a wide-area network can.
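
To put rough numbers on the interconnect point, here is a small Python
sketch; the round-trip times and hop count are illustrative assumptions on
my part, not measurements:

# Back-of-envelope comparison of network latency cost for a deep,
# multi-hop AGI search run inside one datacenter versus across the
# public Internet.  All figures below are assumed for illustration.

INTRA_DATACENTER_RTT = 0.0005   # ~0.5 ms round trip between racks (assumed)
WIDE_AREA_RTT        = 0.100    # ~100 ms round trip between home peers (assumed)
HOPS_PER_QUERY       = 20       # assumed depth of one chain of dependent messages

def latency_cost(rtt_seconds, hops):
    """Time spent just waiting on the network for one serial chain of hops."""
    return rtt_seconds * hops

print("datacenter:", latency_cost(INTRA_DATACENTER_RTT, HOPS_PER_QUERY), "s")  # ~0.01 s
print("wide area: ", latency_cost(WIDE_AREA_RTT, HOPS_PER_QUERY), "s")         # ~2 s

On these assumptions, a serial chain of dependent messages finishes a couple
of hundred times faster inside the datacenter, which is the advantage I am
pointing to.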

Ed Porter

-----Original Message-----
From: Matt Mahoney [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, December 04, 2007 9:18 PM
To: agi@v2.listbox.com
Subject: RE: Hacker intelligence level [WAS Re: [agi] Funding AGI research]


--- Ed Porter <[EMAIL PROTECTED]> wrote:

> >MATT MAHONEY=====> My design would use most of the Internet (10^9 P2P
> nodes).
> ED PORTER=====> That's ambitious.  Easier said than done unless you have a
> Google, Microsoft, or mass popular movement backing you.

It would take some free software that people find useful.  The Internet
has been transformed before.  Remember when there were no web browsers and
no search engines?  You can probably think of transformations that would
make the Internet more useful.  Centralized search is limited to a few big
players that can keep a copy of the Internet on their servers.  Google is
certainly useful, but imagine if it searched a space 1000 times larger and
if posts were instantly added to its index, without having to wait days for
its spider to find them.  Imagine your post going to persistent queries
posted days earlier.  Imagine your queries being answered by real human
beings in addition to other peers.

I probably won't be the one writing this program, but where there is a
need, I expect it will happen.
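
Here is a minimal sketch in Python of the persistent-query idea; the Peer
class and its methods are hypothetical, just to illustrate the mechanism,
not part of any existing system:

# A peer holds "persistent queries" and matches each newly posted message
# against them, so a post can answer a query registered days earlier.

class Peer:
    def __init__(self):
        self.persistent_queries = []   # list of (query_terms, owner_address)

    def register_query(self, terms, owner):
        """Store a standing query; it stays active until the owner withdraws it."""
        self.persistent_queries.append((set(terms), owner))

    def post(self, message_text):
        """No central index: just test the new post against standing queries."""
        words = set(message_text.lower().split())
        hits = [owner for terms, owner in self.persistent_queries
                if terms <= words]     # every query term appears in the post
        return hits                    # addresses to notify immediately

peer = Peer()
peer.register_query(["agi", "funding"], owner="peer_42")
print(peer.post("New thread on AGI funding models"))   # -> ['peer_42']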


> In a message passing network, the critical parameter is the ratio of
> messages out to messages in.  The ratio cannot exceed 1 on average.
> ED PORTER=====> Thanks for the info.  By "unmaintainable" what do you mean?
> 
> I don't understand why more messages coming in than going out creates a
> problem, unless most of what nodes do is relay messages, which is not
> what they do in my system.

I meant the other way, which would flood the network with duplicate
messages.  But I believe the network would be stable against this, even in
the face of spammers and malicious nodes, because most nodes would be
configured to ignore duplicates and any messages they deem irrelevant.
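
A small Python sketch of why duplicate suppression keeps flooding in check;
the class and function names are hypothetical, just to illustrate the idea:

# Each peer forwards a given message at most once and drops anything it
# deems irrelevant, so repeated injection of the same message by a spammer
# produces no additional forwarding traffic.

import hashlib

class GossipNode:
    def __init__(self, neighbors, is_relevant):
        self.neighbors = neighbors          # peers we would forward to
        self.is_relevant = is_relevant      # caller-supplied relevance filter
        self.seen = set()                   # hashes of messages already handled

    def receive(self, message):
        digest = hashlib.sha256(message.encode()).hexdigest()
        if digest in self.seen or not self.is_relevant(message):
            return []                       # drop duplicates and irrelevant traffic
        self.seen.add(digest)
        return [(peer, message) for peer in self.neighbors]   # forward once

node = GossipNode(neighbors=["A", "B"], is_relevant=lambda m: "agi" in m.lower())
print(len(node.receive("AGI query")))   # 2 forwards the first time
print(len(node.receive("AGI query")))   # 0 the second time: duplicate ignored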


-- Matt Mahoney, [EMAIL PROTECTED]
