I might suggest Resource Description Framework (RDF).
I can't vouch for it personally, but it seems to be pushed by
O'Reilly and the W3C.

http://www.xml.com/pub/a/2001/01/24/rdf.html
http://www.w3.org/RDF/
http://ilrt.org/discovery/
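
Just to make that concrete, freesite metadata in RDF might look
something like the snippet below. This is only a sketch: the Dublin
Core properties are real, but attaching them to an SSK key this way
is my own guess at how it could work.

  <?xml version="1.0"?>
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
           xmlns:dc="http://purl.org/dc/elements/1.1/">
    <rdf:Description rdf:about="SSK@<public key>/mysite//">
      <dc:title>My Freesite</dc:title>
      <dc:subject>freenet, search, indexing</dc:subject>
      <dc:description>A short description for indexers.</dc:description>
    </rdf:Description>
  </rdf:RDF>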

Looks like they've got some open source supporting it
on multiple platforms, including Java.  The whole
metadata area may not be completely ripe yet; there
still seems to be a lot of work being done.  I think I
remember reading a paper about trying to distribute
searches through metadata.  That's probably way outside
the scope of Freenet anyway.  I like the idea of the
user combining a few trusted indexes.
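
Combining them client-side could be as simple as taking the union of
their posting lists.  A toy Java sketch; the keyword-to-keys map is
just a stand-in for whatever index format emerges:

  import java.util.*;

  // Toy sketch: each index maps a keyword to the set of freenet
  // keys it appears under.  Merging a few trusted indexes is just
  // a union of postings per keyword.
  class IndexMerger {
      static Map<String, Set<String>> merge(List<Map<String, Set<String>>> indexes) {
          Map<String, Set<String>> merged = new HashMap<String, Set<String>>();
          for (Map<String, Set<String>> idx : indexes) {
              for (Map.Entry<String, Set<String>> e : idx.entrySet()) {
                  Set<String> keys = merged.get(e.getKey());
                  if (keys == null) {
                      keys = new HashSet<String>();
                      merged.put(e.getKey(), keys);
                  }
                  keys.addAll(e.getValue());
              }
          }
          return merged;
      }
  }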

I think Google updates its search data every 3 months
or so, so I imagine making a fresh index every 3
months, with maybe some patches in between.
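
The in-between patches could then be small deltas replayed on top of
the base index.  Another toy sketch, using a made-up line format of
"+ keyword key" to add a posting and "- keyword key" to drop one:

  import java.io.*;
  import java.util.*;

  class IndexPatcher {
      // Replay a patch over a base index.  The "+/-" line format
      // is invented here purely for illustration.
      static void applyPatch(Map<String, Set<String>> index,
                             BufferedReader patch) throws IOException {
          String line;
          while ((line = patch.readLine()) != null) {
              String[] parts = line.split(" ", 3);
              if (parts.length != 3) continue;
              Set<String> keys = index.get(parts[1]);
              if (parts[0].equals("+")) {
                  if (keys == null) {
                      keys = new HashSet<String>();
                      index.put(parts[1], keys);
                  }
                  keys.add(parts[2]);
              } else if (parts[0].equals("-") && keys != null) {
                  keys.remove(parts[2]);
              }
          }
      }
  }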

Question: how does the indexer find out about the
sites to build an index?  Was I right that nodes are
able to sniff the public keys of freesites?  Maybe
the indexer could just request SSK@<public
key>/<metaformatname>.  If he can't find it, he could
just assume the site didn't want to be indexed.  I
guess the other alternative would be sending a NIM to
the indexer.
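
For the SSK-request approach, something like this might do.  It
assumes fproxy is listening on its usual port 8888, and the
metaformat name and the "no file means opt-out" reading are my
assumptions:

  import java.io.InputStream;
  import java.net.HttpURLConnection;
  import java.net.URL;

  class MetadataFetcher {
      // Ask the local node (fproxy assumed on 127.0.0.1:8888) for a
      // site's metadata file; a failed fetch is read as "don't index me".
      static InputStream fetch(String publicKey, String metaName)
              throws Exception {
          URL url = new URL("http://127.0.0.1:8888/SSK@" + publicKey
                            + "/" + metaName);
          HttpURLConnection conn = (HttpURLConnection) url.openConnection();
          if (conn.getResponseCode() != 200) {
              return null;  // assume the site opted out of indexing
          }
          return conn.getInputStream();
      }
  }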

Chris

 --- michael <[EMAIL PROTECTED]> wrote:
> Toad <[EMAIL PROTECTED]> writes:
> 
> > On Tue, Jun 24, 2003 at 11:27:41AM -0700, michael wrote:
> >> 
> >> So what if someone came up with a tool to do this kind of
> >> searching and indexing, and the query of the index could be run
> >> on your local node? The indexes themselves could be distributed
> >> as content within Freenet (so you're not compromising anonymity
> >> by using them), and since the indexes are inserted under a
> >> certain key, they'd be no more or less vulnerable to poisoning
> >> than trusting one of the existing index sites. Instead of going
> >> to tfe's page each day, you'd grab your search index from tfe
> >> and perform your keyword search on your local node.
> >> Am I missing something obvious here?
> >
> > That is the way to go, when things get big. More convenient than
> > the sites perhaps, but not more secure. Although you could
> > combine several anonymously inserted indexes using one client.
> > Oh, and it's likely to be pretty slow unless the indexes are so
> > small that you can fetch the whole thing every day.
> 
> Combining multiple indexes into one search client makes a lot of
> sense. It would lower susceptibility to a particular indexer's
> bias (either intentional or algorithmic), and it would help with
> the "indexes get big" problem if you could fetch large indexes
> once (in a while) and incremental indexes most of the time.
> 
> Now all we need is a small, easily mergeable, quickly searchable,
> standardized index format... and the tools to go with it. :)
> 
> -michael

_______________________________________________
devl mailing list
[EMAIL PROTECTED]
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/devl
