According to Dominik Marti:
> I'm currently developing an application that automatically searches for
> articles on remote sites. Registered users have stored their email
> addresses, which sites should be searched, and which keywords the
> search engine should look for. This information is stored in a
> database. Finally, any links that are found should be mailed to the
> corresponding user.
> 
> Is it possible to do that with ht://Dig, or is there a better solution?
> If it's possible, could you tell me how I can do that with ht://Dig? I
> intend to use an existing search engine!

This depends a lot on what these "remote sites" are.  If you're dealing
with a reasonably small number of sites that you can index regularly
with htdig, and you have disk space for an index of these sites, then
the project seems feasible.  If these remote sites can potentially be
any web sites out on the Internet, then you'd be better off sending the
queries off to a general web search engine like Google.
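If you do go the htdig route, the per-user queries could be sent to the htsearch CGI over HTTP. Here's a rough sketch in Python of what that might look like; the server hostname, CGI path, and template name are all placeholders for whatever your installation actually uses ("words", "method" and "format" are standard htsearch input parameters):

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Placeholder location of the htsearch CGI; adjust for your server.
HTSEARCH_URL = "http://localhost/cgi-bin/htsearch"

def build_query_url(keywords, method="and", fmt="builtin-short"):
    """Build an htsearch GET URL for a list of keywords.

    'method' may be "and", "or" or "boolean"; 'format' selects a
    result template ("builtin-short" or "builtin-long" by default).
    """
    params = urlencode({
        "words": " ".join(keywords),
        "method": method,
        "format": fmt,
    })
    return "%s?%s" % (HTSEARCH_URL, params)

def run_query(keywords):
    """Fetch the result page for the keywords (needs a live server)."""
    with urlopen(build_query_url(keywords)) as resp:
        return resp.read().decode("iso-8859-1", "replace")
```

You'd still have to parse the links out of the returned HTML, or define a stripped-down htsearch result template that emits one URL per line to make that trivial.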

In either case, it seems to me the bulk of this project would be to design
the system that takes user requests, queues them up, and then runs the
queries one by one and mails the results.  I think a discussion of how
to do this is way beyond the scope of this mailing list.
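That said, the queue-and-mail side might be sketched in Python along these lines. Everything here is an assumption about your setup: the "requests" table and its columns, the sender address, and the SMTP host are all hypothetical, and the actual search function is left pluggable so it could call htsearch or anything else:

```python
import smtplib
import sqlite3
from email.message import EmailMessage

def load_requests(db_path):
    """Read (email, keywords) pairs from a hypothetical 'requests' table."""
    with sqlite3.connect(db_path) as db:
        return [(email, keywords.split())
                for email, keywords in
                db.execute("SELECT email, keywords FROM requests")]

def format_mail(user_email, links):
    """Build the notification message for one user."""
    msg = EmailMessage()
    msg["To"] = user_email
    msg["From"] = "searchbot@example.com"   # placeholder sender
    msg["Subject"] = "New article links (%d found)" % len(links)
    msg.set_content("\n".join(links))
    return msg

def process_queue(db_path, search, smtp_host="localhost"):
    """Run each stored query in turn and mail the results.

    'search' is whatever function performs the actual query (e.g. a
    call to htsearch) and returns a list of URLs.
    """
    for user_email, keywords in load_requests(db_path):
        links = search(keywords)
        if links:
            with smtplib.SMTP(smtp_host) as smtp:
                smtp.send_message(format_mail(user_email, links))
```

A real version would also have to remember which links were already mailed to each user, so people only hear about new articles, but that bookkeeping is exactly the application-design work that falls outside this list's scope.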

-- 
Gilles R. Detillieux              E-mail: <[EMAIL PROTECTED]>
Spinal Cord Research Centre       WWW:    http://www.scrc.umanitoba.ca/~grdetil
Dept. Physiology, U. of Manitoba  Phone:  (204)789-3766
Winnipeg, MB  R3E 3J7  (Canada)   Fax:    (204)789-3930

_______________________________________________
htdig-general mailing list <[EMAIL PROTECTED]>
To unsubscribe, send a message to <[EMAIL PROTECTED]> with a 
subject of unsubscribe
FAQ: http://htdig.sourceforge.net/FAQ.html
