Hi there.

Does anyone have experience with indexing
a large number of files? I have used ht://dig
successfully on smaller sites and am very
satisfied with it, but this one is different.

I have about 200,000 plain text files
spread over a few hundred, maybe a thousand, directories.
File sizes range from a few bytes to, occasionally,
over 1 MB. All in all this adds up to 1.2 GB
of data, growing daily. The files do not
contain HTML code, and I need them to be
indexed at least daily (that is, nightly ;-)
Most of the files are static; only a few of them
change, say 100-200 a day.
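
For what it's worth, since so few files change, I have
been wondering about feeding the indexer only the changed
files instead of re-reading all 200,000 every night.
A rough sketch of the idea (the root path and the
24-hour window are just assumptions for illustration):

    #!/usr/bin/env python
    # List files modified in the last 24 hours, as
    # candidate input for an incremental re-index.
    import os
    import time

    ROOT = "/data/docs"  # hypothetical top directory
    cutoff = time.time() - 24 * 3600

    for dirpath, dirnames, filenames in os.walk(ROOT):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) >= cutoff:
                    print(path)
            except OSError:
                pass  # file vanished in the meantime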

We have been using glimpse so far, but it is
not working well; indexing takes much too long,
among other problems.

I wonder if ht://dig could help here, or if
anyone knows of a different solution.

Thanks in advance,
Marcel

