> > I run htdig on Mac OS X Server on a G3 with 384 MB of physical memory.
> > When htdig hangs the server, the database is about 800 MB large. Is it
> > possible that the site, and hence the database, is too large for htdig
> > and the hardware to handle?
>
>       I would say that you have answered your own question. :)

        Actually, all you need to do is run 'htdig' in multiple passes, then
htmerge the resulting databases together.
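
        For example, a rough sketch with the site split across two hypothetical
config files, part1.conf and part2.conf, each pointing start_url at a different
part of the site and using its own database_dir (the file names are placeholders,
and the '-m merge_configfile' option assumes a reasonably recent htmerge):

    htdig -i -c part1.conf
    htmerge -c part1.conf
    htdig -i -c part2.conf
    htmerge -c part2.conf
    htmerge -c part1.conf -m part2.conf

The final htmerge with -m is what folds the second set of databases into the
first; check that your release supports it before relying on this.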

        The 'htdig' process consumes more and more memory as it runs. This might be
due to memory leaks, or it might legitimately be due to it keeping track of
all the URLs it still has to process. I tried htdigging 250,000 documents and hit
about 180 MB.

        DS

