Hi (sorry for the delay),

> When refreshing pages, one has the option to specify whether or not to
> get images and other files referenced in the page.  Is it also possible
> to recurse to level x when *monitoring* the page?  I do not see how to
> do that.  How can I tell WWWOFFLE to monitor a page with all images to
> recursion depth 2 for example?

You need to select the page that you want to monitor using the refresh
options page.  There you will find an option to monitor the page; if
you select it, you get the normal list of monitor options.


> I have a local HTML file with links I refresh from time to time.  I have
> stored it locally under http://localhost:8080/local/refresh.html.  What
> I would like WWWOFFLE to do is to monitor that page daily to depth 1.
> Problem is that WWWOFFLE apparently cannot request pages from itself.

This is correct: WWWOFFLE does not allow you to monitor a page from
itself.  I guess that I didn't think about the possibility of using it
the way that you want to.  In principle there is no reason for it to
monitor or refresh a page from itself.

> Is there a way I can trick the program into doing what I want?  I tried
> using another host name that also resolves to 127.0.0.1 but that did not
> work.  WWWOFFLE even gave me an error and suggested that I should report
> it to the author if the problem reoccurred.

You could try using the command line rather than the WWWOFFLE monitor
option:

wwwoffle ~/url-list.html

You can also combine this with any of the other command line options
such as '-gi' and '-F'.
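Since you want the page refreshed daily, one way to automate the
command-line approach is a cron entry.  This is only a sketch: the
schedule is an assumption, '-gi' and '-F' are just the options
mentioned above (check 'wwwoffle -h' for their exact meaning), and the
path to the URL list is taken from your description.

```shell
# Hypothetical crontab entry: every day at 06:00, ask WWWOFFLE to
# fetch the pages linked from the local URL list, passing the same
# options discussed above.
0 6 * * * wwwoffle -gi -F $HOME/url-list.html
```

Note that crontab does not expand '~', so the example spells out
$HOME instead.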


> How do I request password protected pages like
> http://quotes.ubs.com/myquotes/ ?  I tried fetching it normally, then
> giving the password and refetching the resulting page and of course the
> usual suspect http://user:[EMAIL PROTECTED]/myquotes/.
> Unfortunately, both did not yield the desired result.  What am I doing
> wrong?

I don't know how the password protection works on these pages.  If it
uses cookies then there is no way to do it with WWWOFFLE.

If it doesn't use cookies, but uses the normal HTTP authentication
methods then have a look at the Authentication Problems section of

http://www.gedanken.demon.co.uk/wwwoffle/version-2.6/server.html


> My last question deals with the possibility to add files to the WWWOFFLE
> cache without downloading them piece by piece.  Some websites like
> http://go.to/wingnus offer the option to download the complete website
> in a single compressed file which is of course faster and much more
> convenient than spidering through all the pages.  Some homepages are
> even available on CD-ROM (I know about one HTML tutorial).  Now I was
> wondering if it is possible to merge those files into the wwwoffle
> cache.  Is there a tool or a procedure to do that?

There is no ready-made procedure for this, but the wwwoffle-tools
programs will allow you to write arbitrary files into the WWWOFFLE
cache.  Have a look at the WWWOFFLE README file for information about
these programs.  They work, but you need to follow the instructions
and try it yourself; there is no automatic procedure for doing it.
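As a rough illustration of the sort of thing the README describes:
assuming the wwwoffle-write program from wwwoffle-tools takes a URL as
its argument and reads the page data from standard input (check the
README for the real usage), and assuming you have unpacked the
downloaded archive into a directory, something like the following
could copy each file into the cache under its original URL.  The
directory name and URL prefix here are made-up placeholders.

```shell
# Hypothetical sketch: write every file from an unpacked website
# archive into the WWWOFFLE cache, reconstructing each URL from the
# file's relative path.  Verify wwwoffle-write's usage in the README
# before trying this for real.
cd unpacked-site || exit 1
find . -type f | while read -r f; do
    url="http://go.to/wingnus/${f#./}"
    wwwoffle-write "$url" < "$f"
done
```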

-- 
Andrew.
----------------------------------------------------------------------
Andrew M. Bishop                             [EMAIL PROTECTED]
                                      http://www.gedanken.demon.co.uk/

WWWOFFLE users page:
        http://www.gedanken.demon.co.uk/wwwoffle/version-2.6/user.html
