Dan Jacobson <[EMAIL PROTECTED]> writes:

> [I'm using 2.7 beta still] here are accumulated comments:
> 
> [Sometimes I request .tgz, .tar.gz files, but they end up as
> uncompressed tar files still named .tgz, .tar.gz on disk... I didn't
> explore further if this is netscape or wwwoffle's problem.]

I think that I have seen Netscape do this before.  I can't be sure
that it is not WWWOFFLE though; there do seem to be some problems at
the moment.

> Hmmmm, no way to add to the monitored items without starting the
> interface.  Can add to the outgoing items from the command line, but
> not the monitored items.  Also, no way to get a report on the
> monitor parameters with wwwoffle-ls etc., must start the interface, etc.

> Indeed, I want to add about 20 pages to the monitor list from the
> command line.  All their parameters are to be the same.  Frustration:
> the user must do these one by one from the window.  Ray of hope: I
> will hack wwwoffle/monitored/*  Ray of despair: looks too complicated. 

There are quite a lot of parameters for the monitor requests.  Indeed
there are four comma-separated lists (months of year, days of week,
days of month, hours of day).  Whilst a command-line interface could
be added, I don't imagine that many people would ever use it.  Isn't
this what 'lynx -post_data' is for :-)
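For what it's worth, a batch submission could be scripted around that.
The sketch below is a guess, not a recipe: the field names (url, MOY,
DOW, DOM, HOD) are hypothetical placeholders for the four lists above,
so check the real monitor form source before relying on them.

```shell
#!/bin/sh
# Sketch only: the field names (url, MOY, DOW, DOM, HOD) are hypothetical
# placeholders for the real monitor-form parameters; a real script would
# also percent-encode the URL.
build_post() {
    # Assemble one x-www-form-urlencoded body for a monitor request,
    # using the same schedule (3am every day) for every URL.
    printf 'url=%s&MOY=*&DOW=*&DOM=*&HOD=3' "$1"
}

# Dry run: print the body that would be posted for each URL.
for url in http://www.example.com/a http://www.example.com/b; do
    build_post "$url"; echo
done
```

Each body could then be piped to the form with something like
{ build_post "$url"; printf '\n---\n'; } | lynx -post_data <form URL>,
since lynx reads the POST data from stdin up to a line of '---'.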

> By the way, before I connect, I like to see all the things we are
> about to fetch.  I can do wwwoffle-ls outgoing; wwwoffle-ls monitored,
> however there is no way to list only the monitored items that will
> be fetched and skip the ones that won't.

As a Lynx user and shell script wizard I thought that you would have
guessed this one:

lynx -dump http://localhost:8080/index/monitor | grep N=0:0
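The same filter in reusable form (assuming, as above, a proxy on
localhost:8080 and that "N=0:0" marks the items due on the next fetch):

```shell
#!/bin/sh
# due_only keeps just the monitor-index lines marked as due for the
# next connection; the "N=0:0" marker is taken from the grep above.
due_only() {
    grep 'N=0:0'
}

# Demonstration on canned index lines; a real run would be
#   lynx -dump http://localhost:8080/index/monitor | due_only
printf 'http://a/ [N=0:0]\nhttp://b/ [N=3:12]\n' | due_only
```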

> Just playing around, and
> # wwwoffle-hash http://groups.google.com/
> Segmentation fault

No problem here.

> I notice there are lots of 0 byte files accruing
> /var/spool/wwwoffle/temp/tmp.25635 ...
> about 20 since December

These can only be caused if the WWWOFFLE process dies.  I can't guess
what the cause was without knowing the URL and the options that you
have set.  You might want to check your syslog or WWWOFFLE logfile for
any "terminated with signal 11" messages.
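A rough sketch of that check (the log path and exact message text are
assumptions based on the above; Debian usually logs to /var/log/syslog):

```shell
#!/bin/sh
# Sketch: filter log lines for WWWOFFLE segfault records.  The message
# text is taken from the paragraph above; the path is an assumption.
crash_lines() {
    grep 'terminated with signal 11'
}

# Demonstration on canned log lines; a real check would be e.g.
#   crash_lines < /var/log/syslog
# together with listing the leftover zero-byte files:
#   find /var/spool/wwwoffle/temp -name 'tmp.*' -size 0 -print
printf 'wwwoffled: fetch ok\nwwwoffled: terminated with signal 11\n' | crash_lines
```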

> This might have been fixed:
> for an already cached URL, do the following while offline:
> $ wwwoffle URL
> $ wwwoffle-ls outgoing
> will be empty, despite the man page.  Need wwwoffle -F

This is fixed in version 2.7 (I had it on the TODO list when you
raised it, but there wasn't time to get it into the beta version).


> Another problem is newspapers, who often remove items after a
> while... "darn, that used to be in the cache, I wish I didn't
> [request] it again, now it has been replaced by a membership form,
> etc. [not 404]."
> 
> OK, this brings up the concept like emacs' "the buffer has shrunk a
> lot, not autosaving" ... how to get back the previous version of
> something now ruined... hey, google has a "cache, in case that site is
> down", why not wwwoffle also have a "cache, in case that site is
> sideways, but not down" i.e. at the bottom of each retrieved web page,
> also have [previous] or [backup] to click on.  Say just save one copy
> for starters... and even throw it away after 7 days or read once and
> not clicked or something.
> 
> Yes, say for instance my pals at www.sunriver.com.tw .  It was a
> beautiful website until they misconfigured it... now it looks like an
> ftp directory [not 404].  I will call them in the morning.  The moment
> I see it thru wwwoffle that means the old version is now gone for
> good, only memories in my brain.  I wish there was some way to bring
> that page back [ok, maybe the google cache has it] ... anyways, it
> would be neat if somehow wwwoffle could give us a 2nd chance.
> 
> Notice that I'm talking about when something has changed and it is
> already too late to do anything about it when you notice it.

We had this discussion a long time ago, although then we talked about
archiving sites or keeping a history of sites.


> In lynx at least, when the user hits the following buttons, and the
> given program has not been installed, etc. at least WWWOFFLE should
> say something about "tell the administrator to install it", instead of
> drawing blanks like i did when i hit the Namazu button [is this new
> guy better than ht/dig...?]

OK, fixed for the next version (sort of).  If the search script
returns an error (e.g. it is not installed) then a WWWOFFLE error page
is returned instead of a blank page.  Not perfect, but better than
nothing.


> WWWOFFLE Cache Searching
>                         
>    With the installation of any of the programs ht://Dig, mnoGoSearch (UdmSearch)
>    or Namazu it is possible to search the pages in the WWWOFFLE cache.
> 
>    /search/htdig/search.html
>           The ht://Dig search form.
> 
>    /search/mnogosearch/search.html
>           The mnoGoSearch (UdmSearch) search form.
> 
>    /search/namazu/search.html
>           The Namazu search form.
> 
> Also it still really bugs me that you use absolute filenames
> instead of "search/namazu/search.html" with no slash in front, but I
> remember you already answered that.

I don't remember the answer, but I am sure it was a good one :-)

-- 
Andrew.
----------------------------------------------------------------------
Andrew M. Bishop                             [EMAIL PROTECTED]
                                      http://www.gedanken.demon.co.uk/

WWWOFFLE users page:
        http://www.gedanken.demon.co.uk/wwwoffle/version-2.7/user.html
