[I'm still using the 2.7 beta.]  Here are accumulated comments:

[Sometimes I request .tgz / .tar.gz files, but they end up on disk as
uncompressed tar files still named .tgz / .tar.gz... I didn't explore
further whether this is Netscape's or WWWOFFLE's problem.]
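A quick way to tell the two cases apart (a plain-shell sketch; the
file names are made up for the demo): a gzip stream always starts with
the magic bytes 1f 8b, while a bare tar file does not.

```shell
check_gzip() {
  # A gzip stream starts with the two magic bytes 1f 8b.
  magic=$(head -c2 "$1" | od -An -tx1 | tr -d ' \n')
  if [ "$magic" = "1f8b" ]; then echo "$1: gzip"; else echo "$1: not gzip"; fi
}

# Demo with two synthetic files standing in for real downloads:
printf '\037\213 rest of stream' > real.tgz   # begins with the gzip magic
printf 'plain tar bytes'         > fake.tgz   # no magic, like the bad downloads
check_gzip real.tgz   # real.tgz: gzip
check_gzip fake.tgz   # fake.tgz: not gzip
```

Running it on one of the suspect downloads would at least show whether
the bytes on disk were ever compressed.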

Hmmmm, no way to add to the monitored items without starting the web
interface.  One can add to the outgoing items from the command line,
but not to the monitored items.  Also, there is no way to get a report
on the monitor parameters with wwwoffle-ls etc.; one must start the
interface.

Indeed, I want to add about 20 pages to the monitor list from the
command line, all with the same parameters.  Frustration: the user
must do these one by one from the window.  Ray of hope: I will hack
wwwoffle/monitored/* by hand.  Ray of despair: it looks too
complicated.

By the way, before I connect, I like to see all the things we are
about to fetch.  I can do wwwoffle-ls outgoing and wwwoffle-ls
monitored; however, there is no way to list only the monitored items
that will be fetched, omitting the ones that won't.

Just playing around:
# wwwoffle-hash http://groups.google.com/
Segmentation fault

I notice there are lots of 0-byte files accruing, e.g.
/var/spool/wwwoffle/temp/tmp.25635 ...
about 20 since December.
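Until the daemon cleans these up itself, here is a housekeeping sketch
(shown against a demo directory so it is safe to try; substitute
/var/spool/wwwoffle/temp for real use):

```shell
spool=demo-spool            # stand-in for /var/spool/wwwoffle/temp
mkdir -p "$spool"
: > "$spool/tmp.25635"      # a stray zero-byte file like the ones above

# Delete zero-byte tmp.* leftovers (add -mtime +1 for real use, to
# spare any file the daemon may still be writing):
find "$spool" -maxdepth 1 -name 'tmp.*' -type f -size 0 -delete
ls "$spool"                 # prints nothing: the stray file is gone
```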

This might have been fixed: for an already cached URL, do, while
offline:
$ wwwoffle URL
$ wwwoffle-ls outgoing
The listing will be empty, despite what the man page says.  One needs
wwwoffle -F.

-----------
Another problem is newspapers, which often remove items after a
while... "Darn, that used to be in the cache; I wish I hadn't
requested it again, now it has been replaced by a membership form,
etc. [not a 404]."

OK, this brings up something like Emacs' "buffer has shrunk a lot, not
autosaving" safeguard... how to get back the previous version of
something now ruined.  Hey, Google has a "cache, in case that site is
down"; why shouldn't WWWOFFLE also have a "cache, in case that site is
sideways, but not down"?  I.e., at the bottom of each retrieved web
page, also have a [previous] or [backup] link to click on.  Say just
save one copy for starters... and even throw it away after 7 days, or
after it has been read once without the link being clicked, or
something.
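The policy sketched above, in plain shell terms (an illustration only,
not wwwoffle code; all file and directory names here are made up):
keep exactly one previous copy of each cached page, and expire backups
after 7 days.

```shell
save_with_backup() {
  dst=$1 src=$2
  [ -f "$dst" ] && cp -p "$dst" "$dst.bak"   # keep one previous version
  mv "$src" "$dst"
}

# Demo: a cached page gets replaced by a "ruined" new copy.
mkdir -p demo-cache
echo "old page" > demo-cache/index.html
echo "membership form" > new-copy.html
save_with_backup demo-cache/index.html new-copy.html

cat demo-cache/index.html.bak                     # the second chance
find demo-cache -name '*.bak' -mtime +7 -delete   # 7-day expiry
```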

Yes, take for instance my pals at www.sunriver.com.tw.  It was a
beautiful website until they misconfigured it... now it looks like an
ftp directory [not a 404].  I will call them in the morning.  The
moment I see it through wwwoffle, the old version is gone for good,
only memories in my brain.  I wish there were some way to bring that
page back [OK, maybe the Google cache has it]... anyway, it would be
neat if somehow WWWOFFLE could give us a second chance.

Notice that I'm talking about the case where something has changed and
it is already too late to do anything about it by the time you notice.
==========
In lynx at least, when the user hits one of the following buttons and
the given program has not been installed, WWWOFFLE should at least say
something like "tell the administrator to install it", instead of
drawing a blank like I did when I hit the Namazu button [is this new
guy better than ht://Dig...?]

WWWOFFLE Cache Searching
                        
   With the installation of any of the programs ht://Dig or mnoGoSearch (UdmSearch) or
   Namazu it is possible to search the pages in the WWWOFFLE cache.

   /search/htdig/search.html
          The ht://Dig search form.

   /search/mnogosearch/search.html
          The mnoGoSearch (UdmSearch) search form.

   /search/namazu/search.html
          The Namazu search form.

Also, it still really bugs me that you use absolute filenames instead
of relative ones like "search/namazu/search.html" with no slash in
front, but I remember you already answered that.
-- 
http://www.geocities.com/jidanni/ Taiwan(04)25854780
