hi,
to work around the problem of losing pages when the disk is full,
i am parsing the fetch output and trying to re-fetch the failing ones:
awk '/Internal Error/ { print $3 }' ~/fetch.log-3.24 | xargs wwwoffle
when doing the above in online mode, NOTHING gets fetched,
even though wwwoffle claims to be getting the pages.
in offline mode urls do get added to the outgoing dir,
but they are the wrong ones. when running wwwoffle -fetch,
it outputs the following:
Cannot fetch http://localhost:8080/refresh/?http://www.slashdot.org/ [Error with refresh URL]
i ended up using wget to get the pages:
awk '/Internal Error/ { print $3 }' ~/fetch.log-3.24 | xargs wget -p
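for anyone who wants to try the same workaround, here is the pipeline as a small self-contained sketch. the log format (URL in the third field of "Internal Error" lines) is an assumption based on my fetch.log-3.24, and the sample log below is made up for illustration; adjust the pattern for your own log:

```shell
#!/bin/sh
# sketch: pull the URLs of failed fetches out of a wwwoffle fetch log
# and hand them to wget. the "Internal Error" line layout (URL in the
# third whitespace-separated field) is an assumption from my own log.
log=/tmp/sample-fetch.log

# fabricated sample log, just so the sketch runs standalone
cat > "$log" <<'EOF'
Internal Error fetching http://www.slashdot.org/
Fetched http://www.example.org/
Internal Error fetching http://www.tuwien.ac.at/
EOF

# print only the URL field of the failing lines
urls=$(awk '/Internal Error/ { print $3 }' "$log")
echo "$urls"

# re-fetch with page requisites (-p); commented out here so the
# sketch does not hit the network:
# echo "$urls" | xargs wget -p
```

with the two "Internal Error" lines above, this prints the slashdot and tuwien URLs and skips the successful fetch.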
just upgraded to 2.7a...
greetings, martin.
--
i am looking for a job anywhere in the world, doing pike programming,
caudium/pike/roxen training, roxen/caudium and/or unix system administration.
--
pike programmer Traveling in Singapore (www|db).hb2.tuwien.ac.at
unix (iaeste|bahai).or.at (www.archlab|iaeste).tuwien.ac.at
systemadministrator (stuts|black.linux-m68k).org mud.at is.(schon.org|root.at)
Martin Bähr http://www.iaeste.or.at/~mbaehr/