On Sat, 18 Jul 2009, Andrew Brampton wrote:
Date: Sat, 18 Jul 2009 18:09:54 +0100
From: Andrew Brampton brampton+free...@gmail.com
To: Joe R. Jah j...@cloud.ccsf.cc.ca.us
Cc: freebsd-questions@freebsd.org
Subject: Re: OT: wget bug
2009/7/18 Joe R. Jah j...@cloud.ccsf.cc.ca.us:
Thank you
On Sat, 18 Jul 2009, Karl Vogel wrote:
Date: Sat, 18 Jul 2009 19:34:24 -0400 (EDT)
From: Karl Vogel vogelke+u...@pobox.com
To: freebsd-questions@freebsd.org
Subject: Re: OT: wget bug
On Sat, 18 Jul 2009 09:41:00 -0700 (PDT),
Joe R. Jah j...@cloud.ccsf.cc.ca.us said:
J Do you know
2009/7/17 Joe R. Jah j...@cloud.ccsf.cc.ca.us:
Hello all,
I want to wget a site at regular intervals and only get the updated pages,
so I use this wget command line:
wget -b -m -nH http://host.domain/Directory/file.html
It works fine on the first try, but it fails on subsequent tries
On Sat, 18 Jul 2009, Andrew Brampton wrote:
Date: Sat, 18 Jul 2009 12:52:07 +0100
From: Andrew Brampton brampton+free...@gmail.com
To: Joe R. Jah j...@cloud.ccsf.cc.ca.us
Cc: freebsd-questions@freebsd.org
Subject: Re: OT: wget bug
2009/7/17 Joe R. Jah j...@cloud.ccsf.cc.ca.us:
Hello
2009/7/18 Joe R. Jah j...@cloud.ccsf.cc.ca.us:
Thank you Andrew. Yes the server is truly returning 401. I have already
reconfigured wget to download everything regardless of timestamps,
but it's a waste of bandwidth, because most of the site is unchanged.
Do you know of any workaround
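(One possible workaround, not suggested in the thread itself, applies only if the 401 is a genuine authentication challenge from the server: wget can keep mirroring with timestamping while sending credentials. USER and PASS are placeholders, not values from this thread.)

```shell
# Hedged sketch, assuming the 401 is a real auth challenge:
# keep Joe's original flags and add HTTP Basic credentials.
# USER/PASS are placeholders; the URL is the thread's placeholder host.
wget -b -m -nH --http-user=USER --http-password=PASS \
    http://host.domain/Directory/file.html
```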
On Sat, 18 Jul 2009 09:41:00 -0700 (PDT),
Joe R. Jah j...@cloud.ccsf.cc.ca.us said:
J Do you know of any workaround in wget, or an alternative tool to ONLY
J download newer files by http?
curl can help for things like this. For example, if you're getting
just a few files, fetch only
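(Karl's message is truncated here. His curl suggestion presumably refers to curl's `-z`/`--time-cond` option, which fetches a file only when the remote copy is newer than the named local file; the exact command below is a reconstruction, not his original text.)

```shell
# curl -z FILE sends a time-conditional request based on FILE's mtime,
# so the body is downloaded only when the server copy is newer.
# The URL is the thread's placeholder; run one such line per file from cron.
curl -z file.html -o file.html http://host.domain/Directory/file.html
```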
Hello all,
I want to wget a site at regular intervals and only get the updated pages,
so I use this wget command line:
wget -b -m -nH http://host.domain/Directory/file.html
It works fine on the first try, but it fails on subsequent tries with the
following error message:
--8<--
Connecting
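(For reference, `-m` in Joe's command is shorthand: per the wget manual, `--mirror` turns on recursion plus timestamping, and the timestamping part is what makes repeat runs conditional. Spelled out, with the same placeholder URL:)

```shell
# Joe's command with -m expanded: --mirror is shorthand for
# "-r -N -l inf --no-remove-listing". The -N (timestamping) part
# is what performs the conditional check on repeat runs, and is
# where the failure on subsequent tries originates.
wget -b -r -N -l inf --no-remove-listing -nH http://host.domain/Directory/file.html
```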
On Fri, 17 Jul 2009, John Nielsen wrote:
Date: Fri, 17 Jul 2009 18:52:46 -0400
From: John Nielsen li...@jnielsen.net
To: freebsd-questions@freebsd.org
Cc: Joe R. Jah j...@cloud.ccsf.cc.ca.us
Subject: Re: OT: wget bug
On Friday 17 July 2009 06:12:33 pm Joe R. Jah wrote:
I want to wget