Yes, that's what it means.
I'm not yet committed to doing this. I'd like to see first how many
mainstream servers will respect If-Modified-Since when given as part of
an HTTP/1.0 request (in comparison to how they respond when it's part of
an HTTP/1.1 request).
This means we should remove the previous HEAD request code, use
If-Modified-Since by default, have it handle all the requests, and
store pages when the response is not a 304.
Is it so?
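The conditional-GET flow discussed here can be sketched in Python (the helper names are mine, not wget's): send If-Modified-Since built from the local file's mtime, keep the local copy on a 304, and otherwise store the response unless Last-Modified shows the remote file is not newer.

```python
from email.utils import formatdate, parsedate_to_datetime

def if_modified_since_value(local_mtime):
    """Format a Unix mtime as an RFC 1123 date for If-Modified-Since."""
    return formatdate(local_mtime, usegmt=True)

def should_store(status, local_mtime, last_modified=None):
    """Keep the local copy on 304; otherwise store unless the server's
    Last-Modified shows the remote file is not newer."""
    if status == 304:
        return False
    if last_modified is None:
        return True  # no usable timestamp: store the response
    return parsedate_to_datetime(last_modified).timestamp() > local_mtime
```

This is only a sketch of the decision rule; wget's real logic also weighs sizes and the various --ignore-* options.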
On Fri, Aug 29, 2008 at 11:06 PM, Micah Cowan [EMAIL PROTECTED] wrote:
Follow-up Comment #4, bug
Don't know if that's a known issue, but if I want to use timestamping (-N)
and --output-document at the same time, the file is downloaded every time.
So it seems the size and time of the output document are not used for
comparison, which is a pity. This is GNU Wget 1.9.1.
From: purp
Don't know if that's a known issue, [...]
Try the Search feature at:
http://www.mail-archive.com/wget@sunsite.dk/
For example:
http://www.mail-archive.com/search?q=%22-O%22+%22-N%22[EMAIL PROTECTED]
where you can see several previous similar complaints, and the
For the record:
http://www.mail-archive.com/search?q=%22-O%22+%22-N%22[EMAIL PROTECTED]
was actually more like:
http://www.mail-archive.com/search?q=%22-O%22+%22-N%22l= wget at
sunsite.dk
before it got PROTECTED.
SMS.
Thanks to SourceForge; here is a URL from sourceforge.net:
http://images.sourceforge.net/icons/silk/feed.png
has a Last-Modified header of 'Tue, 05 Dec 2006 19:10:40 GMT'
consider these two commands:
1.
wget -N -O dir/feed.png \
http://images.sourceforge.net/icons/silk/feed.png
wget
On 4/23/07, Tony Lewis [EMAIL PROTECTED] wrote:
n g wrote:
wget url -O dir/name -N
would download the same file every run.
while
wget url -O name -N
works as expected.
timestamping compares the timestamp on the local file with the timestamp on
the server. When you use -O
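The behaviour reported in this thread can be sketched as follows (the helper is mine, not wget's source): with -N, the name wget stats for the timestamp comparison is derived from the URL's last path component, not from the -O argument, so `-O dir/feed.png` checks a file that is never written.

```python
import os.path
from urllib.parse import urlparse

def name_checked_for_timestamping(url):
    """-N derives the comparison filename from the URL's last path
    component, ignoring any -O target (the behaviour discussed above)."""
    return os.path.basename(urlparse(url).path)

url = "http://images.sourceforge.net/icons/silk/feed.png"
# With -O dir/feed.png, wget still stats ./feed.png; since that file
# does not exist, the download is repeated on every run.
checked = name_checked_for_timestamping(url)
```

With plain `-O feed.png` the two names happen to coincide, which is why that case appears to "work as expected".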
On 4/23/07, Steven M. Schweda [EMAIL PROTECTED] wrote:
From: n g
another problem about -N option:
No, it's the same problem with the -O option, which does not work
the way you seem to think that it works. If you go to
i guess you are right. it's all about the `-O' option.
wget version=1.10.2
Hello,
I'm using wget 1.10.2 on windows to mirror an ftp directory.
essentially one line:
wget --timestamping ftp://ftp.com/pub/updates/* -o ..\Update.log
A local application then scans through the resulting files for valid
updates.
As the update directory has grown over the years
Dear wget developers.
I'm sure this has been reported before, and I've seen references to
it going back all the way to 2003 but the problem I'm facing is still
there in wget version 1.10.2.
When I turn on --timestamping I expect, as the manual says, that the
timestamps are preserved
From: Remko Scharroo:
Can this be fixed?
Of course it can be fixed, but someone will need to fix it, which
would involve defining the user interface and adding the code to do the
actual time offset. I assume that the user will need to specify the
offset.
For an indication of what could
Hi,
I have tried out the wget alpha under Linux and found that the timestamping
option (which I usually have defined) does not work correctly.
The first thing I saw was that on *every* download I got the line
Remote file is newer, retrieving.
in the output, even when there was no local file
hi
is there a way to use the timestamping feature but to force wget to
output the file to another filename? if i use only timestamping
everything works fine, but if i use another output document
(--output-document) it ignores the timestamps.
i used GNU Wget 1.9.1 from debian sarge.
greets
i've just found this [1] open debian bug report. it has been open since
26 Jul 2003 and is still not corrected in the recent version of wget :-(
greets
KoS
[1] http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=202911
--
Martin Kos +41-76-384-93-33
http://kos.li
hi
i've just posted my comments on the mailinglist [1]. wget doesn't behave
the right way if i use the --output-document option and
--timestamping together. wget tries to compare the url-file with the
original file instead of with the --output-document file.
why i got to this problem
I'm curious. Currently, -O may be used with multiple URLs on the
command line. What would be the right way for this to work with -N?
Steven M. Schweda (+1) 651-699-9818
382 South Warwick Street
Jorge Pereira wrote:
Hi,
When using wget with -N and -O, the comparison is not done to the file
specified with -O, it is done to whatever filename the server replies
with (and it doesn't exist, because it's being written under a different
name, so no time comparison is done).
I searched around and can't seem to
I think --timestamping fails for files > 2GB
wget tries to download the file again with the .1 extension (as if you
were not using --timestamping).
This only happens to a big file in a list of files I am wgetting.
Dan Bolser [EMAIL PROTECTED] writes:
I think --timestamping fails for files > 2GB
Thanks for the report. Wget 1.9.x doesn't support 2+GB files, not
only for timestamping. You can try Wget 1.10-beta from
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-beta1.tar.bz2
On Sat, 28 May 2005, Hrvoje Niksic wrote:
Hello,
I recently installed wget 1.9.1 on Windows 2000 and I have been
experimenting with it. Mostly, it works very well. Many thanks to
the authors! But I did find a problem with the -N option
(timestamping) when mirroring an FTP site that uses the MUSIC/SP ftp
server. Wget uses the info from
equivalent to -r -N
-l inf -nr.
Isn't --timestamping missing in this equivalence list? ... Oh, I see,
-N is --timestamping.
Grr, I believe it should be mentioned in words here for clarity.
You yourself quoted that part of the manual which mentions it in words:
This option turns on recursion
I try to update a file with the following command:
***
wget -N --proxy=off --cache=off http://172.17.8.14/etrust/vet.dat -O
c:\temp\kkk\xx.dat
--11:17:01-- http://172.17.8.14/etrust/vet.dat
=> `c:/temp/kkk/xx.dat'
Connecting to 172.17.8.14:80... connected.
HTTP request sent, awaiting
Hello. When using '--timestamping --retr-symlinks', the time-stamping
logic seems to compare the date of the symbolic link with the local
copy of the file. But wget will download the real file, and set the
local date to the same as the real file. So if the symbolic link, on
the server, has
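The distinction at issue can be shown with Python's stat calls (a local demonstration, not wget code): lstat() reports the symbolic link's own date, while stat() follows the link to the target, which is what the downloaded copy's date ends up matching.

```python
import os
import tempfile

# Build a target file with a backdated mtime and a fresh symlink to it.
with tempfile.TemporaryDirectory() as d:
    target = os.path.join(d, "real.dat")
    link = os.path.join(d, "real.lnk")
    with open(target, "w") as f:
        f.write("data")
    os.symlink(target, link)
    os.utime(target, (1000, 1000))         # backdate the target file
    link_mtime = os.lstat(link).st_mtime   # the link's own (recent) date
    target_mtime = os.stat(link).st_mtime  # follows the link: 1000
```

Comparing the local copy against `link_mtime` rather than `target_mtime` is exactly the mismatch the report describes.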
Rick Goyette [EMAIL PROTECTED] writes:
The local and remote files have different sizes, which I thought
(after reading the man page) should flag wget to grab it. But it
does not.
It should. Do you use HTTP or FTP to get the file? Can you post a
debug log (possibly edited for confidential
that, and
the modification date, which is updated whenever the file is modified.
When I use wget with timestamping (wget -N) and poll the VMS system for
a file, the date returned seems to be the creation date, not the
modification date. A user sets up a run file on Monday afternoon,
which
I've been using wget lately to retrieve mirrors of some web sites.
Recently I discovered the -N option. When I use it, it checks whether the
local files are older than the server files (same filenames), and if the
server file is newer than the local file it overwrites the local one
Hello,
Tsabros Leonidas wrote:
Yes, I believe the option you are looking for is --backup-converted; to
make things easier, -K (make sure it is capital) is an alias for the same
thing.
Craig Sowadski
Hi All,
I'm using Wget 1.8.2 on a Redhat 9.0 box equipped with
Athlon 550 MHz cpu, 128 MB Ram.
I've encountered a strange issue, which really seems like a bug, using the
timestamping option.
I'm trying to retrieve the http://www.nic.it/index.html page.
The HEAD HTTP method returns that page is 2474
1.8.2 timestamping bug
Hi all!
I am not 100% sure why this is so, but it is reproducible on my several
Linux systems. So:
1. Create a new directory and cd to it (mkdir /tmp/mydir && cd /tmp/mydir)
2. Run wget with an ftp site to get a dir (wget --recursive
ftp://ftp.gnu.org/pub/gnu/xinfo*) for example
3. See the time of
DCA This isn't a bug, but the offer of a new feature. The timestamping
DCA feature doesn't quite work for us, as we don't keep just the latest
DCA view of a website and we don't want to copy all those files around for
DCA each update.
Which brings me to mention two features I've been meaning
The other thing more or less is ripped from the Windows DL-manager
FlashGet (but why not). Wouldn't it be useful if wget retrieved a file
to a temporary filename, for instance with the extension .wg! or
something, and renamed it back to the original name after finishing? Two
advantages
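The FlashGet-style idea above can be sketched like this (the helper name and the .wg! suffix are illustrative, not wget behaviour): download into a temporary name in the target directory and rename only on success, so an interrupted transfer never clobbers a finished file.

```python
import os
import tempfile

def store_atomically(path, data):
    """Write to a temporary sibling file, then rename over the final
    name only once the write has completed (os.replace is atomic on
    POSIX when both names are on the same filesystem)."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".",
                               suffix=".wg!")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        os.replace(tmp, path)
    except BaseException:
        os.unlink(tmp)  # clean up the partial file on any failure
        raise
```

Keeping the temporary in the same directory matters: a cross-filesystem rename would not be atomic.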
This isn't a bug, but the offer of a new feature. The timestamping
feature doesn't quite work for us, as we don't keep just the latest
view of a website and we don't want to copy all those files around for
each update.
So I implemented a --changed-since=mmdd[hhmm] flag to only get
files
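The patch itself was not posted, but a value in the proposed mmdd[hhmm] form could plausibly be parsed along these lines (my guess at the intent; the helper is hypothetical):

```python
from datetime import datetime

def parse_changed_since(value, year):
    """Parse a --changed-since value of the form mmdd or mmddhhmm;
    the year is supplied by the caller (e.g. the current year)."""
    fmt = "%m%d%H%M" if len(value) == 8 else "%m%d"
    return datetime.strptime(value, fmt).replace(year=year)
```

Files whose timestamps fall before the parsed cutoff would then simply be skipped.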
Unfortunately, this bug is not easy to fix. The problem is that `-O'
was originally invented for streaming, i.e. for `-O -'. As a result,
many places in Wget's code assume that they can freely operate on the
file names, and -O seems more like an afterthought.
On the other hand, many people
Hi,
I am forwarding to you Debian bug 88176.
(http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=88176&repeatmerged=yes)
I can reproduce the problem with 1.8.1
The following transcript shows that wget can do the Bad Thing with
-O when timestamping.
It can result in a 0-byte file
On 1 Feb 2002 at 8:17, Daniel Stenberg wrote:
You may count this mail as advocating for HTTP 1.1 support, yes! ;-)
I did write down some minimal requirements for HTTP/1.1 support on
a scrap of paper recently. It's probably still buried under the
more recent strata of crap on my desk somewhere!
On Fri, 1 Feb 2002, Ian Abbott wrote:
The proper action (IMHO) would be to use a true HTTP/1.1 request and
thus most likely receive a chunked transfer-encoded data stream back,
Does PHP do that?
PHP does that. With the help of Apache of course.
Surely it wouldn't be much difference, as
Hi,
If I understand timestamping correctly, wget will look at
the Content-Length header, and if the length is different from
the local copy's, wget will re-get the web page even if
the remote file is older than or the same age as the local copy.
The problem is that my web pages are served up by PHP
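The rule as the poster understands it can be sketched as (a hypothetical helper, not wget source): re-get when the sizes differ, but only if the server actually sent a Content-Length; otherwise fall back to the timestamp alone, which is roughly what --ignore-length forces.

```python
import os

def needs_refetch(local_path, remote_mtime, remote_size=None):
    """Re-get when sizes differ (if the server sent a Content-Length)
    or when the remote copy is newer than the local one."""
    st = os.stat(local_path)
    if remote_size is not None and st.st_size != remote_size:
        return True  # size mismatch wins even if the remote is older
    return remote_mtime > st.st_mtime
```

Dynamically generated pages with no Content-Length thus reduce to a pure timestamp comparison.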
On 31 Jan 2002 at 8:41, Bruce BrackBill wrote:
The problem is, that my web pages are served up by php
and the content length is not defined. So as the manual states,
I use --ignore-length. But when wget retrieves an image
it slows right down, possibly because it is ignoring
the
From: Ian Abbott [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
CC: Bruce BrackBill [EMAIL PROTECTED]
Subject: Re: timestamping content-length --ignore-length
Date: Thu, 31 Jan 2002 17:24:11 -
On 31 Jan 2002 at 8:41, Bruce BrackBill wrote:
The problem is, that my web pages are served up by php
sending me a content_length i'm just going to download it
again anyway :-). According to the manual (as I read it),
wget should ALWAYS reget the file if it has an empty content
length ( even though this is undesirable behavior ).
Sorry I ignored the timestamping part of your question. My
On Thu, 31 Jan 2002, Ian Abbott wrote:
The problem is, that my web pages are served up by php and the content
length is not defined. So as the manual states I use --ignore-length.
But when wget retrieves an image it slows right down, possibly because it
is ignoring the content-length.
On 11 Dec 2001 at 18:40, [EMAIL PROTECTED] wrote:
It seems to me that if an output_document is specified, it is being
clobbered at the very beginning (unless always_rest is true). Later in
http_loop stat() comes up with zero length. Hence there's always a size
mismatch when --output-document
[EMAIL PROTECTED] writes:
But it's as documented in the man page. The option is meant for
concatenating several pages into one big file, and you can't
meaningfully compare timestamps or file sizes in that case.
Ah, so this behaviour is by design. Even so, the behaviour is
slightly
Hrvoje == Hrvoje Niksic [EMAIL PROTECTED] writes:
On 11/12/2001 14:03:54 Adrian Aichner wrote:
Hi Wgeteers!
Is
-N, --timestamping don't retrieve files if older than local.
supposed to work on windows 2000?
[snip]
cd c:\Hacking\SunSITE.dk\xemacsweb\Download\win32\
%TEMP%\wget.wip\src\wget.exe --debug --timestamping
--output
Suppose I have page a.html, which has a link to b.html. If a is not
changed but b is changed, then when I process a, I have no way to check a so
that I can process b too without downloading a. -N will cause a not to be
downloaded, but not processed either, so the change of b will be ignored. If I
will
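One workaround for the situation described above, sketched here (this is not wget behaviour), is to re-scan the already-downloaded local copy for links when the server copy is unchanged, so recursion still descends into b.html:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets so an unchanged local page can still be
    re-scanned for links during a recursive, timestamped mirror."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

parser = LinkCollector()
parser.feed('<html><body><a href="b.html">b</a></body></html>')
```

Each collected link would then get its own -N-style check, independent of whether the referring page was re-downloaded.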
On 4 Aug 2001, at 3:25, Bao, Jiangcheng wrote:
, and thus missed the
fact that some other pages being linked by index.html might have been
changed.
Am I right or am I wrong? Thanks.
Date: Sat, 4 Aug 2001 17:15:45 +0100
From: Ian Abbott [EMAIL PROTECTED]
To: [EMAIL PROTECTED], Bao, Jiangcheng [EMAIL PROTECTED]
Subject: Re: wget timestamping
Say I have an index.html which is not changed, but some of the pages
linked from this page might be changed. When I use the -N option to retrieve
index.html recursively, wget will quit after finding out that index.html is
not changed, without following the URLs in index.html, and thus missed the
At 07:11 PM 8/4/01 -0500, Mengmeng Zhang wrote: