Re: profiling with wget

2003-11-14 Thread Robert Parks
I'm sorry I wasn't more specific. I'm just interested in getting
more accurate timing data than wget gives, for example, data
retrieval times that have better resolution than GetSystemTime().
I also want the option of not writing the retrieved data to a file, or
at least removing that from the retrieval time measurement.
Thanks,

Robert Parks
[EMAIL PROTECTED]
On Friday, November 14, 2003, at 02:39  PM, Hrvoje Niksic wrote:

"Robert Parks" <[EMAIL PROTECTED]> writes:

I am looking for a tool like wget, that will report profiling data,
rather than only regular time measurement. Are there any tools out
there have that sort of functionality?
What kind of profiling data are you referring to, exactly?

If not, is there any interest in adding this sort of feature to
wget? Or is that completely out of the scope of what people want
wget to do?
It might be, but it's hard to say for sure without a more complete
description.




Re: profiling with wget

2003-11-14 Thread Hrvoje Niksic
"Robert Parks" <[EMAIL PROTECTED]> writes:

> I'm sorry I wasn't more specific. I'm just interested in getting
> more accurate timing data than wget gives, for example, data
> retrieval times that have better resolution than GetSystemTime().

I assume you're using Windows?  If there is a way to get better
resolution than GetSystemTime, I'm definitely interested.

> I also want the option of not writing the retrieved data to a file,
> or at least removing that from the retrieval time measurement.

On Unix, you can use `-O /dev/null' to avoid writes to disk.  (The
application is still writing to an output stream, but the data is lost
in a black hole.)  I'm not sure if there's an equivalent under
Windows.



Re: profiling with wget

2003-11-14 Thread Hrvoje Niksic
"Robert Parks" <[EMAIL PROTECTED]> writes:

> I am looking for a tool like wget that will report profiling data,
> rather than only regular time measurement. Are there any tools out
> there that have that sort of functionality?

What kind of profiling data are you referring to, exactly?

> If not, is there any interest in adding this sort of feature to
> wget? Or is that completely out of the scope of what people want
> wget to do?

It might be, but it's hard to say for sure without a more complete
description.



profiling with wget

2003-11-14 Thread Robert Parks
I am looking for a tool like wget that will report profiling data,
rather than only regular time measurement. Are there any tools out there
that have that sort of functionality?
If not, is there any interest in adding this sort of feature to
wget? Or is that completely out of the scope of what people want wget
to do?
Please cc me on any replies as I am not on the mailing list.

Thanks,

Robert Parks
FileMaker, Inc.
[EMAIL PROTECTED]



Re: windows devel binary

2003-11-14 Thread Hrvoje Niksic
Herold Heiko <[EMAIL PROTECTED]> writes:

> Windows MSVC binary for current cvs at http://xoomer.virgilio.it/hherold/
>
> This is a bit of a "Doctor, if I do this it hurts. - So don't do
> that!", but I think this should not happen:
>
> D:\Wip\Wget\wget.wip\src>wget -4
> Assertion failed: 0 <= comind && comind < countof (commands), file init.c,
> line 589

You're right, it shouldn't.  (It happens only when IPv6 is disabled,
that's why I didn't see it.)

> This is a binary compiled and run on Windows NT 4, which doesn't support
> IPv6, so the -4 should probably be a no-op?

Or not work at all.

> As I said, this shouldn't really be done, IPv6 code is in development
> and so on, so this is just an FYI.

Thanks for the report.  This patch should fix it:


2003-11-14  Hrvoje Niksic  <[EMAIL PROTECTED]>

* main.c: Enable -4 and -6 only if IPv6 is enabled.

Index: src/main.c
===================================================================
RCS file: /pack/anoncvs/wget/src/main.c,v
retrieving revision 1.100
diff -u -r1.100 main.c
--- src/main.c  2003/11/11 21:48:35 1.100
+++ src/main.c  2003/11/14 15:08:07
@@ -194,8 +194,10 @@
 { "ignore-length", 0, OPT_BOOLEAN, "ignorelength", -1 },
 { "ignore-tags", 0, OPT_VALUE, "ignoretags", -1 },
 { "include-directories", 'I', OPT_VALUE, "includedirectories", -1 },
+#ifdef ENABLE_IPV6
 { "inet4-only", '4', OPT_BOOLEAN, "inet4only", -1 },
 { "inet6-only", '6', OPT_BOOLEAN, "inet6only", -1 },
+#endif
 { "input-file", 'i', OPT_VALUE, "input", -1 },
 { "keep-session-cookies", 0, OPT_BOOLEAN, "keepsessioncookies", -1 },
 { "level", 'l', OPT_VALUE, "reclevel", -1 },
@@ -455,10 +457,12 @@
--dns-cache=off   disable caching DNS lookups.\n"),
 N_("\
--restrict-file-names=OS  restrict chars in file names to ones OS allows.\n"),
+#ifdef ENABLE_IPV6
 N_("\
   -4,  --inet4-only  connect only to IPv4 addresses.\n"),
 N_("\
   -6,  --inet6-only  connect only to IPv6 addresses.\n"),
+#endif
 "\n",
 
 N_("\


windows devel binary

2003-11-14 Thread Herold Heiko
Windows MSVC binary for current cvs at http://xoomer.virgilio.it/hherold/

This is a bit of a "Doctor, if I do this it hurts. - So don't do that!", but
I think this should not happen:

D:\Wip\Wget\wget.wip\src>wget -4
Assertion failed: 0 <= comind && comind < countof (commands), file init.c,
line 589

This is a binary compiled and run on Windows NT 4, which doesn't support
IPv6, so the -4 should probably be a no-op?  As I said, this shouldn't really
be done, IPv6 code is in development and so on, so this is just an FYI.

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: convert links problem

2003-11-14 Thread McComber . DM
Yes, I can install it in my home dir but I'd get in a lot of trouble.  It's
a government server and is tightly controlled.  But my need is legitimate so
the upgrade request should be approved.  I'll post a follow up after the
upgrade.

Thanks for all your help,
Doug

-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Friday, November 14, 2003 9:35 AM
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: Re: convert links problem


[EMAIL PROTECTED] writes:

> Thank you for your reply.  Yes the gif file was downloaded.

Then this sounds like a bug.  Many bugs have been fixed in the several
years between the 1.7 and the 1.9.1 release, so it's all the more
reason to try the new one.

> I am not the administrator of the server wget runs from so all I can
> do is ask that it be upgraded.

You don't need to be the administrator to upgrade Wget; you can
compile it and install it in your home directory.  Simply download the
latest source, unpack it, and run `./configure --prefix=$HOME/wget-test'
followed by `make' and `make install'.  As a result, the latest Wget
will be available in ~/wget-test/bin.

> If you don't mind I will use your response as my justification.  Are
> you one of the developers?

I'm the maintainer, but my response is not a particularly good
justification because I don't even know if it will fix your problem.
But upgrade is a good idea anyway because it's fairly easy and because
a lot of other bugs have been fixed as well.


RE: Wget 1.9.1 has been released

2003-11-14 Thread Herold Heiko
Windows MSVC binary at http://xoomer.virgilio.it/hherold
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

> -Original Message-
> From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
> Sent: Friday, November 14, 2003 2:55 AM
> To: [EMAIL PROTECTED]
> Subject: Wget 1.9.1 has been released
> 
> 
> Wget 1.9.1 is now available on ftp.gnu.org and its mirrors.  It is a
> bugfix release that contains fixes for several problems noted in the
> 1.9 release.  Unless further serious bugs are discovered, it is likely
> to remain the last in the 1.9.x series.
> 


Re: convert links problem

2003-11-14 Thread Hrvoje Niksic
[EMAIL PROTECTED] writes:

> Thank you for your reply.  Yes the gif file was downloaded.

Then this sounds like a bug.  Many bugs have been fixed in the several
years between the 1.7 and the 1.9.1 release, so it's all the more
reason to try the new one.

> I am not the administrator of the server wget runs from so all I can
> do is ask that it be upgraded.

You don't need to be the administrator to upgrade Wget; you can
compile it and install it in your home directory.  Simply download the
latest source, unpack it, and run `./configure --prefix=$HOME/wget-test'
followed by `make' and `make install'.  As a result, the latest Wget
will be available in ~/wget-test/bin.

> If you don't mind I will use your response as my justification.  Are
> you one of the developers?

I'm the maintainer, but my response is not a particularly good
justification because I don't even know if it will fix your problem.
But upgrade is a good idea anyway because it's fairly easy and because
a lot of other bugs have been fixed as well.


RE: convert links problem

2003-11-14 Thread McComber . DM
Thank you for your reply.  Yes the gif file was downloaded.  All files
required download successfully.  /home/intadm is not in any of the html
files downloaded.  This is why the problem has been so confusing!  I am not
the administrator of the server wget runs from so all I can do is ask that
it be upgraded.  If you don't mind I will use your response as my
justification.  Are you one of the developers?

Thank you,
Doug

-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Thursday, November 13, 2003 11:14 AM
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: Re: convert links problem


[EMAIL PROTECTED] writes:

> This is working well with one major problem.  Links, img tags,
> etc. are not being properly converted.  For example, if there is an
> img tag in the source file such as:
>
> 
>
> Then it is converted to:
>
> <img
src="http://www.somedomain.com/home/intadm/public_html/folder/whatever.gif">

Does Wget download this GIF file?

The link conversion works both ways: the files that are downloaded are
converted to relative references (something like "../foo/bar.gif").
But the files that have not been downloaded are converted to full
links (e.g. "http://server/foo/bar.gif").

> The domain is the source domain, not the domain of the machine
> running wget, and the path, although correct, should start as
> ~intadm... to work.

Wget probably found "/home/intadm" in some of the URLs.  Inspect the
input files and you'll almost certainly find it somewhere.

> Also, when a link to another page on a source document is to a
> different domain as such:
>
> <a href="http://www.thisdomain.com/some/path/file.htm">
>
> it is converted to:
>
> <a href="http://www.somedomain.com/file.htm">

That sounds like a bug.  Wget 1.7 is several years old.  Could you
please try to compile Wget 1.9 and see if it works better?