Unsubscribe me now, otherwise messages will bounce back to you

2003-09-23 Thread lu001



Wget 1.9-beta1 is available for testing

2003-09-23 Thread Hrvoje Niksic
After sitting in CVS for a long time, a beta of Wget 1.9 is
available.  To see what's new since 1.8, check the `NEWS' file in the
distribution.  Get it from:

http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-beta1.tar.gz

Please test it on as many different platforms as possible and in the
places where Wget 1.8.x is currently being used.  I expect this
release to be extremely stable, but no one can guarantee that without
wider testing.  I didn't want to call it "pre1" or "rc1" lest I anger
the Gods.

One important addition scheduled for 1.9 and *not* featured in this
beta is Mauro's set of IPv6 improvements.  When I receive and merge Mauro's
changes, I'll release a new beta.

As always, thanks for your help.



RE: bug maybe?

2003-09-23 Thread Matt Pease
How do I get off this list?  I tried a few times before and
got no response from the server.

thank you-
Matt

> -Original Message-
> From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, September 23, 2003 8:53 PM
> To: Randy Paries
> Cc: [EMAIL PROTECTED]
> Subject: Re: bug maybe?
> 
> 
> "Randy Paries" <[EMAIL PROTECTED]> writes:
> 
> > Not sure if this is a bug or not.
> 
> I guess it could be called a bug, although it's no simple oversight.
> Wget currently doesn't support large files.
> 


Re: bug maybe?

2003-09-23 Thread Hrvoje Niksic
"Randy Paries" <[EMAIL PROTECTED]> writes:

> Not sure if this is a bug or not.

I guess it could be called a bug, although it's no simple oversight.
Wget currently doesn't support large files.



bug maybe?

2003-09-23 Thread Randy Paries
Not sure if this is a bug or not.
 
I cannot get a file over 2GB (I get a 'MAX file Exceeded' error message).
 
This is on a Red Hat 9 box, running GNU Wget 1.8.2.
 
Thanks
Randy


Re: Portable representation of large integers

2003-09-23 Thread Maciej W. Rozycki
On Mon, 22 Sep 2003, Hrvoje Niksic wrote:

> > I doubt any system that does not support off_t does support LFS.
> 
> As I mentioned in the first message, LFS is not the only thing you
> need large values for.  Think download quota or the sum of downloaded
> bytes.  You should be able to specify `--quota=10G' on systems without
> LFS.

 Agreed.

> As for the hassle, remember that Wget caters to systems with far fewer
> features than LFS on a regular basis.  For example, we support
> pre-ANSI C compilers, and libcs without snprintf, strptime or, for that
> matter, basic C89 functions like memcpy or strstr.  So yes, I'd say
> pre-LFS systems are worth the hassle.

 Well, as always with good software, we should take a standard
environment as the guideline for how to write code, and then add specific
arrangements to deal with the various shortcomings of non-standard or simply
old systems.  Autoconf has capabilities to aid in implementing such a
setup, and the shortcomings are typically handled only where there's demand.

> Perhaps a good compromise would be to use off_t for variables whose
> 64-bitness doesn't matter without LFS, and a `large_number_t' typedef
> that points to either `double' or `long long' for others.  Since the
> others are quite rare, printing them won't be a problem in practice,
> just like it's not for VERY_LONG_TYPE right now.

 off_t *must* be used for file offsets, sizes, etc., if you want to be
sane.  For other uses, I suppose an autoconf test can be written to find
the largest available type.

> > And even if it does, it's probably not worth the hassle.  To handle
> > ordinary old systems, you just call:
> >
> > AC_CHECK_TYPE(off_t, long)
> >
> > before calling AC_SYS_LARGEFILE.
> 
> That still doesn't explain how to print off_t on systems that don't
> natively support it.  (Or that do, for that matter.)

 Sure, that's a completely independent problem.  While discovering the
proper format specifier is doable, gettextization makes its use tough.

  Maciej

-- 
+  Maciej W. Rozycki, Technical University of Gdansk, Poland   +
+--+
+e-mail: [EMAIL PROTECTED], PGP key available+



Re: Portable representation of large integers

2003-09-23 Thread DervishD
Hi Hrvoje :)

 * Hrvoje Niksic <[EMAIL PROTECTED]> dixit:
> Using #ifdefs to switch between %d/%lld/%j *is* completely portable,
> but it requires three translations for each message.  The translators
> would feast on my flesh, howling at the moonlight.

Sure ;)) Anyway, it will come up in just a couple of places.  I
mean, it's not like tripling the number of messages...

> Hmm.  How about preprocessing the formats before passing them to
> printf?

That will help with the translation, obviously, but as I said
above, if the number of affected messages is small...

> > That's the best I can do, because when I write portable code, by
> > portable I mean 'according to standards'.  For me that means,
> > in that order: SUSv3, POSIX, C99, C89, stop.  No pre-ANSI and no
> > brain-damaged compilers.
> I understand your position -- it's perfectly valid, especially when
> you have the privilege of working on a system that supports all
> those standards well.  But many people don't, and Wget (along with
> most GNU software of the era) was written to work for them as well. 

Of course.  Moreover, my software is not used by as many users as
wget is.  In fact, wget is the standard for URL retrieval under UNIX
(together with curl).  It's really worth the effort to make it fully
portable and runnable on as many systems as possible.

I like portability, and in fact I try my best to write my
software portably, and that includes, for example, tweaking a bit with
types, not assuming anything, etc., but it doesn't include
badly behaved systems: weird libcs, systems where off_t should be 64
bits but, due to some lazy programmer, is 32 by mistake, etc.  In other
words, I adhere to standards (even old ones if necessary), but I
don't go outside them to support weird systems (pre-ANSI, for
example).  I have that vision of portability mainly because
non-standard systems kept biting me until I started using Linux
(1995 or so...).

> For me, portability is not about adhering to standards,
> it's about making programs work in a wide range of environments,
> some of which differ from yours.

I think that standards are good; that's my opinion.  But I'm with you:
this is not applicable to wget.  You are very kind to support such
a variety of systems.  I wouldn't support them, but fortunately for
wget users, I'm not the maintainer :)) I must confess that lately my
vision of portability has been reduced to 'can compile under Linux,
or at least under a SUSv3 or POSIX system'.  Please note that this is
not arbitrary: just as you want wget to be able to run in a wide
range of environments, I don't want to punish the performance or power of
my software just to make it run on a system I don't feel like
supporting.  Anyway, I'm not radical about this: each piece of software
must be dealt with in its own way.

Just in case you need it, I have the entire ANSI C99 standard, even
with the rationale, so if you don't have it and want to know
something, please feel free to ask.  I'll try to help as much as I
can.
 
> Thanks for your suggestions.

Thanks to you for considering them, and for wget, really.  It saves me a
lot of effort and time when downloading Linux distros :)))

Raúl Núñez de Arenas Coronado

-- 
Linux Registered User 88736
http://www.pleyades.net & http://raul.pleyades.net/


Getting wget to use .listing but not update missing local files

2003-09-23 Thread Kjell Grindalen
Hello

I am using wget to download only new/modified files from a website where I
can't delete files.

I am using wget like this:

wget -m -nv -nH -P /local/file/dir -a /var/adm/messages
ftp://user:[EMAIL PROTECTED]

Everything works great, except that I need to process the files I am getting
locally, and then push them to another processing server.

So I would like to have the processing script delete the local files where
wget is running, and have wget not retrieve those files again, since I
can't delete them at ftp.site.com.

Any advice?  Can I make wget just look at its .listing file?

Mvh
 
Kjell Grindalen
SIO/IT
2259 6863 / 95095311


Re: Portable representation of large integers

2003-09-23 Thread Hrvoje Niksic
DervishD <[EMAIL PROTECTED]> writes:

> Yes, you're right, but... how about using the C99 large integer
> types (intmax_t and family)?

But then I can use `long long' just as well, which is supported by C99
and (I think) required to be at least 64 bits wide.  Portability is
the whole problem, so suggestions that throw portability out the
window aren't telling me anything new.

Using #ifdefs to switch between %d/%lld/%j *is* completely portable,
but it requires three translations for each message.  The translators
would feast on my flesh, howling at the moonlight.

Hmm.  How about preprocessing the formats before passing them to
printf?  For example, always use "%j" in strings, like this:

printf (FORMAT (_("whatever %j\n")), num);

On systems that support %j, FORMAT would be defined to a no-op.
Otherwise, it would be defined to a format_transform function that
converts %j to either %lld or, on systems without long long, %.0f
(in which case double would be used for large quantities).

> That's the best I can do, because when I write portable code, by
> portable I mean 'according to standards'.  For me that means,
> in that order: SUSv3, POSIX, C99, C89, stop.  No pre-ANSI and no
> brain-damaged compilers.

I understand your position -- it's perfectly valid, especially when
you have the privilege of working on a system that supports all those
standards well.  But many people don't, and Wget (along with most GNU
software of the era) was written to work for them as well.  I don't
want to support only POSIX systems for the same reason I don't want to
support only the GNU system or only the Microsoft systems.  For me,
portability is not about adhering to standards, it's about making
programs work in a wide range of environments, some of which differ
from yours.

Thanks for your suggestions.


Re: Portable representation of large integers

2003-09-23 Thread DervishD
Hi Hrvoje :))

 * Hrvoje Niksic <[EMAIL PROTECTED]> dixit:
> There have been several attempts to fix this:
[...]
> * In its own patches, Debian introduced the use of large file APIs and
>   `long long'.  While that's perfectly fine for Debian, it is not
>   portable.  Neither the large file API nor `long long' are
>   universally available, and both need thorough configure checking.

Yes, you're right, but... how about using the C99 large integer types
(intmax_t and family)?  I know they're not universally available, but if
you support obsolete systems, IMHO you should do so only as a last resort.
I mean, use the large file API or intmax_t on every system that supports
it (all systems with a not-so-recent GNU C compiler), long long on
the rest, and long as the last resort.  It's not very complicated to do
in autoshi^H^H^Hconf, and wget would support really huge files on
most systems (those with a GNU C compiler and those with long long
support), and normal files on the rest.

> Of those two issues, choosing and using the numeric type is the hard
> one.  Autoconf helps only to an extent -- even if you define your own
> `large_number_t' typedef, which is either `long' or `long long', the
> question remains how to print that number.

If you use intmax_t, the problem is solved because C99 has a
%-format for it (%jd).  If you use long long and gcc, the same applies
AFAIK.  The problem is that you don't have a universal format for all
those types... You would have to write a printf wrapper or a special
function for printing those numbers.

> 1. Concatenation of adjacent string literals is an ANSI feature and
>would break pre-ANSI compilers.

You're right again, but you can wrap it in a macro and use '##'.
Or you can stop supporting pre-ANSI compilers.  How many users would
this affect (more or less)?

> 2. It breaks gettext.  With translation support, the above code would
>look like this:

Yes, gettext is far from perfect.  The solution is to do the
conditional preprocessing above directly in the code, so you have
three (or more, as needed) printf lines with gettext-legal code in
them:

#ifdef CRAP_SYSTEM
printf(_("Whatever %d\n"), num);
#elif defined A_BIT_BETTER_SYSTEM
printf(_("Whatever %lld\n"), num);
#else
/* We have a good system */
printf(_("Whatever %jd\n"), num);
#endif

Yes, it looks ugly, it's crap, but...

> The bottom line is, I really don't know how to solve this portably.

IMHO, the best solution is to use a big unsigned type for sums,
totals, sizes to print, etc., and use a large file API for reading
and writing large files.  If the large file API doesn't exist, use off_t
anyway.  Provide a function for printing off_t's, and just make sure
that the type for sums is larger than off_t if possible; otherwise
use off_t too (I suppose autoconf can help you with this).

Normally I use 'size_t' or 'uintmax_t' for sizes to print, sums,
totals, etc., and off_t for files, because an off_t, being signed,
will be shorter than the uintmax type for sure, although you can
always use bignum libraries.  The bignum libraries (well, pick out just
the code for additions, for example) will help you with half of the
problem.  The other half, reading/writing large files, must be
provided by the system.  Just define _LARGEFILE64 (or whatever the
macro is called in SUSv3 or POSIX) and use off_t and a small function
for printing them (although off_t is opaque, and if you want full
portability you should not use it directly...).

That's the best I can do, because when I write portable code,
by portable I mean 'according to standards'.  For me that means,
in that order: SUSv3, POSIX, C99, C89, stop.  No pre-ANSI and no
brain-damaged compilers.

> Does anyone know how widely ported software deals with large files?

No idea, sorry :((( And thanks for the work you're doing on
wget; we all benefit from it.

Raúl Núñez de Arenas Coronado

-- 
Linux Registered User 88736
http://www.pleyades.net & http://raul.pleyades.net/