Re: wget does not handle sizes over 2GB

2005-01-12 Thread Mauro Tortonesi
At 14:33 on Wednesday, 12 January 2005, you wrote:
> On Wed, 12 Jan 2005, Wincent Colaiuta wrote:
> > Daniel really needs to do one of two things:
>
> Thanks for telling me what to do.
>
> Your listing wasn't 100% accurate though. Am I not allowed to discuss
> technical solutions for wget if that involves a term from a different Free
> Software project I am involved in? I guess not.
>
> As you can see, in the other four mentions on your list, I did mention
> the other tool to HELP the users who were asking for features wget doesn't
> provide.
>
> When people ask for stuff on other mailing lists and I know wget can do
> them fine, I usually recommend wget to them. I guess that is stupid too.
>
> Perhaps I'll learn all this when I grow older.

please daniel, don't be offended. i think your behaviour has always been 
correct. sometimes it's easy to misunderstand each other when talking on a 
mailing list.

> This is my last mail here on this topic.

yes, please. let's stop it right now.

-- 
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi

University of Ferrara - Dept. of Eng.     http://www.ing.unife.it
Institute of Human & Machine Cognition    http://www.ihmc.us
Deep Space 6 - IPv6 for Linux             http://www.deepspace6.net
Ferrara Linux User Group                  http://www.ferrara.linux.it


Re: wget does not handle sizes over 2GB

2005-01-12 Thread Mauro Tortonesi
At 14:06 on Wednesday, 12 January 2005, Wincent Colaiuta wrote:
> On 11/01/2005, at 17:28, Daniel Stenberg wrote:
> > On Tue, 11 Jan 2005, Leonid wrote:
> >>   curl does not survive losing connection. Since the probability to
> >> lose connection when you download 2Gb+ files is very high even if you
> >> have a fast connection,
> >
> > This mailing list is for wget, not curl. We can talk about what curl
> > does and does not on the curl mailing list.
>
> Here is a list of recent postings to this list by Daniel Stenberg:
>
> 9 January: "Until the situation is changed, I can recommend using curl
> for this kind of transfers. It supports large files on all platforms
> that do."
>
> 1 December: "AFAIK, wget doesn't support it. But curl does:
> curl.haxx.se"
>
> 1 November: "Consider using libcurl"
>
> 1 October: "Until this is implemented, you may find it useful to know
> that curl supports this option"
>
> 10 September: "Allow me to mention that curl groks large files too."
>
> It's very funny that the wget developers have silently tolerated these
> ongoing advertisements for a competing product on the wget list

why shouldn't we have? i don't think "competition" with curl is bad at all! 
ok, curl and wget are two different pieces of software with some common 
features. but this is the open source community, not the closed-source 
software business. if curl has a large number of users, that does not mean 
that wget cannot have a large user community as well, and vice versa.

instead, i think that by exchanging information and helping each other, the 
wget and curl developers can make both of these programs better.

> but the *very first time* someone makes a comment about curl that Daniel
> doesn't like, he leaps in and tries to tell us what the list is and
> isn't for.

i am sure daniel didn't want to be rude, and anyway he has already apologized 
for writing a mail that could be misunderstood.

> For what it's worth, I agree with Leonid. For getting large files or
> files which are likely to require multiple automated retries I've
> always preferred wget.

this is just your personal preference; it does not mean that curl is bad 
software. not at all!

please, let's stop this discussion before it turns into a flame war or an 
unpleasant exchange of personal opinions. this mailing list was created 
to help wget users (and in that sense, daniel has only been giving wget users 
suggestions on how to solve their problems) and to coordinate the development 
effort behind wget. so, let's just use the mailing list for these two 
purposes, ok? ;-)

-- 
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi

University of Ferrara - Dept. of Eng.     http://www.ing.unife.it
Institute of Human & Machine Cognition    http://www.ihmc.us
Deep Space 6 - IPv6 for Linux             http://www.deepspace6.net
Ferrara Linux User Group                  http://www.ferrara.linux.it


Re: wget does not handle sizes over 2GB

2005-01-12 Thread Wincent Colaiuta
On 12/01/2005, at 14:33, Daniel Stenberg wrote:
> On Wed, 12 Jan 2005, Wincent Colaiuta wrote:
>> Daniel really needs to do one of two things:
> Thanks for telling me what to do.
I was just pointing out your hypocrisy because I found it offensive. 
When you told Leonid to shut up, did he write back and sarcastically 
thank you for it? No, he didn't. You could learn something from him. So 
here's another piece of advice for you: grow up.

> Your listing wasn't 100% accurate though. Am I not allowed to discuss 
> technical solutions for wget if that involves a term from a different 
> Free Software project I am involved in? I guess not.
You've entirely missed the point of my post.
You said, "This mailing list is for wget, not curl. We can talk about 
what curl does and does not on the curl mailing list." I merely went 
and grabbed a selection of quotes showing you doing the exact opposite 
(talking about what curl does on the wget mailing list).

This strikes me as hypocritical. Basically, according to your world 
view, you are allowed to promote your competing project on the wget 
mailing list through repeated, unsolicited advertisements, and the wget 
developers are expected to silently tolerate it (which is exactly what 
they've done). On the other hand, nobody is allowed to make any 
comments about curl that you don't like, and if they do, you'll invoke 
a double-standard in which curl can't be discussed on the wget mailing 
list.

> As you can see, in the other four mentions on your list, I did 
> mention the other tool to HELP the users who were asking for features 
> wget doesn't provide.
I don't object to you helping people. I don't even object to you 
mentioning curl on the wget list.

I *strongly* object to you trying to silence the speech and opinions of 
others by invoking a double standard.

> When people ask for stuff on other mailing lists and I know wget can 
> do them fine, I usually recommend wget to them. I guess that is stupid 
> too.
No need to overreact, Daniel. You should obviously recommend the best 
tool for the job.

> Perhaps I'll learn all this when I grow older.
> This is my last mail here on this topic.
Fantastic!


Re: wget does not handle sizes over 2GB

2005-01-12 Thread Daniel Stenberg
On Wed, 12 Jan 2005, Wincent Colaiuta wrote:
> Daniel really needs to do one of two things:
Thanks for telling me what to do.
Your listing wasn't 100% accurate though. Am I not allowed to discuss 
technical solutions for wget if that involves a term from a different Free 
Software project I am involved in? I guess not.

As you can see, in the other four mentions on your list, I did mention the 
other tool to HELP the users who were asking for features wget doesn't 
provide.

When people ask for stuff on other mailing lists and I know wget can do them 
fine, I usually recommend wget to them. I guess that is stupid too.

Perhaps I'll learn all this when I grow older.
This is my last mail here on this topic.
--
 -=- Daniel Stenberg -=- http://daniel.haxx.se -=-
  ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol


Re: wget does not handle sizes over 2GB

2005-01-12 Thread Wincent Colaiuta
On 11/01/2005, at 17:28, Daniel Stenberg wrote:
> On Tue, 11 Jan 2005, Leonid wrote:
>>   curl does not survive losing connection. Since the probability to 
>> lose connection when you download 2Gb+ files is very high even if you 
>> have a fast connection,
> This mailing list is for wget, not curl. We can talk about what curl 
> does and does not on the curl mailing list.
Here is a list of recent postings to this list by Daniel Stenberg:
9 January: "Until the situation is changed, I can recommend using curl 
for this kind of transfers. It supports large files on all platforms 
that do."

1 December: "AFAIK, wget doesn't support it. But curl does: 
curl.haxx.se"

1 November: "Consider using libcurl"
1 October: "Until this is implemented, you may find it useful to know 
that curl supports this option"

10 September: "Allow me to mention that curl groks large files too."
It's very funny that the wget developers have silently tolerated these 
ongoing advertisements for a competing product on the wget list, but 
the *very first time* someone makes a comment about curl that Daniel 
doesn't like, he leaps in and tries to tell us what the list is and 
isn't for. In order to be consistent, Daniel really needs to do one of 
two things: (1) stop plugging curl on the wget list; or (2) stop 
trying to suppress the free speech and opinions of others by enforcing a 
hypocritical double standard about what can and can't be said.

For what it's worth, I agree with Leonid. For getting large files or 
files which are likely to require multiple automated retries I've 
always preferred wget.


Re: wget does not handle sizes over 2GB

2005-01-11 Thread Mauro Tortonesi
Alle 21:47, martedì 11 gennaio 2005, Daniel Stenberg ha scritto:
> On Tue, 11 Jan 2005, Mauro Tortonesi wrote:
> > oh, come on. let's not fall into the "my software is better than yours"
> > childish attitude.
>
> I'm sorry if it came out that way, it was not my intention. I just wanted
> to address the misinformation posted here.
>
> I have not said and do not think that X is better than Y, just different.
>
> And I have contributed to this project several times and I might very well
> continue to do so. I am not just an author of another tool.

i know that, daniel. i just wanted to stop the discussion before it could 
generate a flame war ;-)

-- 
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi

University of Ferrara - Dept. of Eng.     http://www.ing.unife.it
Institute of Human & Machine Cognition    http://www.ihmc.us
Deep Space 6 - IPv6 for Linux             http://www.deepspace6.net
Ferrara Linux User Group                  http://www.ferrara.linux.it


Re: wget does not handle sizes over 2GB

2005-01-11 Thread Daniel Stenberg
On Tue, 11 Jan 2005, Mauro Tortonesi wrote:
> oh, come on. let's not fall into the "my software is better than yours" 
> childish attitude.
I'm sorry if it came out that way, it was not my intention. I just wanted to 
address the misinformation posted here.

I have not said and do not think that X is better than Y, just different.
And I have contributed to this project several times and I might very well 
continue to do so. I am not just an author of another tool.

--
 -=- Daniel Stenberg -=- http://daniel.haxx.se -=-
  ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol


Re: wget does not handle sizes over 2GB

2005-01-11 Thread Mauro Tortonesi
At 17:28 on Tuesday, 11 January 2005, you wrote:
> On Tue, 11 Jan 2005, Leonid wrote:
> >   curl does not survive losing connection. Since the probability to lose
> > connection when you download 2Gb+ files is very high even if you have a
> > fast connection,
>
> This mailing list is for wget, not curl. We can talk about what curl does
> and does not on the curl mailing list.

now, that's very funny indeed. aren't you the guy who keeps answering "wget 
doesn't do that, but you can use curl instead" when someone asks on this list 
for a feature wget does not support yet? ;-)

> It _does_ support request retrying and it does support download resuming - 
> it is however not a wget clone. Besides, curl already supports large file 
> transfers portably. Wget does not.

oh, come on. let's not fall into the "my software is better than yours" childish 
attitude. i have a very good opinion of curl, and the only reason i don't use 
it is that i have been using wget for more than 7 years and it has always 
satisfied all my needs. well, i also don't share your negative opinion of the 
GPL, but that's just a personal point of view and has nothing to do with 
the (very good) quality of your software.

that said, i don't have anything against sharing information and talking about 
design and programming practices with you and the curl developers on this or 
on the curl mailing list, UNLESS THIS GENERATES STUPID FLAME WARS. i don't 
like "my software is better than yours" flame wars. i think they're pathetic 
and i will not tolerate them on this list. instead, i strongly encourage 
positive sharing of information and experience, so that we can work on both 
wget and curl and make these already excellent tools even better.

i hope you agree with me on this point, daniel.

> In general, TCP connections _are not_ likely to disconnect no matter how
> much data you transfer; it is only likely to happen if you have a shaky
> network setup.

very right. but this does not mean that TCP connections don't break ;-)

-- 
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi

University of Ferrara - Dept. of Eng.     http://www.ing.unife.it
Institute of Human & Machine Cognition    http://www.ihmc.us
Deep Space 6 - IPv6 for Linux             http://www.deepspace6.net
Ferrara Linux User Group                  http://www.ferrara.linux.it


Re: wget does not handle sizes over 2GB

2005-01-11 Thread Leonid
Daniel,
  I apologize if I hurt your feelings about curl. Last summer I had
to download several 10Gb+ files and I tried to use curl and ncftp.
After a day or so of work curl would stop, freezing forever, and
I was unable to force it to retry and to resume. Maybe I misused curl,
did not understand the documentation, or I am a stupid guy. The bottom
line is that I was unable to download the data. I faced a dilemma: order
the tapes and pay several hundred bucks out of my own pocket, or patch wget.
  So I was left with the impression that if you have an unstable connection
(and I do, and the majority of users do), one which drops many times
during a download, there is no alternative but to use wget. The longer
the file, the longer the download time and the greater the probability
that the connection will be lost. Yes, you are right, this mailing list is
not about curl, but IMHO the main virtue of wget with respect to a myriad
of other clients, including curl, is that it works reliably on extremely
lousy and unreliable networks.
Leonid


Re: wget does not handle sizes over 2GB

2005-01-11 Thread Daniel Stenberg
On Tue, 11 Jan 2005, Leonid wrote:
>   curl does not survive losing connection. Since the probability to lose 
> connection when you download 2Gb+ files is very high even if you have a fast 
> connection,
This mailing list is for wget, not curl. We can talk about what curl does and 
does not on the curl mailing list. It _does_ support request retrying and it 
does support download resuming - it is however not a wget clone. Besides, curl 
already supports large file transfers portably. Wget does not.

In general, TCP connections _are not_ likely to disconnect no matter how much 
data you transfer; it is only likely to happen if you have a shaky network 
setup.

--
 -=- Daniel Stenberg -=- http://daniel.haxx.se -=-
  ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol


wget does not handle sizes over 2GB

2005-01-11 Thread Leonid
Denis,
   curl does not survive losing connection. Since the probability to
lose connection when you download 2Gb+ files is very high even if you
have a fast connection, and since I had to download 10+ Gb datasets
(not DVDs: data) routinely, nothing remained but to patch wget.
You can get the patched version at http://software.lpetrov.net/wget-LFS/
Leonid


Re: wget does not handle sizes over 2GB

2005-01-09 Thread Daniel Stenberg
On Sun, 9 Jan 2005, Denis Doroshenko wrote:
> *size = strtol (respline + 4, NULL, 0);
> where size is defined as "long int *" in the function's declaration. BTW,
> why is the base given to strtol "0", not "10"? isn't that too flexible for
> a defined protocol?
Yes it is; SIZE returns a base-10 number.
> The limitation seems to be hardwired throughout the source code, so it is
> not that simple for me
There is at least one (platform specific) patch floating around that 
introduces large file support to wget on Linux.

Until the situation is changed, I can recommend using curl for this kind of 
transfers. It supports large files on all platforms that do.

Having done lots of the adjustments in the curl code, I have to admit that the 
work (the transition to portable large file support) wasn't _that_ hard once 
we actually started working on it.
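
For the curious: the usual portable recipe (just a rough sketch of the general
large-file-support mechanism, not wget's or curl's actual code) is to build with
64-bit file offsets so that off_t and the related system calls stop being 32-bit:

/* Illustrative LFS sketch only -- not taken from wget or curl.
 * Defining _FILE_OFFSET_BITS=64 before any system header makes off_t,
 * stat(), lseek() etc. use 64-bit offsets on 32-bit Unix systems. */
#define _FILE_OFFSET_BITS 64

#include <stdio.h>
#include <sys/types.h>

int main (void)
{
  /* With LFS enabled this prints 8 even on 32-bit Linux, so a size like
     4683900928 fits in off_t without wrapping. */
  printf ("sizeof(off_t) = %zu\n", sizeof (off_t));
  return 0;
}

The tedious part is then keeping sizes in off_t (or another 64-bit integer)
instead of long everywhere in the code.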

--
 -=- Daniel Stenberg -=- http://daniel.haxx.se -=-
  ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol


wget does not handle sizes over 2GB

2005-01-09 Thread Denis Doroshenko
hello,

for a tool meant to be such a powerful downloader, the following seems at
least confusing:

file src/ftp-basic.c (function ftp_size, 1.9+cvs-dev, line 1153):

*size = strtol (respline + 4, NULL, 0);

where size is defined as "long int *" in the function's declaration.
BTW, why is the base given to strtol "0", not "10"? isn't that too
flexible for a defined protocol?

http.c has the same problem: the content length is defined as
"long contlen;" and is parsed with a similar "strtol (hdrval,
NULL, 10)" call.
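
Just to illustrate what I mean (a rough sketch of one possible approach, not a
patch against the real wget sources; parse_size is a made-up helper), parsing
the reply into a 64-bit type with strtoll and an explicit base of 10 would
avoid both problems:

/* Hypothetical sketch only: parse an FTP SIZE reply into a 64-bit value
 * so that files over 2 GB are representable.  Assumes C99 (<stdint.h>,
 * <inttypes.h>) and strtoll(). */
#include <errno.h>
#include <inttypes.h>
#include <stdio.h>
#include <stdlib.h>

static int
parse_size (const char *respline, int64_t *size)
{
  char *end;
  errno = 0;
  /* A SIZE reply looks like "213 4683900928": skip the 3-digit code and
     the space, and parse the number explicitly as base 10. */
  long long val = strtoll (respline + 4, &end, 10);
  if (errno == ERANGE || end == respline + 4 || val < 0)
    return -1;                  /* overflow or malformed reply */
  *size = (int64_t) val;
  return 0;
}

int main (void)
{
  int64_t size;
  if (parse_size ("213 4683900928", &size) == 0)
    printf ("parsed size = %" PRId64 "\n", size);  /* prints 4683900928 */
  return 0;
}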

The limitation seems to be hardwired throughout the source code, so it
is not that simple for me (not being familiar with the source) to
provide a diff. With DVD ISOs becoming more common, and being not only
bigger than 2 GB but often bigger than 4 GB, this seems to be a serious
problem. As an example I am providing the debug output of wget 1.9.1
downloading a file whose size is 4,683,900,928 bytes (yes, more than
4 GB, 'tis a DVD ISO). IP addresses, pathnames and filenames are
modified (sorry), but that has nothing to do with the problem. Notice
the size returned by the SIZE command and the size that wget reports as
"unauthoritative"... I am really unsure what wget will do when it
reaches the "unauthoritative" size; will it show negative values (like
Mozilla's FTP client)?..

C:\Documents and Settings\user\My Documents\Archive\Download>wget -c -d
ftp://xxx.yyy.zzz/apps/All.XXX.DVD/DVD1/all_xxx_dvd_1.iso

DEBUG output created by Wget 1.9.1 on Windows.

set_sleep_mode(): mode 0x8001, rc 0x8000
--18:13:28--  ftp://xxx.yyy.zzz/apps/All.XXX.DVD/DVD1/all_xxx_dvd_1.iso
   => `all_xxx_dvd_1.iso'
Resolving xxx.yyy.zzz... seconds 0.00, aaa.bbb.ccc.ddd
Caching xxx.yyy.zzz => aaa.bbb.ccc.ddd
Connecting to xxx.yyy.zzz[aaa.bbb.ccc.ddd]:21... seconds 0.00, connected.
Created socket 724.
Releasing 008928A0 (new refcount 1).
Logging in as anonymous ... 220 guess my name

--> USER anonymous

331 Please specify the password.

--> PASS -wget@

230 Login successful.
Logged in!
==> SYST ...
--> SYST

215 UNIX Type: L8
done.==> PWD ...
--> PWD

257 "/"
done.
==> TYPE I ...
--> TYPE I

200 Switching to Binary mode.
done.  changing working directory
Prepended initial PWD to relative path:
   pwd: '/'
   old: 'apps/All.XXX.DVD/DVD1'
  new: '/apps/All.XXX.DVD/DVD1'
==> CWD /apps/All.XXX.DVD/DVD1 ...
--> CWD /apps/All.XXX.DVD/DVD1

250 Directory successfully changed.
done.
==> SIZE all_xxx_dvd_1.iso ...
--> SIZE all_xxx_dvd_1.iso

213 4683900928
done.
==> PORT ... Master socket fd 712 bound.
using port 4027.

--> PORT eee,fff,ggg,hhh,15,187

200 PORT command successful. Consider using PASV.
done.==> REST 20689712 ...
--> REST 20689712

350 Restart position accepted (20689712).
done.
==> RETR all_xxx_dvd_1.iso ...
--> RETR all_xxx_dvd_1.iso

150 Opening BINARY mode data connection for all_xxx_dvd_1.iso
(4683900928 bytes).
done.
Created socket fd 704.
Length: 388,933,632 [368,243,920 to go] (unauthoritative)

 5% [+>   ] 22,811,09237.54K/s  ETA 2:25:59
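
Incidentally, the "Length: 388,933,632" above is exactly the 4,683,900,928-byte
size wrapped at 32 bits. A tiny stand-alone snippet (purely illustrative, not
wget code) reproduces both figures from the log:

/* Purely illustrative: show how a 4 GB+ size wraps when squeezed into an
 * unsigned 32-bit quantity, reproducing the figures reported above. */
#include <inttypes.h>
#include <stdio.h>

int main (void)
{
  int64_t real_size = 4683900928LL;          /* what SIZE reported         */
  int64_t restart   = 20689712LL;            /* the REST offset in the log */
  uint32_t wrapped  = (uint32_t) real_size;  /* truncated to 32 bits       */

  printf ("wrapped length: %" PRIu32 "\n", wrapped);  /* 388933632 */
  printf ("still to go:    %" PRIu32 "\n",
          wrapped - (uint32_t) restart);              /* 368243920 */
  return 0;
}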