At 14:06 on Wednesday, 12 January 2005, Wincent Colaiuta wrote:
> On 11/01/2005, at 17:28, Daniel Stenberg wrote:
> > On Tue, 11 Jan 2005, Leonid wrote:
> >>   curl does not survive losing connection. Since the probability to
> >> lose connection when you download 2Gb+ files is very high even if you
> >> have a fast connection,
> >
> > This mailing list is for wget, not curl. We can talk about what curl
> > does and does not on the curl mailing list.
>
> Here is a list of recent postings to this list by Daniel Stenberg:
>
> 9 January: "Until the situation is changed, I can recommend using curl
> for this kind of transfers. It supports large files on all platforms
> that do."
>
> 1 December: "AFAIK, wget doesn't support it. But curl does:
> curl.haxx.se"
>
> 1 November: "Consider using libcurl"
>
> 1 October: "Until this is implemented, you may find it useful to know
> that curl supports this option"
>
> 10 September: "Allow me to mention that curl groks large files too."
>
> It's very funny that the wget developers have silently tolerated these
> ongoing advertisements for a competing product on the wget list

Why shouldn't we have? I don't think "competition" with curl is bad at all! 
OK, curl and wget are two different pieces of software with some common 
features, but this is the open source community, not the closed-source 
software business. If curl has a large number of users, that does not mean 
wget cannot have a large user community as well, and vice versa.

On the contrary, I think that by exchanging information and helping each 
other, the wget and curl developers can make both of the above-mentioned 
programs better.

> but the *very first time* someone makes a comment about curl that Daniel
> doesn't like, he leaps in and tries to tell us what the list is and
> isn't for.

I am sure Daniel didn't want to be rude, and in any case he has already 
apologized for writing a mail that could be misunderstood.

> For what it's worth, I agree with Leonid. For getting large files or
> files which are likely to require multiple automated retries I've
> always preferred wget.

This is just your personal preference; it does not mean that curl is bad 
software. Not at all!
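
Just to make the comparison concrete, a minimal sketch of the kind of 
invocation people usually mean when they talk about resuming large downloads 
with automatic retries (the URL and the retry values are only placeholder 
examples, not anything from this thread):

  # resume a partial download, retrying up to 20 times with a 10-second
  # pause between attempts (URL is just an illustrative placeholder)
  wget -c --tries=20 --waitretry=10 http://example.com/large-file.iso

Whether this or curl's equivalent fits better really depends on one's own 
workflow, which is exactly why a preference is not a verdict on either tool.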

Please, let's stop this discussion before it turns into a flame war or an 
unpleasant exchange of personal opinions. This mailing list was created to 
help wget users (and in that sense, Daniel has simply given wget users 
suggestions on how to solve their problems) and to coordinate the development 
effort behind wget. So let's just use the mailing list for these two 
purposes, OK? ;-)

-- 
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi

University of Ferrara - Dept. of Eng.    http://www.ing.unife.it
Institute of Human & Machine Cognition   http://www.ihmc.us
Deep Space 6 - IPv6 for Linux            http://www.deepspace6.net
Ferrara Linux User Group                 http://www.ferrara.linux.it
