On 11/01/2005, at 17:28, Daniel Stenberg wrote:

On Tue, 11 Jan 2005, Leonid wrote:

curl does not survive losing the connection. Since the probability of losing the connection when you download 2 GB+ files is very high even if you have a fast connection, [...]

This mailing list is for wget, not curl. We can talk about what curl does and does not do on the curl mailing list.

Here is a list of recent postings to this list by Daniel Stenberg:

9 January: "Until the situation is changed, I can recommend using curl for this kind of transfers. It supports large files on all platforms that do."

1 December: "AFAIK, wget doesn't support it. But curl does: curl.haxx.se"

1 November: "Consider using libcurl"

1 October: "Until this is implemented, you may find it useful to know that curl supports this option"

10 September: "Allow me to mention that curl groks large files too."

It's very funny that the wget developers have silently tolerated these ongoing advertisements for a competing product on the wget list, yet the *very first time* someone makes a comment about curl that Daniel doesn't like, he leaps in and tries to tell us what the list is and isn't for. To be consistent, Daniel really needs to do one of two things: (1) stop plugging curl on the wget list; or (2) stop trying to suppress the free speech and opinions of others by enforcing a hypocritical double standard about what can and can't be said.

For what it's worth, I agree with Leonid. For getting large files, or files that are likely to require multiple automated retries, I've always preferred wget.
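For the curious, a minimal sketch of the sort of invocation I mean (the URL is just a placeholder, and the flag values are a matter of taste):

    wget --continue --tries=20 --waitretry=10 --retry-connrefused \
         http://example.com/path/to/huge-file.iso

--continue resumes a partial download where it left off instead of starting over, --tries caps the number of retry attempts, --waitretry waits progressively longer between failed attempts, and --retry-connrefused treats "connection refused" as a transient error worth retrying.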
