Always requesting keep-alive is not a bug: when the Wget process
exits, all of its TCP connections to the remote servers are closed
automatically.

Now, if Wget hangs when talking to a web server that supports
keep-alive connections, that's a different bug.  In that case the
question is not "why does Wget request keep-alive on the last file?",
but "why does Wget hang in the first place?"

Do you have a server with which you can repeat this?  Any working
example, either a running server or a trivial HTTP::Daemon script
(a rough sketch of one follows the transcript below), would be
appreciated.  For reference, it works with Apache 2:

$ wget -S www.apache.org
--17:47:40--  http://www.apache.org/
           => `index.html'
Resolving www.apache.org... 192.87.106.226
Connecting to www.apache.org|192.87.106.226|:80... connected.
HTTP request sent, awaiting response...
  HTTP/1.1 200 OK
  ...
  Server: Apache/2.0.54 (Unix) mod_ssl/2.0.54 OpenSSL/0.9.7a DAV/2 SVN/1.2.0-dev
  Keep-Alive: timeout=5, max=100
  Connection: Keep-Alive
Length: 11,540 (11K) [text/html]

100%[=============================================================>] 11,540        42.87K/s

17:47:41 (42.80 KB/s) - `index.html' saved [11540/11540]

$ 
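In case it helps with reproducing, here is a rough sketch of the kind of
trivial HTTP::Daemon script mentioned above.  The port number, response
body, and script name are arbitrary choices for illustration;
HTTP::Daemon speaks HTTP/1.1, so connections can stay open across
requests, which should exercise the same keep-alive path.

#!/usr/bin/perl
# Minimal HTTP::Daemon test server (sketch; port and body are arbitrary).
use strict;
use warnings;
use HTTP::Daemon;
use HTTP::Response;
use HTTP::Status qw(RC_OK RC_FORBIDDEN);

my $d = HTTP::Daemon->new(LocalPort => 8080, ReuseAddr => 1)
    or die "can't start daemon: $!";
print "Point wget at: ", $d->url, "\n";

while (my $c = $d->accept) {
    # Serve successive requests on the same (persistent) connection.
    while (my $r = $c->get_request) {
        if ($r->method eq 'GET') {
            my $res = HTTP::Response->new(RC_OK);
            $res->header('Content-Type' => 'text/html');
            $res->content("<html><body>hello</body></html>\n");
            $c->send_response($res);
        }
        else {
            $c->send_error(RC_FORBIDDEN);
        }
    }
    $c->close;
    undef $c;
}

Running "wget -S http://localhost:8080/" against something like this
would show whether the hang reproduces outside Apache.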

