-BEGIN PGP SIGNED MESSAGE-
Hash: SHA256
moisi wrote:
> Hello,
>
> I use wget to retrieve an XML feed. The problem is that sometimes I
> get a timeout error. I used the appropriate wget options to handle this
> problem, but when wget retries the download, the data
Hello,
I use wget to retrieve an XML feed. The problem is that sometimes I get a
timeout error. I used the appropriate wget options to handle this problem, but
when wget retries the download, the data are appended to the file, which
produces a corrupt file. Here is the wget command line:
wget -c
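A workaround sketch (not from the original thread): let the shell drive the retries and write each attempt to a temporary file, so a failed attempt never leaves partial data appended to the real feed. The URL, file names, and function name are placeholders.

```shell
# Hypothetical workaround: retry in the shell, download to a temp file,
# and only replace feed.xml after a complete, successful download.
FEED_URL='http://example.invalid/feed.xml'

fetch_feed() {
    for attempt in 1 2 3; do
        # -t 1: one internal try per loop pass; -T 30: 30-second timeouts
        if wget -q -t 1 -T 30 -O feed.xml.tmp "$FEED_URL"; then
            mv feed.xml.tmp feed.xml   # swap in the complete file
            return 0
        fi
        rm -f feed.xml.tmp             # discard partial data before retrying
    done
    return 1
}
```

Calling `fetch_feed` then updates feed.xml only on success, so a timeout mid-transfer can never corrupt the file that consumers read.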
Reynir (Cc'd) writes:
> Wget 1.10.2 on a Windows98 box will occasionally time out resolving a
> host (I've set all time-outs to 45s, being stuck with a modem) and
> hang. The -d switch adds nothing useful.
and has also reproduced this on 1.11.
Is it possible there are issues with TerminateThrea
Micah Cowan wrote:
> Todd Plessel wrote:
>> Q2. If not, then could the PERL-CGI script be modified to spawn a
>> thread that writes an ack to stderr to keep the httpd from timing-out?
>> If so, can you point me to some sample code?
>
> This would be
Todd Plessel wrote:
> Q1. Is there a way that I can run wget that somehow avoids this
> timeout. For example, by sending an out-of-band ack to stderr every
> 30 seconds so httpd does not disconnect.
> By out-of-band, I mean it cannot be
Problem:
I'm using
wget -q -T 0 -O - 'http://some.remote.host/cgi-bin/some_script?...'
to access a PERL-CGI script on a remote
computer running Apache httpd that is configured with a 300 second
timeout.
The script sometimes takes more than 300 seconds to begin sending
data (b
=> `elenakosilova.narod.ru/studia3/levinas/l2.htm'
Connecting to elenakosilova.narod.ru[213.180.199.22]:80... connected.
HTTP request sent, awaiting response... 504 Proxy Timeout ( The
connection timed out. For more information about this event, see ISA
Server Help. )
The file is
Hi,
I'm having a problem while downloading from a Microsoft FTP server.
The problem is that the connection times out and is closed while downloading;
wget then retries the download, but receives a "file not found"
error.
Is this a problem with the MS server or with wget?
Here is t
Oops, the problem reappeared. I compiled a debug version and got it to
crash. Here's a screenshot of a VS2003 debug session attached. It's
also available by this URL: http://home.shad.pp.ru/tmp/wgetcrash.png
It crashed with commandline:
wget.exe --spider --timeout=30 mail.ru
I'm usin
Hi! I need wget to quit trying to download a file if it encounters any problem more than 2 times. So I specified a connection timeout, a DNS timeout, and a retry count of 2. This works fine: if a connection cannot be established within two tries, wget exits. Now the problem is, if the network
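For reference, later versions of GNU Wget expose a separate knob for each network phase, which is the usual way to bound total retry effort. A sketch (the URL is a placeholder):

```shell
# Bound every network phase and cap retries, then inspect the exit status.
# Flag names are from the GNU Wget documentation; the URL is a placeholder.
wget --tries=2 --dns-timeout=5 --connect-timeout=5 --read-timeout=30 \
     -q -O /dev/null 'http://example.invalid/'
rc=$?
echo "wget exited with status $rc"   # nonzero means wget gave up within bounds
```

With these flags no single phase (DNS lookup, connect, or read) can stall past its own limit, and `--tries=2` keeps the overall attempt count bounded.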
Got stable branch this time. Seems like the problem with dns-timeout
has been fixed.
On 4/6/06, burning shadow <[EMAIL PROTECTED]> wrote:
> I checked out SVN repository for the latest sources and tried to build
> it with VS .NET 2003. I got this error:
>
> utils.c(46) : fatal e
Denis Solovyov <[EMAIL PROTECTED]> writes:
> Is it true or false that if --connect-timeout is set to a value
> larger than timeout implemented by system libraries, it will make no
> sense because system timeout will take precedence (i.e. it will
> happen earlier than wget's
Is it true or false that if --connect-timeout is set to a value larger
than timeout implemented by system libraries, it will make no sense
because system timeout will take precedence (i.e. it will happen
earlier than wget's internal timeout)?
(I believe that it will be the
burning shadow wrote:
The wget win32 Visual C++ build crashes with the --timeout option.
example command line:
wget.exe --timeout=30 --spider mail.ru
Sometimes it works, sometimes it crashes. It seems to crash while trying
to resolve the address. I get this output:
=== cut ===
--23:08:45-- http://mail.ru
Hi,
I am using wget to retrieve a large file. Lately after downloading
about 20% I get
"Data connection: Connection timed out; Control connection closed.
Retrying."
The second try of wget fails (no error is reported: exit status is 0).
The second login is successful, but now the file cannot be foun
Hi there
I have hit a problem with wget (1.10.2). I am using wget over the FTP
protocol, with the -w 5m option, to get some files from a directory on the
FTP server (but NOT from the "/" of the FTP server). When the remote FTP
server has a 3m timeout on the control connection, after the
Thanks for the report; I believe this bug is fixed in Wget's
subversion repository.
Hi
I am trying to download an ISO from ftp.suse.com and get many timeouts.
Because of this, I started wget v1.10 with "--read-timeout",
and then a segmentation fault happens.
I have searched for the location and have - hopefully - found it:
8< ---
block, at the end
of the file, for 15 minutes. Figuring that this was a timeout problem of some
form I had a look, and sure enough, there it was, sitting on a select to read
more information. Why would it be doing that I thought?
Digging through the code I came across the following:
In retr.c
I was wrong to say that the timeout does not work. The problem is that the
-O option does not set the number of retries to 1 as the manpage claims.
If you then run with -q, it looks like the timeout is simply not working.
NHA
--
Norman H. Azadian, Taegerishalde 13, CH-3110 Muensingen
Using wget in a cygwin bash shell from Win2K Pro. All other browsing works fine
with other clients. Where can I start looking? Thanks.
$ wget -d http://www.google.com
DEBUG output created by Wget 1.9.1 on cygwin.
--18:04:18-- http://www.google.com/
=> `index.html'
Resolving www.googl
On Saturday, November 29, 2003 at 4:15:19 PM +0100, Hrvoje Niksic wrote:
> Alain Bench <[EMAIL PROTECTED]> writes:
>> I sometimes seem to be stuck in an overly long (like more than 1
>> hour) timeout on closing connection
> during the kernel close() call? Did you confirm
Alain Bench <[EMAIL PROTECTED]> writes:
> Wget 1.9.1: I sometimes seem to be stuck in an overly long (like
> more than 1 hour) timeout on closing connection, when the server went
> down or the modem hung up during a read or just before close. I use
> Wget's default timeout (0
Hello,
Wget 1.9.1: I sometimes seem to be stuck in an overly long (like
more than 1 hour) timeout on closing connection, when the server went down
or the modem hung up during a read or just before close. I use Wget's default
timeout (0, 0, 900), or sometimes --timeout=30 (30, 30, 30), and
under
OK, I see.
But I do not agree.
And I don't think it is a good idea to treat the first download specially.
In my opinion, exit status 0 means "everything during the whole
retrieval went OK".
My preferred solution would be to set the final exit status to the highest
exit status of all individual downl
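The policy proposed above ("highest exit status of all individual downloads wins") can be emulated today with a small shell wrapper; a sketch, with a hypothetical function name and placeholder URLs:

```shell
# Sketch: aggregate exit statuses across several downloads so the final
# status is the worst one seen, not just the last one.
fetch_all() {
    worst=0
    for url in "$@"; do
        wget -q -t 1 -T 10 "$url"
        rc=$?
        if [ "$rc" -gt "$worst" ]; then
            worst=$rc                # remember the worst failure so far
        fi
    done
    return "$worst"                  # 0 only if every download succeeded
}
```

`fetch_all url1 url2 ...` then returns 0 only when every fetch succeeded, which is the "everything during the whole retrieval went OK" semantics argued for above.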
This problem is not specific to timeouts, but to recursive download (-r).
When downloading recursively, Wget expects some of the specified
downloads to fail and does not propagate that failure to the code that
sets the exit status. This unfortunately includes the first download,
which should prob
Hi,
doing the following:
# /tmp/wget-1.9-beta3/src/wget -r --timeout=5 --tries=1
http://weather.cod.edu/digatmos/syn/
--11:33:16-- http://weather.cod.edu/digatmos/syn/
=> `weather.cod.edu/digatmos/syn/index.html'
Resolving weather.cod.edu... 192.203.136.228
Conne
Hi,
please use the wget command as follows:
wget -t 1 -T 1 1.1.1.1
or any other inaccessible address.
After execution of this command I have the following messages:
--08:57:37-- http://1.1.1.1/
=> `index.html'
Connecting to 1.1.1.1:80...
and after a much longer period than 1 se
wget -gON -t3 -N -w60 -T10 -c --passive-ftp ftp://[EMAIL PROTECTED]/lastday/*.*
Wget: BUG: unknown command `timeout', value `10'
Sometimes wget can fall asleep. It would be nice to have a normal timeout.
Hello,
I tried to use the --timeout parameter with wget but it never changes anything,
and the connection keeps pending for a long time (between 3 minutes and
more than 30 minutes on different machines).
Is it a bug?
Thank you in advance
Alex
wget 1.7 on linux rh 7.2
hello,
I have a very simple question.
I am writing a script that utilizes wget of course.
All I want to accomplish is for wget to exit very quickly
when a URL is unreachable (like the test-bogus one below, for example).
None of the following tries (A thru C) and their variati
The --timeout switch doesn't work for me. Instead of waiting 5 seconds
as indicated below, Wget uses the default value of 900 seconds. I use
a dialup service and my script just sits there doing nothing for
15 minutes.
It looks to me like the only time this is a problem is when the site
On Mon, Apr 15, 2002 at 04:20:58AM +0200, Hrvoje Niksic wrote:
>As suggested by Alan E, this patch extends the meaning of "timeout" to
>include DNS lookups. After this patch, I can't think of any network
>operation still allowed to take more than the specified timeou
Here is the much awaited connect timeout patch. Although the alarm()
implementation should be the simplest and most portable one, there is
still a portability catch: longjumping out of a signal handler will on
some OS'es leave the signal blocked. You must either explicitly
unblock it
"Christopher H. Taylor" <[EMAIL PROTECTED]> writes:
> Any ETA on when you're going to add a timeout alarm to the connect()
> function? I'm running 1.8.1 and still have the same problem. Many of
> my applications that utilize wget are time critical and I'
Any ETA on when you're going to add a timeout alarm to the
connect() function? I'm running 1.8.1 and still have the same
problem. Many of my applications that utilize wget are time critical
and I'm anxiously awaiting this fix. Thanks for your reply.
Jamie Zawinski <[EMAIL
Warwick Poole <[EMAIL PROTECTED]> writes:
> I want to set a timeout of 5 seconds on a wget http fetch. I have
> tried -T --timeout etc in the command line and in a .wgetrc
> file. wget does not seem to obey these directives.
You have probably encountered the problem that Wget
Thanks for the report. There are plans to fix that problem for the
next release.
Alan Eldridge <[EMAIL PROTECTED]> writes:
> On Thu, Feb 21, 2002 at 03:19:18PM +0100, Hrvoje Niksic wrote:
>>Jamie Zawinski <[EMAIL PROTECTED]> writes:
>>
>>> Please also set an exit alarm around your calls to connect() based
>>> on the -T option.
>>
>>This is requested frequently. I'll include
On Thu, Feb 21, 2002 at 03:19:18PM +0100, Hrvoje Niksic wrote:
>Jamie Zawinski <[EMAIL PROTECTED]> writes:
>
>> Please also set an exit alarm around your calls to connect() based
>> on the -T option.
>
>This is requested frequently. I'll include it in the next release.
>
>The reason why it's not
Jamie Zawinski <[EMAIL PROTECTED]> writes:
> Please also set an exit alarm around your calls to connect() based
> on the -T option.
This is requested frequently. I'll include it in the next release.
The reason why it's not already there is simply that I was lucky never
to be bitten by that pro
On Thursday 21 February 2002 05:44, Ian Abbott wrote:
> On 21 Feb 2002 at 1:31, Alan Eldridge wrote:
> > You can't get it to work for timing out a socket connection, because
> > that is a bit of code that hasn't been implemented yet.
> >
> > If no one else wants to, I can work up a patch for this
On 21 Feb 2002 at 1:31, Alan Eldridge wrote:
> You can't get it to work for timing out a socket connection, because
> that is a bit of code that hasn't been implemented yet.
>
> If no one else wants to, I can work up a patch for this next week.
> It's pretty standard coding, right out of Stevens
On Wed, Feb 20, 2002 at 10:20:38PM -0800, Partycrew Industries wrote:
>I might be wrong but I believe there is a bug in the
>--timeout=whatever syntax. I just can't get the
>program to obey it under any circumstances, I put in
No, that is not a correct statement. The program obe
I might be wrong, but I believe there is a bug in the
--timeout=whatever syntax. I just can't get the
program to obey it under any circumstances. I put in
an IP that I know is non-existent and it takes
forever to "figure that out". I've even tried
changing it in init.c wi
The -T option seems to apply only to read() -- not to connect().
That's pretty worthless.
If I specify -t1 -T5, and the host is not responding (in connect()),
then wget takes a full *15 minutes* to time out. This makes it
somewhat less than useful when you're trying to do smart things
in shell
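Until wget grew a connect timeout, the usual workaround for this was an external watchdog that kills the whole process. A sketch, assuming GNU coreutils `timeout` is available (the URL is a placeholder):

```shell
# Kill wget from the outside after 5 seconds, which covers a connect()
# stuck on a non-responding host. Exit status 124 is the GNU coreutils
# 'timeout' convention for "the watchdog fired".
timeout 5 wget -q -t 1 -O /dev/null 'http://example.invalid/'
case $? in
    0)   echo "download completed" ;;
    124) echo "watchdog killed wget (stuck in connect?)" ;;
    *)   echo "wget itself reported an error" ;;
esac
```

This keeps shell scripts responsive even when wget's own -T option does not cover the phase that hangs.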
Hi
I am not on the wget list anymore, so please could somebody reply to me
personally. I have found a few people asking the same question in the list
archives, but not found an answer
I want to set a timeout of 5 seconds on a wget http fetch. I have tried -T
--timeout etc in the command line
In wget 1.8 the call
wget -t 1 --timeout=3 http://195.49.84.148/
does not time out after 3 seconds because connecting to that IP (currently)
hangs. strace output:
socket(PF_INET, SOCK_STREAM, IPPROTO_IP) = 3
connect(3, {sin_family=AF_INET, sin_port=htons(80),
sin_addr=inet_addr
l maintained wget. Last time Dan succeeded in tracking
him down and becoming a maintainer. It should be possible again if
somebody is interested in the job. Hopefully there would be more than
three committers so it's more protected against changing work
situations.
>
> Anyway, if I would write
owing and offering support.
Anyway, if I were to write a patch for this, would the -T timeout value be
the one to use even for connects? "set the read timeout to SECONDS" doesn't
quite sound like what we want here; we want a "set the connect timeout to
SECONDS"... but wou
anagement interface (based on CVS)
> for changing the wget tree.
>
> >$ wget --help
> >GNU Wget 1.7, a non-interactive network retriever.
> >Usage: wget [OPTION]... [URL]...
> >Download:
> > -t, --tries=NUMBER set number of retries to NUMBER (0
> >
>$ wget --help
>GNU Wget 1.7, a non-interactive network retriever.
>Usage: wget [OPTION]... [URL]...
>Download:
> -t, --tries=NUMBER set number of retries to NUMBER (0
> unlimits).
> -T, --timeout=SECONDS set the
om the GNU site.
[List] how can it be added?
>
> I've discovered that wget doesn't do connection timeouts. That is if the
> host it is trying to connect to cannot be reached for some reason then wget
> simply hangs.
>
> I expected wget to return after T seconds af
On Wed, 7 Nov 2001, Nic Ferrier wrote:
> I've discovered that wget doesn't do connection timeouts. That is if the
> host it is trying to connect to cannot be reached for some reason then
> wget simply hangs.
>
> I expected wget to return after T seconds after specifyin
annot be reached for some reason then wget
simply hangs.
I expected wget to return after T seconds after specifying the timeout
option on the command line but it didn't.
No control of connect timeouts is a serious weakness in a tool designed to
be used for batched downloads... I've had
=[IP]" -t 1 -T 15 --cache=off -r -l 1 --delete-after URL
>
> As I know, these options tell wget to:
> - use proxy on [IP]
> - retry only once
> - set timeout to 15 seconds
> - set proxy cache off
> - use one level recursion and delete after download URL
>
> The prob
t to:
- use proxy on [IP]
- retry only once
- set timeout to 15 seconds
- set proxy cache off
- use one level recursion and delete after download URL
The problem is, when the proxy is not accessible on [IP], the timeout is 15
minutes, even if I set the timeout on the command line! As I noticed, the
timeout works
--timeout is ignored during the connection to the proxy server. This
means that if I want a 20 second timeout (I do), then it will hang
indefinitely if the timeout is while connecting to the proxy.
Any fixes for this available? Any quick fixes you can send?
Thanks!
Mike
Hello!
Does the timeout option -T work OK in wget 1.6?
Best Regards
-Johannes Harju-
59 matches