Hello... I have written a small script that uses lynx to retrieve data from an 
external web site, then parses and formats parts of that data, finally writing 
out a script for a JavaScript routine that displays the data on the local web 
page.  The Perl script is called via an #exec cmd directive in the local HTML, 
followed by the <script> block that acts on the file Perl has written.
While this works flawlessly 99% of the time, there are occasions when the 
remote site is down, or at least very busy and slow to respond.  At those 
times, loading of the local page is delayed until the remote server is 
reached and the data is received, or until the natural timeout expires.  Is 
there a way, using lynx, wget, or some other method, to specify a timeout so 
that if the remote site doesn't respond within a given amount of time, the 
Perl script can continue on?
It will be easy enough to have the Perl script emit a 'data not available' 
sort of message to satisfy the JavaScript routine if a timeout occurs and the 
remote data isn't received.  It's a rare occurrence, but when it happens I'd 
prefer a 'no data' message to a delay of up to two minutes while the local 
page waits for things to happen.  Here are the lines I presently have for 
retrieving the data:


my $lynx = "/usr/bin/lynx";
open(DATAIN, "$lynx -source http://remotesite.com/remotedata |")
    or die "can't run lynx: $!";
chomp(my @lines = <DATAIN>);
close(DATAIN);
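For comparison, here is a rough sketch of the kind of thing I'm after, using 
wget's --timeout (-T) and --tries (-t) options in place of lynx.  The URL is 
the placeholder from above, the 5-second limit is an arbitrary choice, and the 
function name is just mine for illustration:

```shell
#!/bin/sh
# Fetch a URL with a hard timeout; print a placeholder on failure.
fetch_or_placeholder() {
    url=$1
    # -T 5 : give up after 5 seconds per network operation
    # -t 1 : try only once (wget retries several times by default)
    # -q -O - : quiet, write the retrieved page to stdout
    wget -T 5 -t 1 -q -O - "$url" || echo "data not available"
}
```

The Perl script could run a command like this via the same piped open and 
always get *something* back quickly.  (Some lynx builds also accept a 
-connect_timeout=N switch, which might let me keep lynx instead of switching 
to wget.)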

Any suggestions are greatly appreciated!




