I think the issue is that System.currentTimeMillis() isn't the most exact way to 
determine any kind of real benchmark. I think it's probably something to do with lots 
of threads, and/or garbage collection, and/or so many network connections being 
processed simultaneously on your system.
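
For what it's worth, here's a tiny standalone sketch (hypothetical, not part of your 
monitoring code) that shows how coarse the System.currentTimeMillis() clock can be on 
some platforms -- and keep in mind that any GC pause or thread-scheduling delay lands 
entirely inside whichever interval you happen to be measuring at that moment:

    // Hypothetical standalone test: prints the smallest step the
    // System.currentTimeMillis() clock makes on this machine.
    public class ClockGranularity {
        public static void main(String[] args) {
            long last = System.currentTimeMillis();
            for (int i = 0; i < 10; i++) {
                long next;
                while ((next = System.currentTimeMillis()) == last) {
                    // busy-wait until the reported time changes
                }
                System.out.println("clock stepped by " + (next - last) + " ms");
                last = next;
            }
        }
    }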
 
3000 ms is still only 3 seconds. And since you said you're just trying to monitor 
whether sites are up or not, not their performance, I think you're trying to make an 
issue where there isn't one. 
 
Just set your timeout to, say, 60 seconds and things should be OK :).
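
For example, something like this should do it (a sketch only -- I'm assuming the 
Commons HttpClient 2.x API your quoted code uses; if you keep calling 
method.execute(state, connection) yourself, the HttpConnection class has 
setConnectionTimeout()/setSoTimeout() setters that do the same job):

    // Sketch, assuming Commons HttpClient 2.x: let the HttpClient class manage
    // the connection and set both timeouts on it.
    HttpClient client = new HttpClient();
    client.setConnectionTimeout(60000); // max ms to establish the TCP connection
    client.setTimeout(60000);           // max ms to block on a socket read (SO_TIMEOUT)
    client.executeMethod(method);       // instead of method.execute(state, connection)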
 
Mark

        -----Original Message----- 
        From: Michael Mattox [mailto:[EMAIL PROTECTED] 
        Sent: Thu 7/3/2003 1:47 PM 
        To: [EMAIL PROTECTED] 
        Cc: 
        Subject: Occasional long download times
        
        

        I'm experiencing something weird and I just want to see if anyone else has
        experienced it, and whether it might be something I'm doing.  Basically my
        application monitors 700+ websites every 5 minutes and measures how long it
        takes to connect and download each one.  The main goal is to verify that a
        site is working, so I don't need exact precision on the times.  Here's some
        of my code to time the download:
        
        // Build the GET request and its retry policy (Jakarta Commons HttpClient)
        method = new GetMethod(uri.toString());
        method.setFollowRedirects(true);
        method.setHttp11(false);
        DefaultMethodRetryHandler retry = new DefaultMethodRetryHandler();
        retry.setRequestSentRetryEnabled(true);
        retry.setRetryCount(3);
        method.setMethodRetryHandler(retry);

        // Time the whole request/response with the wall clock
        start = System.currentTimeMillis();
        method.execute(state, connection);
        msi.setDuration(System.currentTimeMillis() - start);
        
        
        What I see is that normally I get download times of around 150ms, and then
        occasionally (4-5 times a day) I see a download time of 3000ms.  It happens
        to the majority of the websites, so I don't believe it's a particular site.
        So it must be either my application or the network.  My application uses a
        thread pool and always has multiple threads running (typically 8 at a time
        on a 4-CPU machine that's also running Tomcat and Postgres), and I've seen
        that at exactly the same time one website shows a 3000ms download time,
        several others show normal 150ms times.  So this seems to rule out the
        network.  I set all my threads to MAX_PRIORITY to minimize interruptions.
        Are there any other explanations?  Any ideas what I can do about it?  My
        current thought is to add some code so that if the download time is more
        than 10x the previous time, the download is repeated to make sure.  That
        way our customers wouldn't see the huge spike in the numbers, but if the
        spike really should be there I don't want to cover it up either.
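
        A rough sketch of that idea (hypothetical names -- timeDownload() just
        stands in for the timed execute() call above, and lastDuration is the
        previous sample recorded for this site):

        // Hypothetical sketch of the re-check idea, not real API.
        long duration = timeDownload(method, state, connection);
        if (lastDuration > 0 && duration > lastDuration * 10) {
            // Looks like a spike: measure once more so a one-off hiccup on the
            // monitoring box isn't reported, while a real slowdown still shows
            // up because the repeated download will also be slow.
            duration = timeDownload(method, state, connection);
        }
        msi.setDuration(duration);
        lastDuration = duration;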
        
        Thanks,
        Michael
        
        
        
        
        
