Eric,

I can only offer some general advice. I do something similar to get
statistics on FTP transfers. Here are some notes:

1. Since I have multiple events (2 per transfer, i.e. start and end
times) and multiple transfers per session, I define an array to hold the
various times, i.e. @Time.

2. For each event, push(@Time,time);
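
As a side note, the built-in time() only has one-second resolution,
which is coarse for fast transfers. The core Time::HiRes module can
export a sub-second time() as a drop-in replacement. A minimal sketch:

```perl
use strict;
use warnings;
use Time::HiRes qw(time);   # overrides time() with sub-second resolution

my @Time;
push(@Time, time);          # event 1: e.g. transfer start
# ... perform the transfer here ...
push(@Time, time);          # event 2: e.g. transfer end

my $Elapsed = $Time[1] - $Time[0];
printf "Elapsed: %.6f seconds\n", $Elapsed;
```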

3. To calculate the elapsed time and transfer rate, where $PDF_File and
$SNI_File contain the filenames of the respective files, the first time
argument is the end time, and the second is the start time:

    # Calculate usage statistics.
    my ($PDF_Time, $PDF_Rate) = @{Stats($PDF_File,$Time[1],$Time[0])};
    my ($SNI_Time, $SNI_Rate) = @{Stats($SNI_File,$Time[3],$Time[2])};

4. The subroutine that calculates the statistics:

#--------------------------------------------------------------------------#
# Compute transfer statistics.
#
#--------------------------------------------------------------------------#
sub Stats($$$)
{
    my $File  = shift;
    my $Time1 = shift;   # End time.
    my $Time2 = shift;   # Start time.
    my $Time  = $Time1 - $Time2;
    my $Size  = -s $File;
    my $Rate  = $Size;
    if ($Time > 0) {$Rate = $Size / $Time}
    $Rate = sprintf('%.2f',($Rate / 1024));   # KB/sec.
    $Log->WriteLog("File:      $File",0);
    $Log->WriteLog("Time1:     $Time1",0);
    $Log->WriteLog("Time2:     $Time2",0);
    $Log->WriteLog("File Time: $Time",0);
    $Log->WriteLog("File Size: $Size",0);
    $Log->WriteLog("File Rate: $Rate",0);
    # Convert the elapsed seconds to HH:MM:SS.
    my $Hours   = int($Time / 3600);
    my $Minutes = int(($Time % 3600) / 60);
    my $Seconds = $Time % 60;
    $Time = join(':',Lpad($Hours,'0',2),Lpad($Minutes,'0',2),Lpad($Seconds,'0',2));
    $Log->WriteLog("File Time: $Time",0);
    return([$Time,$Rate]);
}

Note that the above subroutine was designed with transfers that take
less than 24 hours in mind; for transfers that would take longer than
that, some other date/time manipulation would be required.
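
Also note that Lpad above is a local helper not shown here. Assuming it
left-pads its first argument with the given pad character up to the given
width, a minimal equivalent could be written as:

```perl
use strict;
use warnings;

# Assumed behavior of Lpad: left-pad $Value with $Pad up to $Width characters.
sub Lpad
{
    my ($Value, $Pad, $Width) = @_;
    $Value = $Pad . $Value while length($Value) < $Width;
    return $Value;
}

print Lpad(5,'0',2), "\n";   # prints "05"
```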

5. You will probably want to write your own Perl subroutine to actually
perform the copying rather than using the shell-based copy command, as
the overhead of spawning a shell command may be unpredictable.

6. You might consider deleting each destination file first if it exists,
as there might be less overhead than copying over an existing file.
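
To illustrate points 5 and 6, here is a sketch of a Perl-based timed
copy using the core File::Copy and Time::HiRes modules. The temp-dir
paths are stand-ins for illustration; in practice the destination would
be a path on the remote share:

```perl
use strict;
use warnings;
use File::Copy;
use File::Temp qw(tempdir);
use Time::HiRes qw(time);

# Stand-in paths; in practice $Dest would point at the remote machine.
my $Dir    = tempdir(CLEANUP => 1);
my $Source = "$Dir/sample.bin";
my $Dest   = "$Dir/sample_copy.bin";

# Create a test payload so the sketch is self-contained.
open(my $FH, '>', $Source) or die "Cannot create $Source: $!";
print $FH 'x' x 65536;
close($FH);

unlink $Dest if -e $Dest;   # delete any existing copy first (point 6)

my $Start = time;
copy($Source, $Dest) or die "Copy failed: $!";   # no shell spawned (point 5)
my $Elapsed = time - $Start;

my $Size = -s $Source;
my $Rate = $Elapsed > 0 ? ($Size / $Elapsed) / 1024 : 0;
printf "%d bytes copied in %.6f seconds\n", $Size, $Elapsed;
```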

7. There may be other things that I haven't considered.

Dirk Bremer - Systems Programmer II - ESS/AMS - NISC Lake St. Louis MO -
USA Central Time Zone
636-755-2652 fax 636-755-2503

[EMAIL PROTECTED]
www.nisc.coop 
_________________________________________________________________________

        From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of
Eric Logeson
        Sent: Tuesday, May 24, 2005 10:28
        To: perl-win32-users@listserv.ActiveState.com
        Subject: Measure bandwidth
        
        
        Hello List
         
        I wanted to get some basic bandwidth statistics across a WAN
link.  The approach I think I am going to take is the following:
        Use a Perl script to read in a bunch of files, determine their
sizes, start a timer (not sure how to do this yet), copy the files
across the link (WinXP -> Win2003 server), stop the timer, and finally
write out the bytes/sec metric to a log file.  I planned on scheduling
the Perl script to run every hour or so for a week.
         
        Are there modules of interest for this project?  Any suggestions
on the approach?
         
         
        Thanks
        Eric


_______________________________________________
Perl-Win32-Users mailing list
Perl-Win32-Users@listserv.ActiveState.com
To unsubscribe: http://listserv.ActiveState.com/mailman/mysubs
