Merging Hashes

2004-02-26 Thread Akens, Anthony
Hi,
I'm new to hashes, and I've been playing around with the 
following for a while... I'm just not getting it.

I have two hashes, one containing data read in from a file, 
one with "current" data.  I'd like to merge the two, adding 
any new keys and values into the hash, and for any keys that 
exist in both, I'd like to append the values onto the values
for the same key in the original hash.

An example...

Hash 1 (from a file)

host1=>"760,760,759"
host2=>"765,760,760"
host5=>"130,200"

Hash 2 (Current)

host1=>"758"
host2=>"760"
host4=>"450"
host5=>"210"

I'd like the merged hash to be:

host1=>"760,760,759,758"
host2=>"765,760,760,760"
host4=>"450"
host5=>"130,200,210"


I'm using an example out of the perl cookbook, and trying to
modify it to do what I want, but it isn't working out...

The three hashes are: 
%host_list   (Hash 1 in the example)
%current_list   (Hash 2 in the example)
%final_list  (The result of the merge)

Here's the code I have so far for the merge (this is borrowed
heavily from the Perl Cookbook):

foreach $item ( \%host_list, \%current_list ) {
    while ( ($key, $value) = each %$item ) {
        if (exists $final_list{$key}) {
            $value .= ",$current_list{$key}";
        }
        $final_list{$key} = $value;
    }
}
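[Editor's sketch, using the sample data above: the bug is that the append reaches back into %current_list instead of appending to the value already stored in %final_list. A working version might look like this:]

```perl
use strict;
use warnings;

my %host_list    = ( host1 => "760,760,759", host2 => "765,760,760", host5 => "130,200" );
my %current_list = ( host1 => "758", host2 => "760", host4 => "450", host5 => "210" );
my %final_list;

foreach my $item ( \%host_list, \%current_list ) {
    while ( my ($key, $value) = each %$item ) {
        if ( exists $final_list{$key} ) {
            # Append the incoming value to what is already merged,
            # instead of reaching back into %current_list.
            $final_list{$key} .= ",$value";
        }
        else {
            $final_list{$key} = $value;
        }
    }
}

# %final_list now holds, e.g., host1 => "760,760,759,758"
```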


Can someone point me in the right direction?

- Tony

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
 




RE: Use of "System"

2004-01-20 Thread Akens, Anthony
Someone kindly pointed out that perldoc is a handy thing.

- Tony

-Original Message-
From: Akens, Anthony 
Sent: Tuesday, January 20, 2004 4:31 PM
To: [EMAIL PROTECTED]
Subject: RE: Use of "System"


Sorry all, I seem to be having problems with our company's 
chosen mail client (Outlook).  Trying this again, hopefully 
it will preserve my EOL characters this time.

- Tony



I'm attempting to use the following code on an AIX machine to 
monitor the error log (using the errpt command).

I'm sure parts of it are very ugly, but it's (mostly) working.

The part that isn't is the foreach loop.  What it's supposed to 
do is as follows:

#read in the "summary" error report, which is in the following 
#format - 
#IDENTIFIER TIMESTAMP  T C RESOURCE_NAME  DESCRIPTION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143504 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143504 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143504 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120142304 T O OPERATOR   OPERATOR NOTIFICATION

@errors =system("errpt |head -n $diff_t | tail -n $diff");

foreach $line (@errors){

#I only care about the first field.  There must be a better way!
  ($err_num, undef, undef, undef, undef, undef, undef) = split(" ", $line);
#Use the errpt command to "expand" the message.
  $message = system ("errpt -j $err_num -a");
#add the message to a variable for all of the messages
  $errlogl .= $message;
}

Instead, when I run the command with an & to put it in the 
background, the output of the first system line above (@errors=...) 
goes to standard out.
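[Editor's note, hedged: the likely culprit is that system() returns the command's exit status and lets the command's output go straight to the terminal; capturing text needs backticks (qx//) or a piped open. A sketch of the same loop pattern, with printf standing in for the AIX-only errpt command:]

```perl
use strict;
use warnings;

# Backticks capture STDOUT; in list context you get one line per element.
# printf stands in here for `errpt | head -n $diff_t | tail -n $diff`.
my @errors = `printf 'AA8AB241   0120143604 T O OPERATOR\nBB7XY123   0120143504 T O OPERATOR\n'`;

my @ids;
foreach my $line (@errors) {
    # Taking only the first element of split's list avoids the row of undefs.
    my ($err_num) = split ' ', $line;
    push @ids, $err_num;
}
# On the real box, each id would then be expanded with:
#   $errlogl .= `errpt -j $err_num -a`;
print "@ids\n";
```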

Any help, or comments on the rest of the script are appreciated.

-Tony

#!/usr/bin/perl -w

use strict;

# This script will run every [interval] and check the error log
# for new entries.  Upon finding them, it will send an email to
# administrators containing a message indicating the change
# in errlog status, as well as the offending lines.
my $lc = -1;  #last count
my $tc = -1; #This count
my $interval = 30;  #Interval in seconds
my $me = "Hardware error monitoring";
my $hostname;
my $os;
my $mailto = "root";
my $diff;
my $msg;
my $page_msg;
my $errlogl;
my $line;
my $err_num;
my @errors;
my $diff_t;
my $message;

open (UNAME, "uname -a |") or die "Couldn't fork: $!\n";
( $os, $hostname, undef, undef, undef ) = (split " ", <UNAME>);
close (UNAME);

system ("echo \"$me started.\nThis message goes to $mailto.\" | mail -s \"Errlog monitoring for $hostname\" $mailto");

while ( 1 ) {
   $tc=`errpt -dH,S,U,O | wc -l`;
   
   if ( $lc == -1 ) {
  $lc=$tc;
   }
   if ( $lc != $tc ) {
  $diff=$tc-$lc;
  $diff_t = $diff + 1;
  $msg="$diff new errors have been found on $hostname";
  $page_msg="$diff new errors have been found on $hostname";
  @errors =system("errpt |head -n $diff_t | tail -n $diff");
  foreach $line (@errors){
  ($err_num, undef, undef, undef, undef, undef, undef) = split(" ", $line);
  $message = system ("errpt -j $err_num -a");
  #$message = "Test message\n";
  $errlogl .= $message;
  }
  if ( $tc eq "0" ) {
$msg="$msg\n Errlog was cleared";
  }else{
 #system ("logger $msg");
 $msg=" $msg \n Errlog details below:\n $errlogl \n";
  }
  system ("echo \"$msg\" | mail -s \"Errlog status change on host $hostname\" $mailto");
   }
   $lc=$tc;
   $errlogl = "";
   sleep $interval;
}








RE: Use of "System"

2004-01-20 Thread Akens, Anthony
Sorry all, I seem to be having problems with our company's 
chosen mail client (Outlook).  Trying this again, hopefully 
it will preserve my EOL characters this time.

- Tony



I'm attempting to use the following code on an AIX machine to 
monitor the error log (using the errpt command).

I'm sure parts of it are very ugly, but it's (mostly) working.

The part that isn't is the foreach loop.  What it's supposed to 
do is as follows:

#read in the "summary" error report, which is in the following 
#format - 
#IDENTIFIER TIMESTAMP  T C RESOURCE_NAME  DESCRIPTION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143504 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143504 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143504 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120142304 T O OPERATOR   OPERATOR NOTIFICATION

@errors =system("errpt |head -n $diff_t | tail -n $diff");

foreach $line (@errors){

#I only care about the first field.  There must be a better way!
  ($err_num, undef, undef, undef, undef, undef, undef) = split(" ", $line);
#Use the errpt command to "expand" the message.
  $message = system ("errpt -j $err_num -a");
#add the message to a variable for all of the messages
  $errlogl .= $message;
}

Instead, when I run the command with an & to put it in the 
background, the output of the first system line above (@errors=...) 
goes to standard out.

Any help, or comments on the rest of the script are appreciated.

-Tony

#!/usr/bin/perl -w

use strict;

# This script will run every [interval] and check the error log
# for new entries.  Upon finding them, it will send an email to
# administrators containing a message indicating the change
# in errlog status, as well as the offending lines.
my $lc = -1;  #last count
my $tc = -1; #This count
my $interval = 30;  #Interval in seconds
my $me = "Hardware error monitoring";
my $hostname;
my $os;
my $mailto = "root";
my $diff;
my $msg;
my $page_msg;
my $errlogl;
my $line;
my $err_num;
my @errors;
my $diff_t;
my $message;

open (UNAME, "uname -a |") or die "Couldn't fork: $!\n";
( $os, $hostname, undef, undef, undef ) = (split " ", <UNAME>);
close (UNAME);

system ("echo \"$me started.\nThis message goes to $mailto.\" | mail -s \"Errlog monitoring for $hostname\" $mailto");

while ( 1 ) {
   $tc=`errpt -dH,S,U,O | wc -l`;
   
   if ( $lc == -1 ) {
  $lc=$tc;
   }
   if ( $lc != $tc ) {
  $diff=$tc-$lc;
  $diff_t = $diff + 1;
  $msg="$diff new errors have been found on $hostname";
  $page_msg="$diff new errors have been found on $hostname";
  @errors =system("errpt |head -n $diff_t | tail -n $diff");
  foreach $line (@errors){
  ($err_num, undef, undef, undef, undef, undef, undef) = split(" ", $line);
  $message = system ("errpt -j $err_num -a");
  #$message = "Test message\n";
  $errlogl .= $message;
  }
  if ( $tc eq "0" ) {
$msg="$msg\n Errlog was cleared";
  }else{
 #system ("logger $msg");
 $msg=" $msg \n Errlog details below:\n $errlogl \n";
  }
  system ("echo \"$msg\" | mail -s \"Errlog status change on host $hostname\" $mailto");
   }
   $lc=$tc;
   $errlogl = "";
   sleep $interval;
}

 




Use of "System"

2004-01-20 Thread Akens, Anthony
I'm attempting to use the following code on an AIX machine to
monitor the error log (using the errpt command).

I'm sure parts of it are very ugly, but it's (mostly) working.

The part that isn't is the foreach loop.  What it's supposed to
do is as follows:

#read in the "summary" error report, which is in the following format -
#IDENTIFIER TIMESTAMP  T C RESOURCE_NAME  DESCRIPTION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143604 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143504 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143504 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120143504 T O OPERATOR   OPERATOR NOTIFICATION
#AA8AB241   0120142304 T O OPERATOR   OPERATOR NOTIFICATION

@errors =system("errpt |head -n $diff_t | tail -n $diff");

foreach $line (@errors){
#I only care about the first field.  There must be a better way!
  ($err_num, undef, undef, undef, undef, undef, undef) = split(" ", $line);
#Use the errpt command to "expand" the message.
  $message = system ("errpt -j $err_num -a");
#add the message to a variable for all of the messages
  $errlogl .= $message;
}

Instead, when I run the command with an & to put it in the background,
the output of the first system line above (@errors=...) goes to standard out.

Any help, or comments on the rest of the script are appreciated.

-Tony


#!/usr/bin/perl -w

use strict;

# This script will run every [interval] and check the error log
# for new entries.  Upon finding them, it will send an email to
# administrators containing a message indicating the change
# in errlog status, as well as the offending lines.
my $lc = -1;  #last count
my $tc = -1; #This count
my $interval = 30;  #Interval in seconds
my $me = "Hardware error monitoring";
my $hostname;
my $os;
my $mailto = "root";
my $diff;
my $msg;
my $page_msg;
my $errlogl;
my $line;
my $err_num;
my @errors;
my $diff_t;
my $message;

open (UNAME, "uname -a |") or die "Couldn't fork: $!\n";
( $os, $hostname, undef, undef, undef ) = (split " ", <UNAME>);
close (UNAME);

system ("echo \"$me started.\nThis message goes to $mailto.\" | mail -s \"Errlog monitoring for $hostname\" $mailto");

while ( 1 ) {
   $tc=`errpt -dH,S,U,O | wc -l`;
   
   if ( $lc == -1 ) {
  $lc=$tc;
   }
   if ( $lc != $tc ) {
  $diff=$tc-$lc;
  $diff_t = $diff + 1;
  $msg="$diff new errors have been found on $hostname";
  $page_msg="$diff new errors have been found on $hostname";
  @errors =system("errpt |head -n $diff_t | tail -n $diff");
  foreach $line (@errors){
  ($err_num, undef, undef, undef, undef, undef, undef) = split(" ", $line);
  $message = system ("errpt -j $err_num -a");
  #$message = "Test message\n";
  $errlogl .= $message;
  }
  if ( $tc eq "0" ) {
$msg="$msg\n Errlog was cleared";
  }else{
 #system ("logger $msg");
 $msg=" $msg \n Errlog details below:\n $errlogl \n";
  }
  system ("echo \"$msg\" | mail -s \"Errlog status change on host $hostname\" $mailto");
   }
   $lc=$tc;
   $errlogl = "";
   sleep $interval;
}


 




RE: Microsoft Services for UNIX/LINUX

2004-01-06 Thread Akens, Anthony
Glad you've found a solution.  Just thought I'd drop a note about 
some of the things I've had success with.

We have a few scripts (originally written as shell scripts in csh)
that I've converted to perl.  They reside on a box running AIX 5.1.

On a win2k box I've installed the Windows services for unix 3.0 NFS
gateway, which allows client machines to map to windows shares,
which are just gateways to NFS exports on the AIX host.

The windows clients have activestate perl installed, and are
configured to map the NFS share on log in.  The end result is that
the users can double click on the perl apps without needing to know
they're even touching a *nix box.

 - Tony

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, January 06, 2004 11:17 AM
To: Kevin Old
Cc: [EMAIL PROTECTED]
Subject: Re: Microsoft Services for UNIX/LINUX






Hello,

Putty has this:
Plink (a command-line interface to the PuTTY back ends)


http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html


Thanks to all again.  I just came across some documentation for PuTTY,
and it provides an alternative to my original solution.  And it is
free!!!

___

William Ampeh (x3939)
Federal Reserve Board


 



 




RE: Timing several processes

2003-12-04 Thread Akens, Anthony
AIX 5.1, actually.  Though eventually linux, windows,
and possibly other OS's will be in the mix.

I'm writing this with the idea of it being very "modular"
in that each server will do its own "check" every 15
minutes or so, and that the webserver will only "connect"
and grab that data when someone goes to the page (using
a cgi to parse it up and display it in a hierarchical
fashion).  The server will access the data via NFS (the
NFS exports are already in place due to another project)

Writing to a file gives me a history of data should any
individual box go down.  (Especially the "webserver" in
this case, since it is periodically taken offline during
the course of any given week due to its role in the
overall project these machines run)

I also plan on the actual programs that are called to be
set up in a config file.  Something along the lines of:

APP1_HANDLE = "sar"
APP1_EXEC = "sar 5 5"
APP1_LOG = "/log/monitor/delta/sar.out"
APP1_PARSE = "/log/monitor/parse_sar.pl"
APP1_SUMMARY = "/log/monitor/delta/sar.summary"

I'm planning on figuring out how to put all of those into
a hash, and then using a foreach loop to exec each one, and
another foreach loop to wait for each to complete, and a
final foreach loop that runs the "parse" for each one and
generates a summary.  The summary files will all be in a
"standard" format that the webserver will use to generate
its display.
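[Editor's sketch, assuming the config format shown above: one way to load the APPn_* settings into a hash of hashes so each foreach pass (exec, wait, parse) can walk a single structure:]

```perl
use strict;
use warnings;

# Sample config, in the format sketched above.
my $config = <<'END_CONFIG';
APP1_HANDLE = "sar"
APP1_EXEC = "sar 5 5"
APP1_LOG = "/log/monitor/delta/sar.out"
APP1_PARSE = "/log/monitor/parse_sar.pl"
APP1_SUMMARY = "/log/monitor/delta/sar.summary"
END_CONFIG

# Group APPn_* keys into a hash of hashes keyed by "APPn".
my %apps;
foreach my $line ( split /\n/, $config ) {
    next unless $line =~ /^(APP\d+)_(\w+)\s*=\s*"([^"]*)"/;
    $apps{$1}{$2} = $3;
}

# Each app can then be driven from one loop:
foreach my $app ( sort keys %apps ) {
    print "$app: run '$apps{$app}{EXEC}', log to $apps{$app}{LOG}\n";
}
```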

I had not thought of stderr from the commands, so
you're right that catching it is something I need to
think on.

The other bit I'm working on is to make an "options"
file for each server (generated before the summaries
are parsed) that contains info like number of processors,
and tuning options (set with schedtune and vmtune, etc)
that can be used by the "APP_PARSE" scripts in calculating 
results.  (Thrashing detection, etc).

I know, it's all a little complex, but the individual
pieces are fairly simple in design.  And in the end it
will meet the goal that management put forth, which is
what keeps me paid :)

-Tony

-Original Message-
From: drieux [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, December 03, 2003 7:24 PM
To: Perl Perl
Subject: Re: Timing several processes



On Dec 3, 2003, at 10:49 AM, Akens, Anthony wrote:
[..]
> print "Running vmstat\n";
> defined(my $vmstat_pid = fork) or die "Cannot fork: $!"; unless 
> ($vmstat_pid) {
>   exec "vmstat 5 5 > /log/monitor/delta/vmstat.out";
>   die "cannot exec vmstat: $!";
> }
> print "Running sar\n";
> defined(my $sar_pid = fork) or die "Cannot fork: $!";
> unless ($sar_pid) {
>   exec "sar 5 5 > /log/monitor/delta/sar.out";
>   die "cannot exec date: $!";
> }
> print "Waiting...\n";
> waitpid($vmstat_pid, 0);
> waitpid($sar_pid, 0);
> print "done!\n";
[..]

I presume you are working on a solaris box?
have you thought about

timex sar 5 5
timex vmstat 5 5

and you will notice that the sar command will
take about 25 seconds and the vmstat about 20.

but then there is that minor nit about

exec "vmstat 5 5 > /log/monitor/delta/vmstat.out"
or  die "cannot exec vmstat: $!";

since in theory exec WILL not return, so if it failed
why not keep it in the proper context...

Then there is that Minor Nit about not controlling 'stderr' which can
lead to things like:

vladimir: 60:] ./st*.plx
Running vmstat
Running sar
sh: /log/monitor/delta/vmstat.out: cannot create
sh: /log/monitor/delta/sar.out: cannot create
Waiting...
done!
vladimir: 61:]

So while you are in the process of learning
fork() and exec() why not think a bit more
agressively and go with say a pipe to pass
back the information so as not to buy
the IO overhead of writing to files?
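[Editor's sketch of the piped-open approach drieux suggests, with date standing in for vmstat or sar:]

```perl
use strict;
use warnings;

# The three-argument "-|" open forks a child whose STDOUT becomes the
# parent's read end; no intermediate file is ever written.
open(my $child, "-|", "date") or die "Cannot fork: $!";
my @output = <$child>;
close($child) or warn "child exited with status $?";

print "captured ", scalar(@output), " line(s)\n";
```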

While the following was written for a command
line 'let us get interactive' type of solution,
it might be a framework you could rip off
and use:

<http://www.wetware.com/drieux/pbl/Sys/gen_sym_big_dog.txt>



ciao
drieux

---







RE: Timing several processes

2003-12-03 Thread Akens, Anthony
It seems as if my example was doing what it was intended to do, the
"problem" was in the exec for sar...

It seems that using redirection with the sar command is a little
different than I expected... it does not write anything to the
output file until it is finished.

As for the "Waiting..." line not showing up, when I had done my
testing I had left out the \n - which caused it not to print until
later.  Once I added the \n (as you see in the code below) it printed
when I expected it to.

Thanks for the help...

-Tony

-Original Message-
From: Akens, Anthony 
Sent: Wednesday, December 03, 2003 1:50 PM
To: Wiggins d Anconia; Tom Kinzer; [EMAIL PROTECTED]
Subject: RE: Timing several processes


Here's the sample code I'm trying...  In essence I would expect to see
the following output:

Running vmstat
Running sar
Waiting...  (at this point a long wait while sar and vmstat finish)
Done!


Instead I am seeing:
Running vmstat
Running sar  (The long wait is here)
Waiting...
Done!


In watching the file sizes, I can see both files are created at the same
time, but sar does not produce any output in its file until vmstat
finishes.

-Tony

#!/usr/bin/perl -w

use strict;

print "Running vmstat\n";
defined(my $vmstat_pid = fork) or die "Cannot fork: $!";
unless ($vmstat_pid) {
  exec "vmstat 5 5 > /log/monitor/delta/vmstat.out";
  die "cannot exec vmstat: $!";
}
print "Running sar\n";
defined(my $sar_pid = fork) or die "Cannot fork: $!";
unless ($sar_pid) {
  exec "sar 5 5 > /log/monitor/delta/sar.out";
  die "cannot exec date: $!";
}
print "Waiting...\n";
waitpid($vmstat_pid, 0);
waitpid($sar_pid, 0);
print "done!\n";

-Original Message-
From: Wiggins d Anconia [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, December 03, 2003 1:31 PM
To: Akens, Anthony; Tom Kinzer; [EMAIL PROTECTED]
Subject: RE: Timing several processes


I was going to suggest POE as well, 'til I saw that little word 'simple'
:-)...

Have you read:

perldoc perlipc
perldoc -f fork
perldoc -f wait
perldoc -f waitpid

Of course POE is what makes keeping track of all those spun off
processes trivial, but learning it is I will admit not trivial...

http://danconia.org

> I already have some ideas for how I want to build the page, how to 
> parse the data I will generate, etc.
> 
> As I said, I've looked at some of the other tools out there, and want 
> to stick to some simple perl code to parse out the information and 
> return the results.
> 
> The only bit I'm not sure of is how to tell if all forked processes 
> have completed before moving on.
> 
> 
> -Tony
> 
> -Original Message-
> From: Tom Kinzer [mailto:[EMAIL PROTECTED]
> Sent: Wednesday, December 03, 2003 12:35 PM
> To: [EMAIL PROTECTED]
> Subject: RE: Timing several processes
> 
> 
>  http://poe.perl.org
> 
> Maybe this would be a good job for POE?
> 
> -Tom Kinzer
> 
> 
> -Original Message-
> From: Akens, Anthony [mailto:[EMAIL PROTECTED]
> Sent: Wednesday, December 03, 2003 7:49 AM
> To: [EMAIL PROTECTED]
> Subject: Timing several processes
> 
> 
> Hi all!
> 
> I'm wanting to write a simple web-based tool to see the status of
> several servers at a glance.  I know there are many solutions 
> existing, but I can't learn as much about perl by just using one of 
> those as I can by writing my own.  The first step I want to do is call

> a script from cron that runs several basic monitoring tools (sar,
> vmstat, df, iostat, etc) and saves the output of each to a file. Then 
> I'd parse those files up, and write a summary file.
> 
> Easy enough.  And I could certainly do it by calling the tools
> one at a time.  However, I'd like to get roughly 1 minute of vmstat, 
> iostat, and sar output simultaneously.  So I'm supposing I'd want 
> to fork off each process, and then when those are all done come back 
> and run a script that then parses those results out for the individual

> statistics I'm looking for.
> 
> I've never used fork before, and while it looks fairly straight
> forward what I am not sure of is how to make sure all of those forked 
> processes have completed before moving on and parsing the files.
> 
> Any pointers?
> 
> Thanks in advance
> 
> -Tony
> 









RE: Timing several processes

2003-12-03 Thread Akens, Anthony
Here's the sample code I'm trying...  In essence I would expect to see
the following output:

Running vmstat
Running sar
Waiting...  (at this point a long wait while sar and vmstat finish)
Done!


Instead I am seeing:
Running vmstat
Running sar  (The long wait is here)
Waiting...
Done!


In watching the file sizes, I can see both files are created at the
same time, but sar does not produce any output in its file until
vmstat finishes.

-Tony

#!/usr/bin/perl -w

use strict;

print "Running vmstat\n";
defined(my $vmstat_pid = fork) or die "Cannot fork: $!";
unless ($vmstat_pid) {
  exec "vmstat 5 5 > /log/monitor/delta/vmstat.out";
  die "cannot exec vmstat: $!";
}
print "Running sar\n";
defined(my $sar_pid = fork) or die "Cannot fork: $!";
unless ($sar_pid) {
  exec "sar 5 5 > /log/monitor/delta/sar.out";
  die "cannot exec date: $!";
}
print "Waiting...\n";
waitpid($vmstat_pid, 0);
waitpid($sar_pid, 0);
print "done!\n";

-Original Message-
From: Wiggins d Anconia [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, December 03, 2003 1:31 PM
To: Akens, Anthony; Tom Kinzer; [EMAIL PROTECTED]
Subject: RE: Timing several processes


I was going to suggest POE as well, 'til I saw that little word 'simple'
:-)...

Have you read:

perldoc perlipc
perldoc -f fork
perldoc -f wait
perldoc -f waitpid

Of course POE is what makes keeping track of all those spun off
processes trivial, but learning it is I will admit not trivial...

http://danconia.org

> I already have some ideas for how I want to build the page, how
> to parse the data I will generate, etc.
> 
> As I said, I've looked at some of the other tools out there,
> and want to stick to some simple perl code to parse out the 
> information and return the results.
> 
> The only bit I'm not sure of is how to tell if all forked processes
> have completed before moving on.
> 
> 
> -Tony
> 
> -Original Message-
> From: Tom Kinzer [mailto:[EMAIL PROTECTED]
> Sent: Wednesday, December 03, 2003 12:35 PM
> To: [EMAIL PROTECTED]
> Subject: RE: Timing several processes
> 
> 
>  http://poe.perl.org
> 
> Maybe this would be a good job for POE?
> 
> -Tom Kinzer
> 
> 
> -Original Message-
> From: Akens, Anthony [mailto:[EMAIL PROTECTED]
> Sent: Wednesday, December 03, 2003 7:49 AM
> To: [EMAIL PROTECTED]
> Subject: Timing several processes
> 
> 
> Hi all!
> 
> I'm wanting to write a simple web-based tool to see the status of 
> several servers at a glance.  I know there are many solutions 
> existing, but I can't learn as much about perl by just using one of 
> those as I can by writing my own.  The first step I want to do is call

> a script from cron that runs several basic monitoring tools (sar, 
> vmstat, df, iostat, etc) and saves the output of each to a file. Then 
> I'd parse those files up, and write a summary file.
> 
> Easy enough.  And I could certainly do it by calling the tools 
> one at a time.  However, I'd like to get roughly 1 minute of vmstat, 
> iostat, and sar output simultaneously.  So I'm supposing I'd want 
> to fork off each process, and then when those are all done come back 
> and run a script that then parses those results out for the individual

> statistics I'm looking for.
> 
> I've never used fork before, and while it looks fairly straight 
> forward what I am not sure of is how to make sure all of those forked 
> processes have completed before moving on and parsing the files.
> 
> Any pointers?
> 
> Thanks in advance
> 
> -Tony
> 







RE: Timing several processes

2003-12-03 Thread Akens, Anthony
To clarify, I found the following in "Learning Perl".

It doesn't quite make sense to me, though.  Perhaps someone
can clarify what's going on in this?

defined(my $pid = fork) or die "Cannot fork: $!";
unless ($pid) {
  # Child process is here
  exec "date";
  die "cannot exec date: $!";
}
# Parent process is here
waitpid($pid, 0);


It seems I could use that code, modified to start (5) processes,
and a waitpid() for each of those (5).  It doesn't matter
which processes would finish first.  (I put the 5 in parentheses
because right now I'm looking at using 5 different tools that
I want to monitor.)
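[Editor's sketch of exactly that generalization, with sleep standing in for the five monitoring tools: fork one child per command, remember each pid, then waitpid for every pid; the order they finish in doesn't matter.]

```perl
use strict;
use warnings;

my @commands = ("sleep 1", "sleep 1", "sleep 1");  # stand-ins for sar, vmstat, ...
my @pids;

foreach my $cmd (@commands) {
    defined(my $pid = fork) or die "Cannot fork: $!";
    unless ($pid) {
        # Child process is here.
        exec $cmd;
        die "cannot exec $cmd: $!";
    }
    push @pids, $pid;    # parent remembers each child
}

# Parent blocks here until every child has exited, in whatever order.
waitpid($_, 0) for @pids;
print "all ", scalar(@pids), " processes finished\n";
```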

-Tony



-Original Message-
From: Akens, Anthony 
Sent: Wednesday, December 03, 2003 12:58 PM
To: Tom Kinzer; [EMAIL PROTECTED]
Subject: RE: Timing several processes


I already have some ideas for how I want to build the page, how 
to parse the data I will generate, etc.

As I said, I've looked at some of the other tools out there, 
and want to stick to some simple perl code to parse out the 
information and return the results.

The only bit I'm not sure of is how to tell if all forked processes 
have completed before moving on.


-Tony

-Original Message-
From: Tom Kinzer [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, December 03, 2003 12:35 PM
To: [EMAIL PROTECTED]
Subject: RE: Timing several processes


 http://poe.perl.org 

Maybe this would be a good job for POE?

-Tom Kinzer


-Original Message-
From: Akens, Anthony [mailto:[EMAIL PROTECTED]
Sent: Wednesday, December 03, 2003 7:49 AM
To: [EMAIL PROTECTED]
Subject: Timing several processes


Hi all!

I'm wanting to write a simple web-based tool to see the status of
several servers at a glance.  I know there are many solutions existing,
but I can't learn as much about perl by just using one 
of those as I can by writing my own.  The first step I want to do 
is call a script from cron that runs several basic monitoring tools 
(sar, vmstat, df, iostat, etc) and saves the output of each to a 
file. Then I'd parse those files up, and write a summary file.

Easy enough.  And I could certainly do it by calling the tools one
at a time.  However, I'd like to get roughly 1 minute of vmstat, 
iostat, and sar output simultaneously.  So I'm supposing I'd 
want to fork off each process, and then when those are all done 
come back and run a script that then parses those results out for 
the individual statistics I'm looking for.

I've never used fork before, and while it looks fairly straight forward
what I am not sure of is how to make sure all of those forked 
processes have completed before moving on and parsing the files.

Any pointers?

Thanks in advance

-Tony










RE: Timing several processes

2003-12-03 Thread Akens, Anthony
I already have some ideas for how I want to build the page, how 
to parse the data I will generate, etc.

As I said, I've looked at some of the other tools out there, 
and want to stick to some simple perl code to parse out the 
information and return the results.

The only bit I'm not sure of is how to tell if all forked processes 
have completed before moving on.


-Tony

-Original Message-
From: Tom Kinzer [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, December 03, 2003 12:35 PM
To: [EMAIL PROTECTED]
Subject: RE: Timing several processes


 http://poe.perl.org 

Maybe this would be a good job for POE?

-Tom Kinzer


-Original Message-
From: Akens, Anthony [mailto:[EMAIL PROTECTED]
Sent: Wednesday, December 03, 2003 7:49 AM
To: [EMAIL PROTECTED]
Subject: Timing several processes


Hi all!

I'm wanting to write a simple web-based tool to see the status of
several servers at a glance.  I know there are many solutions existing,
but I can't learn as much about perl by just using one 
of those as I can by writing my own.  The first step I want to do 
is call a script from cron that runs several basic monitoring tools 
(sar, vmstat, df, iostat, etc) and saves the output of each to a 
file. Then I'd parse those files up, and write a summary file.

Easy enough.  And I could certainly do it by calling the tools one
at a time.  However, I'd like to get roughly 1 minute of vmstat, 
iostat, and sar output simultaneously.  So I'm supposing I'd 
want to fork off each process, and then when those are all done 
come back and run a script that then parses those results out for 
the individual statistics I'm looking for.

I've never used fork before, and while it looks fairly straight forward
what I am not sure of is how to make sure all of those forked 
processes have completed before moving on and parsing the files.

Any pointers?

Thanks in advance

-Tony








Timing several processes

2003-12-03 Thread Akens, Anthony
Hi all!

I'm wanting to write a simple web-based tool to see the status
of several servers at a glance.  I know there are many solutions
existing, but I can't learn as much about perl by just using one 
of those as I can by writing my own.  The first step I want to do 
is call a script from cron that runs several basic monitoring tools 
(sar, vmstat, df, iostat, etc) and saves the output of each to a 
file. Then I'd parse those files up, and write a summary file.

Easy enough.  And I could certainly do it by calling the tools
one at a time.  However, I'd like to get roughly 1 minute of vmstat, 
iostat, and sar output simultaneously.  So I'm supposing I'd 
want to fork off each process, and then when those are all done 
come back and run a script that then parses those results out for 
the individual statistics I'm looking for.

I've never used fork before, and while it looks fairly straightforward,
what I am not sure of is how to make sure all of those forked 
processes have completed before moving on and parsing the files.
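One common pattern, sketched below with harmless `sleep 1` commands standing in for the real sar/vmstat/iostat invocations: fork one child per command, remember each child's pid, and waitpid on every one of them before moving on to the parsing stage.

```perl
use strict;
use warnings;

# Stand-in commands; the real script would run e.g. "vmstat 1 60 > /tmp/vmstat.out".
my @commands = ("sleep 1", "sleep 1", "sleep 1");

my @pids;
for my $cmd (@commands) {
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: run one collector, then exit with its status.
        exec $cmd or die "exec '$cmd' failed: $!";
    }
    push @pids, $pid;    # parent remembers each child's pid
}

# Block until every child has finished before parsing their output files.
waitpid($_, 0) for @pids;
print "all collectors finished\n";
```

Because the parent only continues after the last waitpid returns, every output file is complete by the time the parsing code runs.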

Any pointers?

Thanks in advance

-Tony

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: OFF TOPIC: Unix in a Nutshell Orielly 3rd edition

2003-11-28 Thread Akens, Anthony
I'm sorry, small correction.  The Unix CD Bookshelf does not contain
"Essential System Administration".

It has:

Unix Power Tools, 3rd Edition 
Learning the Unix Operating System, 5th Edition 
Learning the vi Editor, 6th Edition 
Mac OS X for Unix Geeks 
Learning the Korn Shell, 2nd Edition 
sed & awk, 2nd Edition 
Unix in a Nutshell, 3rd Edition

-Original Message-
From: Akens, Anthony 
Sent: Friday, November 28, 2003 2:43 PM
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: RE: OFF TOPIC: Unix in a Nutshell Orielly 3rd edition


Sorry to hear you're running SCO.  My sincerest sympathies.

As for Unix books, the O'Reilly "Unix in a Nutshell" is a good book.

I don't consider it quite as indispensable as "Essential System
Administration", 
but that book assumes you're a little more fluent in Unix.

So if you're a user on the system, I'd think Unix in a Nutshell would be
fine.  
If you're administering it you might want both books, or maybe the Unix
CD 
Bookshelf, which has both in it.

*Note: I'm not an O'Reilly salesman, I just really like their books.

Tony

-Original Message-
From: Paul Kraus [mailto:[EMAIL PROTECTED] 
Sent: Friday, November 28, 2003 2:01 PM
To: [EMAIL PROTECTED]
Subject: OFF TOPIC: Unix in a Nutshell Orielly 3rd edition


I need to beef up on my UNIX skills.  Our major server is running SCO
OpenServer.

Will this book benefit me, or is there another I should look at?

Not on topic and I apologize but beyond perl the list seems to have many
UNIX 
enthusiasts.

Paul


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]





RE: OFF TOPIC: Unix in a Nutshell Orielly 3rd edition

2003-11-28 Thread Akens, Anthony
Sorry to hear you're running SCO.  My sincerest sympathies.

As for Unix books, the O'Reilly "Unix in a Nutshell" is a good book.

I don't consider it quite as indispensable as "Essential System
Administration", 
but that book assumes you're a little more fluent in Unix.

So if you're a user on the system, I'd think Unix in a Nutshell would be
fine.  
If you're administering it you might want both books, or maybe the Unix
CD 
Bookshelf, which has both in it.

*Note: I'm not an O'Reilly salesman, I just really like their books.

Tony

-Original Message-
From: Paul Kraus [mailto:[EMAIL PROTECTED] 
Sent: Friday, November 28, 2003 2:01 PM
To: [EMAIL PROTECTED]
Subject: OFF TOPIC: Unix in a Nutshell Orielly 3rd edition


I need to beef up on my UNIX skills.  Our major server is running SCO
OpenServer.

Will this book benefit me, or is there another I should look at?

Not on topic and I apologize but beyond perl the list seems to have many
UNIX 
enthusiasts.

Paul


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]





RE: Perl Vs ...

2003-09-22 Thread Akens, Anthony
I first learned C, then C++, which was a good route for me.  It gave
me a solid background in programming and structure that lends itself
well to learning other languages.  I've begun to pick up perl in order
to aid in system administration, though I've been told that Python is
a great tool for that task, also.

As many have said, it depends a lot on what you want to accomplish.
Look at what you want to do, and it will often dictate the road to
take to get there.


Tony

-Original Message-
From: Dan Anderson [mailto:[EMAIL PROTECTED]
Sent: Monday, September 22, 2003 10:52 AM
To: Paul Kraus
Cc: 'perl beginners'
Subject: Re: Perl Vs ...


> How do you go about deciding if you
> should use another tool such as C++ over perl? 

You basically look at the advantages and disadvantages of different
languages and decide which will be best for the task.  For instance, if
you wanted to write a quick script to generate a Template file and fill
in some values (possibly by querying the user) it would be a nightmare
to implement in C++ compared with Perl.
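A minimal sketch of the kind of template fill-in being described; the `[% key %]` placeholder syntax and the sample values are invented for illustration:

```perl
use strict;
use warnings;

# Values that would normally come from querying the user.
my %values = (name => "Tony", host => "egh-aakens");

# A template string with invented [% key %] placeholders.
my $template = "Hello [% name %], your workstation is [% host %].\n";

# One substitution fills in every placeholder from the hash.
(my $output = $template) =~ s/\[%\s*(\w+)\s*%\]/$values{$1}/g;
print $output;    # Hello Tony, your workstation is egh-aakens.
```

The same three lines of substitution logic would take considerably more machinery in C++, which is the point being made.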

At the same time, try programming a game making use of 3D graphics and
hardware acceleration in Perl...  It's probably not something you would
want to do.  You would want to use C or C++. 

So it all depends on what you are interested in and what you want to
do.  Do you want to do web development?  You could write a CGI program
in C++, but probably wouldn't want to (Perl or PHP would be a better
choice).  Do you want to create games and programs that a user can
interact with from a web page?  Better check out Java or .NET -- because
a CGI script is done executing when a web page is displayed.

The list goes on and on.  C++ is a bad choice to use if you want
portability without headache, Java would be a better choice (although
this is not entirely true, Java can have its own quirks as well).

So look and see what you want to do.  And find a language based on that

-Dan


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]





RE: TK

2003-09-19 Thread Akens, Anthony
I think that TK is included in AS perl for windows.  I copied
Paul privately with a sample Perl TK script to try, so he'll know
if he has it.  (I was surprised to find that I did!)

Didn't want to copy it on to the list, it's a bit large (20k) to
flood the list with. (It's the game "Same Game" done in TK, I got
it from the perlmonks site)

Tony

-Original Message-
From: Paul Kraus [mailto:[EMAIL PROTECTED]
Sent: Friday, September 19, 2003 11:18 AM
To: 'Dan Anderson'; 'zentara'
Cc: [EMAIL PROTECTED]
Subject: RE: TK


First off, I never even knew about Safari until today :(  Talk about the money
I could have saved.  At any rate...

I can't seem to find tk via ppm for AS perl. Any suggestions on where to
get it? I can't really compile on w32. I am missing the c libraries and
a compiler. 

Am I able to write a network app say on one of my Linux servers? Then
provide a gui that would run on the w32 workstations? Do I have to
install perl / tk on all of the workstations to do this?

Paul

-Original Message-
From: Dan Anderson [mailto:[EMAIL PROTECTED] 
Sent: Friday, September 19, 2003 11:33 AM
To: zentara
Cc: [EMAIL PROTECTED]
Subject: Re: TK


> 1. Get the book Mastering Perl/Tk.  It is very good. You can get
> it on Safari if you are in a hurry or want an online version.

Get it on safari no matter what.  $15 a month for 10 books is so much
better then shelling out $50 a computer book every time you need to
learn some new tricks.  safari.oreilly.com <-- Check it out.

Heck, there's even a free 14 day trial, so you could try it and print
out up to 10 books and leave.  And that wouldn't be very hard if you
wrote a small script to parse the HTML and throw it into a file (i.e.
strip out the O'reilly tables).

-Dan


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]





RE: Matching a section of text

2003-09-08 Thread Akens, Anthony
Thanks Bob, I'm going to file this away in
my nifty perl code file.

-Original Message-
From: Bob Showalter [mailto:[EMAIL PROTECTED]
Sent: Monday, September 08, 2003 10:29 AM
To: Akens, Anthony; [EMAIL PROTECTED]
Subject: RE: Matching a section of text


Akens, Anthony wrote:
> Sorry for the first post, didn't mean this as
> a reply.
> 
> 
> Hello all...
> 
> I'm wanting to write a script that scans a file,
> ignoring all lines until it reaches a certain
> section, then processes all lines in that
> section that are not comments, until it reaches
> the end of that section.
> 
> The section would be designated like this:
> 
> ## Begin Processing ##
> 
> ## End Processing ##
> 
> So I open my file, and skip all lines until I
> see that first bit...  Then do stuff until I
> see the last bit.  Can someone help me out with
> this?
> 
> Tony
> 
> 
> #!/usr/bin/perl -w
> 
> use strict;
> 
> open (FILE, "< template.txt") or die "Could not open Template. ($!)";
> 
> while ($line = <FILE>) {
>   #skip until beginning
> 
>   #End when "End Processing" is reached
>   last if $line =~ "End Processing";
> 
>   #Process the lines in between
>   next if $line =~ /^#/;
>   #do stuff
> }

The range operator in scalar context can handle this kind of thing. See
perldoc perlop and search for "Range Operators".

   while (<FILE>) {
  if (/^## Begin/ .. /^## End/) {
 ...do stuff here with $_
  }
   }
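A runnable sketch of that suggestion, with an in-memory list standing in for the file; note that the marker lines themselves start with `#`, so the comment check also skips them:

```perl
use strict;
use warnings;

my @lines = (
    "ignore me\n",
    "## Begin Processing ##\n",
    "keep this\n",
    "# a comment\n",
    "keep that\n",
    "## End Processing ##\n",
    "ignore me too\n",
);

my @kept;
for (@lines) {
    # Scalar-context ".." is the flip-flop: false until the left regex
    # matches, then true until the right regex matches.
    if (/^## Begin/ .. /^## End/) {
        next if /^#/;       # skips comments and both marker lines
        push @kept, $_;
    }
}
print @kept;    # keep this / keep that
```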

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Matching a section of text

2003-09-08 Thread Akens, Anthony
Sorry for the first post, didn't mean this as
a reply.


Hello all...

I'm wanting to write a script that scans a file,
ignoring all lines until it reaches a certain
section, then processes all lines in that
section that are not comments, until it reaches
the end of that section.

The section would be designated like this:

## Begin Processing ##

## End Processing ##

So I open my file, and skip all lines until I
see that first bit...  Then do stuff until I
see the last bit.  Can someone help me out with 
this?

Tony


#!/usr/bin/perl -w

use strict;

open (FILE, "< template.txt") or die "Could not open Template. ($!)";

while ($line = <FILE>) {
#skip until beginning

#End when "End Processing" is reached
last if $line =~ "End Processing";

#Process the lines in between
next if $line =~ /^#/;
#do stuff
}

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: Matching quoted text question...

2003-09-08 Thread Akens, Anthony
Hello all...

I'm wanting to write a script that scans a file,
ignoring all lines until it reaches a certain
section, then processes all lines in that
section that are not comments, until it reaches
the end of that section.

The section would be designated like this:

## Begin Processing ##

## End Processing ##

So I open my file, and skip all lines until I
see that first bit...  Then do stuff until I
see the last bit.  Can someone help me out with 
this?

Tony


#!/usr/bin/perl -w

use strict;

open (FILE, "< template.txt") or die "Could not open Template. ($!)";

while ($line = <FILE>) {
#skip until beginning

#End when "End Processing" is reached
last if $line =~ "End Processing";

#Process the lines in between
next if $line =~ /^#/;
#do stuff
}

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: Thanks and another quick Q, how to unconcatenate...

2003-09-04 Thread Akens, Anthony
Something like this will skip all files with _nice at the end...

   next if $file =~ /_nice$/;
   unlink ($file) or die "Couldn't delete $file: $!";


(I think that would work.  Untested)
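A runnable version of the same idea, exercised against a throwaway directory built with File::Temp; the file names here are invented for the demo:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Throwaway directory: delete everything in it except files ending in "_nice".
my $dir = tempdir(CLEANUP => 1);
for my $name ('a.idx', 'a.idx_nice', 'b.idx') {
    open my $fh, '>', "$dir/$name" or die "create $name: $!";
    close $fh;
}

opendir my $dh, $dir or die "opendir $dir: $!";
for my $file (readdir $dh) {
    next if $file =~ /^\.\.?$/;     # skip . and ..
    next if $file =~ /_nice$/;      # keep the cleaned-up copies
    unlink "$dir/$file" or die "Couldn't delete $dir/$file: $!";
}
closedir $dh;

# Show what survived the sweep.
opendir $dh, $dir or die "opendir $dir: $!";
my @left = grep !/^\.\.?$/, readdir $dh;
closedir $dh;
print "@left\n";    # a.idx_nice
```

Note that unlink is given the full path; readdir returns bare names, so deleting relative to the current directory would miss.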

Tony


-Original Message-
From: LoneWolf [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 04, 2003 2:23 PM
To: [EMAIL PROTECTED]
Subject: Thanks and another quick Q, how to unconcatenate...


Thanks for everyone's help with this one, I was stuck and knew I was missing
something simple..  UGH.  perldoc -q replace didn't turn me up with anything
either, which was a bummer, but going off the code posted here I was able to
do more with it.

This is what I used:

sub cleanup{

use strict;

my $dirname = "/home/data";
my $file;
my $newfile;
my $line;

opendir (DIR, $dirname) or die "Can't opendir $dirname: $!"; while
(defined($file = readdir(DIR))) {
next if $file =~ /^\.\.?$/;
open (OLDFILE, "< $file");
$newfile = $file . "_nice";
open (NEWFILE, "> $newfile");
while ($line = <OLDFILE>)  {
#$line = $line =~ /^\s*(.*)\s*\n$/;
$line =~ s/\s+/ /g;
$line =~ s/^ //;
$line =~ s/ $//;
$line =~ s/\t/|/g;
print NEWFILE "$line\n";
}
close OLDFILE;
close NEWFILE;
  }
}
--

I created a cleanup_dir subscript as well to handle removing the old files
but don't remember how to Unconcatenate the files:

---
sub cleanup_dir {
use strict;

my $dirname = "/home/data";
my $file;

opendir (DIR, $dirname) or die "Can't opendir $dirname: $!"; while
(defined($file = readdir(DIR))) {
 system 'chown', 'robert', $file;   # no quotes: '$file' would pass the literal string
 system 'chgrp', 'GCN', $file;
 }
 #remove all the files that are not _nice?

 system 'rm', '-f' , 's4.idx', 's4.idx_nice'; #removes unneeded files.
}
---




>On Thu, 04 Sep 2003 11:31:52 -0500 "Perry, Alan" <[EMAIL PROTECTED]>
wrote.
>On Thursday, September 04, 2003 11:11, Marshall, Stephen wrote:
>>
>>Got it working this way for the important line, but there's probably a
>slicker way of doing it.
>>
>>$line =~ s/(\s)+/ /g;
>>
>
>This will work, but may leave an extraneous space at the beginning and/or
>end of the line.
>
>This text:
>
>"   Test  text   with  lots of   extra spaces   "
>
>would get changed to:
>
>" Test text with lots of extra spaces "
>
>which may not be what you want.
>
>If you want to eliminate any starting or ending space and trim the rest of
>it down to single spaces, I would suggest this:
>
>$line =~ s/\s+/ /g;  # the parens you had are not necessary
>$line =~ s/^ //;  # removes any space from the beginning of the line
>$line =~ s/ $//;  # removes any space from the end of the line
>
>You could probably get fancier on the statements, but I prefer the
>simplicity of three separate statements.
>
>HTH,
>
>Alan
>
>-- 
>To unsubscribe, e-mail: [EMAIL PROTECTED]
>For additional commands, e-mail: [EMAIL PROTECTED]
>


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]





RE: Getting rid of white space...

2003-09-04 Thread Akens, Anthony
Try it like this...  I was a slacker and didn't have warnings
in place if it couldn't open the file...  Should be able
to just change the "/my/stuff/" to your directory and have it
work, I tested it here, and it seems to go fine.  (on an aix
box - maybe I have something in place that doesn't work on
windows or other platforms?)

Tony

#!/usr/bin/perl -w

use strict;
my $dirname = "/my/stuff/";
my $file;
my $newfile;
my $line;

opendir (DIR, $dirname) or die "Can't opendir $dirname: $!";

while (defined($file = readdir(DIR))) {
next if $file =~ /^\.\.?$/;
open (OLDFILE, "< $file") or die "Can't open $file: $!";
$newfile = $file . "_nice";
open (NEWFILE, "> $newfile") or die "Can't open $newfile: $!";
while ($line = <OLDFILE>)  {
($line) = ($line =~ /^\s*(.*)\s*\n$/);
print NEWFILE "$line\n";
}
close OLDFILE;
close NEWFILE;
}

-Original Message-----
From: Marshall, Stephen [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 04, 2003 11:11 AM
To: Marshall, Stephen; Akens, Anthony; '[EMAIL PROTECTED]';
'[EMAIL PROTECTED]'; '[EMAIL PROTECTED]'
Subject: RE: Getting rid of white space...


Got it working this way for the important line, but there's probably a slicker way of 
doing it.

$line =~ s/(\s)+/ /g;

> -Original Message-
> From: Marshall, Stephen 
> Sent: 04 September 2003 17:07
> To: 'Akens, Anthony'; [EMAIL PROTECTED]; 
> [EMAIL PROTECTED]; [EMAIL PROTECTED]
> Subject: RE: Getting rid of white space...
> 
> 
> I have a similar problem to this , does anyone have another 
> answer?  Don't know if its just me but The 
> 
> > use strict;
> > while(my $line = <>) {
> > ($line) = ($line =~ /^\s*(.*)\s*\n$/);
> > print($line."_nice\n");
> > }
> 
> Code doesn't work , and the enhanced version with the fancy 
> file handling doesn't work either "trying to read from a 
> closed filehandle error"
> 
> Stephen 
> 
> > -Original Message-
> > From: Akens, Anthony [mailto:[EMAIL PROTECTED]
> > Sent: 04 September 2003 15:55
> > To: [EMAIL PROTECTED]; [EMAIL PROTECTED]; 
> > [EMAIL PROTECTED]
> > Subject: RE: Getting rid of white space...
> > 
> > 
> > I figured I'd take a stab at fleshing this out into what he
> > wants... Any comments on things I could do better?  I only 
> > added to what Robert had coded...
> > 
> > Tony
> > 
> > 
> > 
> > 
> > #!/usr/bin/perl -w
> > 
> > use strict;
> > my $dirname = "/my/stuff/";
> > my $file;
> > my $newfile;
> > my $line;
> > 
> > opendir (DIR, $dirname) or die "Can't opendir $dirname: $!";
> > while (defined($file = readdir(DIR))) {
> > next if $file =~ /^\.\.?$/;
> > open (OLDFILE, "< $file");
> > $newfile = $file . "_nice";
> > open (NEWFILE, "> $newfile");
> > while ($line = <OLDFILE>)  {
> > ($line) = ($line =~ /^\s*(.*)\s*\n$/);
> > print NEWFILE "$line\n";
> > }
> > close OLDFILE;
> > close NEWFILE;
> > }
> > 
> > 
> > -Original Message-
> > From: [EMAIL PROTECTED]
> > [mailto:[EMAIL PROTECTED]
> > Sent: Thursday, September 04, 2003 8:32 AM
> > To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
> > Subject: RE: Getting rid of white space...
> > 
> > 
> > are you speaking of this?
> > 
> > use strict;
> > while(my $line = <>) {
> > ($line) = ($line =~ /^\s*(.*)\s*\n$/);
> > print($line."_nice\n");
> > }
> > 
> > -Original Message-
> > From: LoneWolf [mailto:[EMAIL PROTECTED]
> > Sent: Thursday, September 04, 2003 3:38 PM
> > To: [EMAIL PROTECTED]
> > Subject: Getting rid of white space...
> > 
> > 
> > I have about 12 files that I am pulling from a SCO box to a
> > RedHat box via FTP. 
> > The files from the SCO box are poorly formatted with 
> > extraneous whitespace (sometimes as much as 30 or more) 
> > before and after the text.  I need to parse all of the files 
> > I DL and put them into a new file with "_nice" added at the end.
> > 
> > The files are all pipe-delimited, so I don't have a problem
> > separating the fields, I just am not sure how to make it 
> > remove all extra whitespace.  It
> > needs to keep all Space in the fields "the
> > description   of   the
> >  file" should still be readable as "the description 
> > of the file"
> > 
> > Any help with code examples?  I have been looking through a
> > beginning book and my old code and have come up nil.
> > 
> > Thanks,
> > Robert
> > 
> > --
> > To unsubscribe, e-mail: [EMAIL PROTECTED]
> > For additional commands, e-mail: [EMAIL PROTECTED]
> > 
> > --
> > To unsubscribe, e-mail: [EMAIL PROTECTED]
> > For additional commands, e-mail: [EMAIL PROTECTED]
> > 
> > 
> > --
> > To unsubscribe, e-mail: [EMAIL PROTECTED]
> > For additional commands, e-mail: [EMAIL PROTECTED]
> > 
> 

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: Getting rid of white space...

2003-09-04 Thread Akens, Anthony
I figured I'd take a stab at fleshing this out into what he wants...
Any comments on things I could do better?  I only added to what
Robert had coded...

Tony




#!/usr/bin/perl -w

use strict;
my $dirname = "/my/stuff/";
my $file;
my $newfile;
my $line;

opendir (DIR, $dirname) or die "Can't opendir $dirname: $!";
while (defined($file = readdir(DIR))) {
next if $file =~ /^\.\.?$/;
open (OLDFILE, "< $file");
$newfile = $file . "_nice";
open (NEWFILE, "> $newfile");
while ($line = <OLDFILE>)  {
($line) = ($line =~ /^\s*(.*)\s*\n$/);
print NEWFILE "$line\n";
}
close OLDFILE;
close NEWFILE;
}


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 04, 2003 8:32 AM
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: RE: Getting rid of white space...


are you speaking of this?

use strict;
while(my $line = <>) {
($line) = ($line =~ /^\s*(.*)\s*\n$/);
print($line."_nice\n");
}

-Original Message-
From: LoneWolf [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 04, 2003 3:38 PM
To: [EMAIL PROTECTED]
Subject: Getting rid of white space...


I have about 12 files that I am pulling from a SCO box to a RedHat box via FTP. 
The files from the SCO box are poorly formatted with extraneous whitespace
(sometimes as much as 30 or more) before and after the text.  I need to
parse all of the files I DL and put them into a new file with "_nice" added
at the end.

The files are all pipe-delimited, so I don't have a problem separating the
fields, I just am not sure how to make it remove all extra whitespace.  It
needs to keep all spaces in the fields: "the    description   of   the
 file" should still be readable as "the description of the file"
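Since the fields are pipe-delimited, one approach is to split on the pipes, tidy each field separately, and join again; a sketch with made-up data:

```perl
use strict;
use warnings;

my $line = "   foo   |  the    description   of   the   file  |   bar ";

my @fields = split /\|/, $line;
for (@fields) {
    s/\s+/ /g;    # collapse runs of whitespace to a single space
    s/^ //;       # trim a leading space
    s/ $//;       # trim a trailing space
}
$line = join '|', @fields;

print "$line\n";    # foo|the description of the file|bar
```

Collapsing runs rather than deleting whitespace outright keeps the single spaces inside a description readable.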

Any help with code examples?  I have been looking through a beginning book
and my old code and have come up nil.

Thanks,
Robert

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]




RE: Getting rid of white space...

2003-09-04 Thread Akens, Anthony
I figured I'd take a stab at fleshing this out into what he wants...
Any comments on things I could do better?  I only added to what
Robert had coded...

Tony




#!/usr/bin/perl -w

use strict;
my $dirname = "/my/stuff/";
my $file;
my $newfile;
my $line;

opendir (DIR, $dirname) or die "Can't opendir $dirname: $!";
while (defined($file = readdir(DIR))) {
next if $file =~ /^\.\.?$/;
open (OLDFILE, "< $file");
$newfile = $file . "_nice";
open (NEWFILE, "> $newfile");
while ($line = <OLDFILE>)  {
($line) = ($line =~ /^\s*(.*)\s*\n$/);
print NEWFILE "$line\n";
}
close OLDFILE;
close NEWFILE;
}



-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 04, 2003 8:32 AM
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: RE: Getting rid of white space...


are you speaking of this?

use strict;
while(my $line = <>) {
($line) = ($line =~ /^\s*(.*)\s*\n$/);
print($line."_nice\n");
}

-Original Message-
From: LoneWolf [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 04, 2003 3:38 PM
To: [EMAIL PROTECTED]
Subject: Getting rid of white space...


I have about 12 files that I am pulling from a SCO box to a RedHat box via FTP. 
The files from the SCO box are poorly formatted with extraneous whitespace
(sometimes as much as 30 or more) before and after the text.  I need to
parse all of the files I DL and put them into a new file with "_nice" added
at the end.

The files are all pipe-delimited, so I don't have a problem separating the
fields, I just am not sure how to make it remove all extra whitespace.  It
needs to keep all spaces in the fields: "the    description   of   the
 file" should still be readable as "the description of the file"

Any help with code examples?  I have been looking through a beginning book
and my old code and have come up nil.

Thanks,
Robert

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]




RE: Upload local file to server

2003-08-29 Thread Akens, Anthony
FTP, NFS, Windows Shares, rcp, scp

It's a pretty open-ended question, depends on what OS
the server is running, what services, etc.

Tony

-Original Message-
From: Whippo, Ryan K [mailto:[EMAIL PROTECTED]
Sent: Friday, August 29, 2003 11:13 AM
To: [EMAIL PROTECTED]
Subject: Upload local file to server


Is there an easy way to upload a local file to a server?


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]





Graphing/Plotting over time

2003-08-27 Thread Akens, Anthony
Hello all,

Just wanted to look into a "for fun" project, after a
recent project that wasn't much fun at all...  Our
organization got hit by the blaster worm, which hit
many, many windows boxes.  The *nix boxes (which I
manage) were of course unaffected, except by the
total lack of bandwidth available to them.  Except for
one.  We have the syslog on our PIX firewall forward
on to one of my boxes, so I have an interesting, detailed
log of how the blaster worm spread on our network.

So much for the history, now on to some ideas...  I
thought it would be interesting to plot two things -
1) How many hits per minute, and 2) Total compromised
systems over time.

I thought of perl immediately as a good tool to break
this rather large file down, but being a newbie I'm
not sure how to begin.  The format of each line is
as follows (IPs changed to protect the lazy):

Aug 20 16:57:28 pix %PIX-3-106011: Deny inbound (No xlate) 
icmp src inside:10.0.0.10 dst inside:10.1.1.23 (type 8, code 0)


For the first bit I know I would need to just create a counter
for each minute, probably using a regex to increment the counter?

For the second I would need to count the source machine IPs, and
use a hash(?) to keep track of them, and when each first appears
in the logs, then plot that over time?
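A sketch along those lines, using one hash keyed by the minute for hit counts and another keyed by source IP for first appearance; the three sample lines are invented, in the format shown above:

```perl
use strict;
use warnings;

my @log = (
    'Aug 20 16:57:28 pix %PIX-3-106011: Deny inbound (No xlate) icmp src inside:10.0.0.10 dst inside:10.1.1.23 (type 8, code 0)',
    'Aug 20 16:57:41 pix %PIX-3-106011: Deny inbound (No xlate) icmp src inside:10.0.0.10 dst inside:10.1.1.24 (type 8, code 0)',
    'Aug 20 16:58:02 pix %PIX-3-106011: Deny inbound (No xlate) icmp src inside:10.0.0.11 dst inside:10.1.1.25 (type 8, code 0)',
);

my (%hits_per_min, %first_seen);
for my $line (@log) {
    # Capture the timestamp down to the minute, and the source IP.
    next unless $line =~ /^(\w+ +\d+ +\d+:\d+):\d+ .*src inside:([\d.]+)/;
    my ($minute, $src) = ($1, $2);
    $hits_per_min{$minute}++;                   # hits per minute
    $first_seen{$src} = $minute
        unless exists $first_seen{$src};        # when each IP first appeared
}

for my $minute (sort keys %hits_per_min) {
    print "$minute: $hits_per_min{$minute} hits\n";
}
print scalar(keys %first_seen), " distinct source IPs\n";
```

Dumping `%hits_per_min` sorted by minute gives the first plot directly; counting how many `%first_seen` values fall at or before each minute gives the cumulative compromised-hosts curve.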

Can anyone give me some ideas where to start?  This worm spread
incredibly fast in our network, should be interesting to see it
charted.

Tony

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: File move

2003-07-02 Thread Akens, Anthony
Actually Rob your tips are very handy...

I have a nasty habit of posting only part of the code when
asking questions here, gotta break that habit.

>  use strict;   # always
>  use warnings; # usually
Doing those :)

> - What's this 'move' thing? Have you sneakily added 'use File::Copy without
> telling us? If so, then
You guessed it

> - Do you want existing files of the same name overwritten?
Yes

So now my code looks like this:
use strict;
use warnings;
use File::Copy;

my $reportsdir = '/usr2/reports';
my $oldreportsdir = '/usr2/oldreports';

die "No destination directory" unless -d $oldreportsdir;
opendir(DIR, $reportsdir) or die "can't opendir $reportsdir: $!";
# -f needs the directory prefix, since we never chdir to $reportsdir
my @files = grep { -f "$reportsdir/$_" } readdir(DIR);
foreach my $file (@files) {
move("$reportsdir/$file", $oldreportsdir)
or die "move failed: $!";
}
closedir(DIR);

Thanks for the help - mainly I want to make sure my code is
"sane" and refined.  Your reply was very useful.

Tony

-Original Message-
From: Rob Dixon [mailto:[EMAIL PROTECTED]
Sent: Wednesday, July 02, 2003 11:57 AM
To: [EMAIL PROTECTED]
Subject: Re: File move


Hi Anthony.

Anthony Akens wrote:
> Just want to check and make sure this snippet of code will do what I think it will.
>
> Trying to copy all files from $reportsdir to $oldreportsdir

Well you shouldn't be asking us, as we can make guesses - usually
Very Good Guesses - as to whether it will work, but we are not your
computer and it may not agree with us.

  use strict;   # always
  use warnings; # usually

> my $reportsdir = '/usr2/reports';
> my $oldreportsdir = '/usr2/oldreports';
>
> # Move everything from the report directory to the old report directory
> opendir(DIR, $reportsdir) or die "can't opendir $reportsdir: $!";

- You're checking that the 'opendir' works on $reportsdir. How about checking
that $oldreportsdir exists?

  die "No destination directory" unless -d $oldreportsdir;

- You may prefer to

  chdir $reportsdir;

first, so that you don't have to assemble the fully-qualified source file name.

> while (defined($file = readdir(DIR))) {
> move("$reportsdir/$file", "$oldreportsdir/$file")

- 'readdir' will give you directories as well as plain files. I suggest

  my @files = grep -f, readdir(DIR);
  foreach (@files) {
:
  }

- What's this 'move' thing? Have you sneakily added 'use File::Copy without
telling us? If so, then

  move("$reportsdir/$file", $oldreportsdir);

is clearer.

- Do you want existing files of the same name overwritten? If not you need
to check first whether the destination file exists.

- Try your program first with 'copy' instead of 'move'. Then if it doesn't work
you shouldn't have done any damage.

> or die "move failed: $!";
> }
>

Hmm. That's probably not the answer you were hoping for!

Even so HTH,

Rob




-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]





File move

2003-07-02 Thread Akens, Anthony
Just want to check and make sure this snippet of code will do what I think it will.

Trying to copy all files from $reportsdir to $oldreportsdir

my $reportsdir = '/usr2/reports';
my $oldreportsdir = '/usr2/oldreports';

# Move everything from the report directory to the old report directory
opendir(DIR, $reportsdir) or die "can't opendir $reportsdir: $!";
while (defined($file = readdir(DIR))) {
move("$reportsdir/$file", "$oldreportsdir/$file")
or die "move failed: $!";
}

Tony

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Formatted info in a web page

2003-07-01 Thread Akens, Anthony
Hi all,
I'm trying to display some (hopefully) nicely formatted 
information in a web page. The data is pulled from a 
comma-separated text file.  Preferably the data would 
be scrollable (don't want the page itself to go on and 
on, so need it displayed in a table or form) 
searchable, and ideally be able to have form buttons 
associated with it (so the functionality could be added 
to remove the item from the data file, or update 
portions of the data such as description, etc).  

The data is written by a form on the same page, so the 
format is one I control.  Here's a rather basic way I'm 
doing it now using a form, any suggestions on how to 
improve the look, or add the ability to edit/remove 
items?  (note that each item is actually representative 
of a file that gets copied to several servers, so any 
sort of remove would be a separate script to handle 
removing those files from each server).

(this is just a snip of the code, to avoid making
people read through the entire page, which is just
a form to generate the data that goes into this
text file)

Thanks for any tips...

Tony


open (RECORD, "< pcview.clients") or die "Can't open pcview.clients: $!";

print "<FORM>\n";

print "<TEXTAREA>\n";
printf ("%-30s %-10s %-10s %-5s %-5s %-5s %-5s", "Client", "Config", "TTYID", "PORT", 
"CRT", "DEPT", "DESCRIPTION");
print "<\/TEXTAREA>\n";

print "<TEXTAREA>\n";

my @array = <RECORD>;
my $line;

foreach $line (sort @array) {
my ($ws, $cfg, $tty, $port, $crt, $dept, $desc) = split (',',$line);
printf ("%-30s %-10s %-10s %-5s %-5s %-5s %-5s", $ws, $cfg, $tty, $port, $crt, $dept, 
$desc);
}

print "<\/TEXTAREA>\n";
print "<\/FORM>\n";

close RECORD;

--Data - pcview.clients--
egh-crisser.pcv,DEF,PCJCI,814,PCJ,CI,Connie Risser
egh-cnowlin1.pcv,DEF,ERFCI,277,ERF,CI,Colleen Nowlin
egh-swhite_3.pcv,DEF,OHECI,631,OHE,CI,Shelley White
egh-aakens.pcv,NOPA,AJAIS,000,AJA,IS,Tony Akens
egh-twoods.pcv,DEF,TRWIS,111,TRW,IS,Todd Woods
egh-dholloway.pcv,NOPA,ERQED,834,ERQ,ED,ED - Mgr Office
egh-emergency.pcv,NOPA,ERGED,360,ERG,ED,ED - RN Desk 2
egh-emergency3.pcv,NOPA,ERMED,832,ERM,ED,Phy Dictation Desk
egh-emergency_2.pcv,NOPA,ERKED,683,ERK,ED,ED - RN Desk 1
egh-er-sdoc1.pcv,NOPA,EROED,833,ERO,ED,Phy Desk 2
egh-jvanputten.pcv,NOPA,ERPED,147,ERP,ED,Jeanne VanPutten
egh-nwebb.pcv,NOPA,ADPED,233,ADP,ED,Nora Webb
egh-radtechctr.pcv,NOPA,626ED,626,626,ED,RAD ER Tech Cntr
egh-superdoc1.pcv,NOPA,ERNED,831,ERN,ED,Phy Desk 1
eghtc-ed2.pcv,NOPA,ERDED,30,ERD,ED,ED - Sec Orders
eghtc-triage1.pcv,NOPA,ERBED,28,ERB,ED,ED - Triage 1

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



RE: Strange behaviour of chdir in mapped drives

2003-06-30 Thread Akens, Anthony
If you use the UNC path, most of these problems disappear.  Also, the 
account you're running the script as needs permission to access the
share in question.

Using mapped drive letters in a scheduled task probably will not work.

Tony
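One way to apply this advice in a script: rewrite the mapped-drive path to its UNC form before using it.  A sketch (the drive-to-share table and server name are made up; on a real box it would mirror the output of `net use`):

```perl
use strict;
use warnings;

# Hypothetical drive letter -> share mapping for this machine.
my %share_for = ( 'I:' => '//fileserver/tech' );

sub to_unc {
    my ($path) = @_;
    $path =~ s{^([A-Za-z]:)}{ $share_for{ uc($1) } or die "unmapped drive $1\n" }e;
    $path =~ tr{\\}{/};    # Perl accepts forward slashes on Win32
    return $path;
}

print to_unc('I:/work/web/Documents'), "\n";   # //fileserver/tech/work/web/Documents
```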

-Original Message-
From: Smith Jeff D [mailto:[EMAIL PROTECTED]
Sent: Monday, June 30, 2003 9:47 AM
To: 'Tim Johnson'; Akens, Anthony; [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: RE: Strange behaviour of chdir in mapped drives


On the last point, does this happen even when using the Task Scheduler
utility and running the script as a specified account?  I'm working on a
script that needs to run periodically throughout the day and intend to use
the WinNT Task Scheduler--the script appears to run fine so far from the DOS
command line.  Are there special instructions for running it under Task
Scheduler as a particular account?  Thanks

-Original Message-
From: Tim Johnson [mailto:[EMAIL PROTECTED] 
Sent: Monday, June 30, 2003 10:26 AM
To: 'Akens, Anthony'; [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: RE: Strange behaviour of chdir in mapped drives



That's exactly right.  You will run into the same problem if you try running
your scripts in the scheduler.

-----Original Message-
From: Akens, Anthony [mailto:[EMAIL PROTECTED]
Sent: Monday, June 30, 2003 7:12 AM
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: RE: Strange behaviour of chdir in mapped drives


Is the drive you're trying to access actually mapped for the user you're
running the cgi-bin as?

Remember, in windows drive mappings are a per-user thing,
so while yes it will work when you run it at a command line it's because
you're logged in and have the drive mapped.  When you're running it through
a web browser as a cgi-bin program the drive has to be mapped.  I think if I
remember right it actually needs to be mapped for the user the web server is
running as, though it's been a while.  It would probably be much easier for
you to use the UNC path.

\\server\share

Tony

-Original Message-
From: beginner beginner [mailto:[EMAIL PROTECTED]
Sent: Monday, June 30, 2003 8:57 AM
To: [EMAIL PROTECTED]
Subject: Re: Strange behaviour of chdir in mapped drives


Hi,
   I have tried it through the command line and it works. When I used 
the same code, put the file in cgi-bin, and tried to run it through 
explorer, it didn't work.

$basedir= "I:/tech/work/web/Documents";
chdir($basedir) or die $!;
$test=`dir /s /b *.*`;
print $test;

If I simply replace I: with D:  It starts working. Please help.

Thanks,
Amit

--- "Jenda Krynicky" <[EMAIL PROTECTED]> wrote:
>From: "Rob Dixon" <[EMAIL PROTECTED]>
>> Beginner Beginner wrote:
>> > Hi All,
>> > I wanted to search for *.html file on Server which I can
>> > mount in my Windows XP:
>> > e.g. I drive mapped to tech\work\web\documents.
>> > as I:\tech\work\web\documents
>> > now i try to go inside this directory
>> > as
>> > $basedir="I:/tech/work/web/documents";
>> > chdir($basedir);
>> >
>> > but this is not working I have tried the same with my local drives 
>> > and it was working fine Don't know why May be you guys can help me 
>> > out to solve my problem.
>> 
>> Hi Mr B.
>> 
>> How do you know this isn't working?
>> 
>> Perl will give you a reason for failure if you try
>> 
>>   chdir $basedir or die $!;
>
>chdir $basedir or die $^E;
>
>might give you more info.
>
>Jenda
>= [EMAIL PROTECTED] === http://Jenda.Krynicky.cz = When it 
>comes to wine, women and song, wizards are allowed to get drunk and 
>croon as much as they like.
>   -- Terry Pratchett in Sourcery
>
>
>--
>To unsubscribe, e-mail: [EMAIL PROTECTED]
>For additional commands, e-mail: [EMAIL PROTECTED]

_
Get Your Private, Free Jatt Email at http://www.jatt.com/

_
Select your own custom email address for FREE! Get [EMAIL PROTECTED], No
Ads, 6MB, IMAP, POP, SMTP & more!
http://www.everyone.net/selectmail?campaign=tag








RE: Strange behaviour of chdir in mapped drives

2003-06-30 Thread Akens, Anthony
Is the drive you're trying to access actually mapped for the
user you're running the cgi-bin as?

Remember, in windows drive mappings are a per-user thing,
so while yes it will work when you run it at a command line
it's because you're logged in and have the drive mapped.  When
you're running it through a web browser as a cgi-bin program
the drive has to be mapped.  I think if I remember right it
actually needs to be mapped for the user the web server is
running as, though it's been a while.  It would probably
be much easier for you to use the UNC path.

\\server\share

Tony

-Original Message-
From: beginner beginner [mailto:[EMAIL PROTECTED]
Sent: Monday, June 30, 2003 8:57 AM
To: [EMAIL PROTECTED]
Subject: Re: Strange behaviour of chdir in mapped drives


Hi,
   I have tried it through the command line and it works. When I used 
the same code, put the file in cgi-bin, and tried to run it through 
explorer, it didn't work.

$basedir= "I:/tech/work/web/Documents";
chdir($basedir) or die $!;
$test=`dir /s /b *.*`;
print $test;

If I simply replace I: with D:  It starts working. Please help.

Thanks,
Amit

--- "Jenda Krynicky" <[EMAIL PROTECTED]> wrote:
>From: "Rob Dixon" <[EMAIL PROTECTED]>
>> Beginner Beginner wrote:
>> > Hi All,
>> > I wanted to search for *.html file on Server which I can
>> > mount in my Windows XP:
>> > e.g. I drive mapped to tech\work\web\documents.
>> > as I:\tech\work\web\documents
>> > now i try to go inside this directory
>> > as
>> > $basedir="I:/tech/work/web/documents";
>> > chdir($basedir);
>> >
>> > but this is not working I have tried the same with my local drives
>> > and it was working fine Don't know why May be you guys can help me
>> > out to solve my problem.
>> 
>> Hi Mr B.
>> 
>> How do you know this isn't working?
>> 
>> Perl will give you a reason for failure if you try
>> 
>>   chdir $basedir or die $!;
>
>chdir $basedir or die $^E;
>
>might give you more info.
>
>Jenda
>= [EMAIL PROTECTED] === http://Jenda.Krynicky.cz =
>When it comes to wine, women and song, wizards are allowed 
>to get drunk and croon as much as they like.
>   -- Terry Pratchett in Sourcery
>
>
>-- 
>To unsubscribe, e-mail: [EMAIL PROTECTED]
>For additional commands, e-mail: [EMAIL PROTECTED]







RE: Binary Replace

2003-06-06 Thread Akens, Anthony


>Anthony Akens wrote:
>> Hi all,
>> I'm doing a text replace in a binary file, which works fine as long as the
>> text I replace it with is the exact same length.  If the text I put in is longer or
>> shorter, the program that reads the file (not one I wrote) chokes and spews
>> out a bunch of garbage.  Is there a way in perl to deal with that?

>Not really, unless the program you didn't write is in Perl.
Nope.  Probably C.

>> This is for a config file, so what I'm doing is having the user select which
>> generic config template to use, and inputting their "id" and it generates a
>> config file with their ID in it.
>
>> The string I'm inputting is always going to be 5, 6, or 7 characters long.  The
>> only work around I've found for this is to make a config file each with a default
>> string of the appropriate length.  The problem is that I'm ending up needing
>> three times as many config files as it seems I need to.

>That sounds about right. I guess you're lucky there isn't a checksum as
>well or you wouldn't be able to change anything at all.

>> The code I'm using for a file with a 7 character ID is:

>  use strict;
>  use warnings;

I left out a lot of the code, use strict and warnings are up there, yeah :)


>> open (TEMPLATE, "<$template") or die "Could not open Template. ($!)";
>> binmode (TEMPLATE);
>
>> open (NEW, ">$newfile") or die "Could not open file $dws ($!)";

>What's $dws?

Me forgetting to change the variable name.  Oops.

>> binmode (NEW);
>> while (<TEMPLATE>) {
>>
>>   s/REPLACE/$id/;
>>   print NEW $_;
>> }
>
>> close NEW;
>> close TEMPLATE;

>Yes, that'll do it, but how do you generate these different
>config files in the first place? It sounds like you need to do
>it that way rather than hack a generic file. There's no way of
>telling what the program expects without the source code.

I generate the config "templates" by going in to the app and
configuring the settings I want, then copying out that config
file.  The reason I'm trying to "automate" this is that creating
a config file within the application takes ~10 minutes, and there
are over 1500 of them to make, that can change on a regular basis.

I only need around 10 templates, though.

There is no way I can see the source, however I am 100% sure a
template done the way I am now works, as long as the original
and new templates contain strings of the same lengths.
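If the reading program tolerates trailing spaces in the ID field (an untested assumption), one template per configuration would be enough: pad every ID to the widest field before substituting, so the byte length of the file never changes.  A sketch:

```perl
use strict;
use warnings;

# Pad an ID to the template's field width so s/REPLACE/$id/ never changes
# the byte length.  ASSUMPTION: the application ignores trailing spaces.
sub pad_id {
    my ($id, $width) = @_;
    die "ID '$id' is longer than $width characters\n" if length($id) > $width;
    return sprintf '%-*s', $width, $id;    # left-justified, space-padded
}

printf "[%s]\n", pad_id('AB123', 7);   # [AB123  ]
```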


>Rob

Thanks for the help.

-Tony







Binary Replace

2003-06-06 Thread Akens, Anthony
Hi all,
I'm doing a text replace in a binary file, which works fine as long as the
text I replace it with is the exact same length.  If the text I put in is longer or
shorter, the program that reads the file (not one I wrote) chokes and spews
out a bunch of garbage.  Is there a way in perl to deal with that?

This is for a config file, so what I'm doing is having the user select which
generic config template to use, and inputting their "id" and it generates a
config file with their ID in it.

The string I'm inputting is always going to be 5, 6, or 7 characters long.  The
only work around I've found for this is to make a config file each with a default
string of the appropriate length.  The problem is that I'm ending up needing
three times as many config files as it seems I need to.

The code I'm using for a file with a 7 character ID is:

open (TEMPLATE, "<$template") or die "Could not open Template. ($!)";
binmode (TEMPLATE);

open (NEW, ">$newfile") or die "Could not open file $dws ($!)";
binmode (NEW);
while (<TEMPLATE>) {

  s/REPLACE/$id/;
  print NEW $_;
}

close NEW;
close TEMPLATE;

-Tony




RE: How do I check disk space through Perl?

2003-05-29 Thread Akens, Anthony
I've found using Win32::AdminMisc pretty handy, specifically the
GetDriveSpace($drive) function.

Look here: http://www.roth.net/perl/adminmisc/

-Tony

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Sent: Tuesday, May 27, 2003 12:05 PM
To: [EMAIL PROTECTED]
Subject: How do I check disk space through Perl?


Hi All,

It seems like I should know this, but I don't, and I can't seem to find it
written anywhere.
I need to check disk space on an NT platform using ActiveState Perl 5.6.1.
Is there an easy way to do this?

Thanks,
Peter



** CONFIDENTIALITY NOTICE **
NOTICE:  This e-mail message and all attachments transmitted with it may
contain legally privileged and confidential information intended solely for
the use of the addressee.  If the reader of this message is not the
intended recipient, you are hereby notified that any reading,
dissemination, distribution, copying, or other use of this message or its
attachments is strictly prohibited.  If you have received this message in
error, please notify the sender immediately and delete this message from
your system.  Thank you.











RE: Substring and Sort

2002-10-21 Thread Akens, Anthony
Thank you very much for your help, that took care of 
the issue.

The following section of code seems to work fine,
but I'd welcome any criticisms you may have, as
this is my first large perl program, and I'm sure
it could use some refining.

This section of code reads in the same file format

directory:displayname

It should print the first name of the displayname
if it's the first beginning with that character,
and should skip any displaynames that are equal
to MISSING or HIDE.  Eventually I plan on making
it possible to categorize each directory by a type,
and show/hide them based on a user's type selection.

As I said, the program is actually pretty functional
as is, and currently is in use - but I'm always
willing to see how I can improve on what I've done.

Tony

-
open (NAMES, $namefile)
or print "Could not open $namefile $!";

while(<NAMES>)
{
($key, $value) = split /:/;
chomp $value;
$Names{$key} = $value;
}
close NAMES;

%seen=();
print "";

foreach $dirname (sort { "\L$Names{$a}" cmp "\L$Names{$b}" } keys %Names){
$displayname = $Names{$dirname};
if (($displayname eq "MISSING") or ($displayname eq "HIDE")){}
else{

$item = substr($displayname, 0, 1);
if ($seen{"\L$item"}++){}
else {
print "<a name=\"\U$item\">\U$item<\/a>\n";
}
print "<a href=\"$dirname\">";
print $displayname;
print "<\/a>\n";
}
}
---------

-Original Message-
From: Larry Coffin [mailto:lc2002@;PointInfinity.com]
Sent: Monday, October 21, 2002 2:53 PM
To: Akens, Anthony; [EMAIL PROTECTED]
Subject: Re: Substring and Sort


At 3:03 PM -0400 10/21/02, Akens, Anthony wrote:
>I'm attempting to use the following code to read a file
>in the format of:
>
>directory name:displayname
>
>I want to sort the list by the "displayname", looking
>at the first letter for each display name in the file.
>If it's unique, I want to print it.  This should result
>in an alphabetical list of the letters for the display
>names.  However, the sort is not working quite right,
>and I sometimes get letters out of order, missing, or
>some that aren't even in the list.
>
>Can anyone tell me what I've done wrong?

First of all, it looks like you are sorting based on directory
name, not on the "displayname", since you are sorting on the keys and the
keys are the first part of the 'split /:/'.

The second thing, is that you are only saving one value per
'directory name' -- if there is more than one file (assuming 'displayname'
is the name of a file) in a directory, then you will only get the last file
because your '$Names{$key} = $value;' is overwriting the previous value.

If all you want is the list of unique letters that all the files
start with, then you'd probably be better off with something much simpler,
such as:

---

while () {
($dir, $file) = split /:/;
$letter = uc(substr($file, 0, 1));
$letters{$letter} = 1; # or $letters{$letter}++; to track the number
}

print "";

foreach $letter (sort keys %letters) {
print "<a href=\"#$letter\">$letter<\/a> ";
}

---


If you want to retain the list of files and the directory they are
in and have them sorted by the displayname, then I'd do something like this
(this assumes that the file names are case sensitive so we need to retain
the case but we want the sort to be case insensitive):

---

while () {
($dir, $file) = split /:/;
$letter = uc(substr($file, 0, 1));
push(@{$files{$letter}}, [lc($file), $file, $dir])
}

#
# %files is now a hash array with keys that are the uppercase first letter
# of the file names and the values are array references
#
# Each array reference contains array references which contain
# the file name in lower case, the file name in the original case,
# and the dir name in the [0] and [1] positions
#

print "";

# print out the index
foreach $letter (sort keys %files) {
print "<a href=\"#$letter\">$letter<\/a> ";
}

# print out the file list
foreach $letter (sort keys %files) {
print "<a name=\"$letter\">$letter<\/a>\n";

foreach $file_ref (sort {$a->[0] cmp $b->[0]} @{$files{$letter}}) {
print "$file_ref->[2]:$file_ref->[1]\n";
}
}

---

Note that the conversion to upper or lower case only occurs once
when we are saving the file info. If you do the conversion in the sort
block, perl may end up doing the conversion many times over during the sort.

Reset Passwords

2002-10-13 Thread Akens, Anthony

I need something that will allow people who do not know the root 
password to reset passwords on a DG/UX system (that is using 
shadow passwords).

Basically what I'm wanting to do is write something that will be 
used as a specific user, so when you log on as that user it asks 
what account you would like to reset the password for, makes sure 
it is not root, then assigns a default password such as abc123 
that must be changed at the next login.

Any tips on how to go about writing something like this? 

A sample session should look like this:

login: resetpassword
password: 
What account would you like to reset? 
$users account reset, must be changed at next login.
exit 


Tony Akens
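The "makes sure it is not root" step could start as a small standalone check (a sketch; the actual reset would still shell out to the system's password tool, which is DG/UX-specific, and the username pattern here is an assumption):

```perl
use strict;
use warnings;

# Refuse root, and anything that is not a plain lowercase account name.
sub ok_to_reset {
    my ($user) = @_;
    return 0 if $user eq 'root';
    return $user =~ /^[a-z][a-z0-9_-]{0,31}$/ ? 1 : 0;
}

print ok_to_reset('jsmith') ? "jsmith: ok\n" : "jsmith: refused\n";
print ok_to_reset('root')   ? "root: ok\n"   : "root: refused\n";
```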





RE: Is there a way to do cron on windows?

2002-08-20 Thread Akens, Anthony

Microsoft also has a product called "Unix tools for Windows" which has
cron built into it.  I run it on a few machines, and it works fairly
well. (has an nfs client/server, telnet server, and a few other
nice items in it).

Cygwin is also a good option.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]]
Sent: Tuesday, August 20, 2002 9:55 AM
To: [EMAIL PROTECTED]
Subject: Re: Is there a way to do cron on windows?


If you can write a Perl script to detect whether the program is running,
then to start it if it isn't, then you can run the Perl script in a loop,
using the 'sleep()' function to make it run every (up to you) seconds.
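That loop can be sketched generically; the callbacks below are hypothetical stand-ins for the real check (e.g. parsing `ps` or `tasklist` output):

```perl
use strict;
use warnings;

# Generic detect-and-restart loop: start the program if it isn't running,
# then sleep and repeat.
sub watch {
    my (%arg) = @_;
    for ( 1 .. $arg{cycles} ) {                 # a real script might loop forever
        $arg{start}->() unless $arg{is_running}->();
        sleep $arg{interval} if $arg{interval};
    }
}

my $started = 0;
watch(
    cycles     => 3,
    interval   => 0,                            # 0 so the demo finishes instantly
    is_running => sub { $started > 0 },         # pretend it stays up once started
    start      => sub { $started++ },
);
print "started $started time(s)\n";             # starts it once, then sees it running
```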

Just be aware that, if you use Task Scheduler in Windows, you will need to
schedule the job, then go into the GUI Task Scheduler interface, and tell
the script to "run as" a user with rights to do whatever it is you want to
do.  You will need to enter a password for that user.  Otherwise, when the
Task Scheduler launches the task, it will run as some kind of "system
user", who has basically no rights.  The task does NOT inherit the rights
of the user who scheduled the task.

Shawn




   

From: felix_geerinckx@hotmail.com
To: [EMAIL PROTECTED]
Bcc: Shawn Milochik/US/GODIVA/CSC
Date: 08/20/2002 10:16 AM
Subject: Re: Is there a way to do cron on windows?





on Tue, 20 Aug 2002 14:05:19 GMT, [EMAIL PROTECTED] (Daryl J.
Hoyt) wrote:

>  I am looking for a way to run something like a cron job on
>  windows.



Take your pick.

--
felix







**
This e-mail and any files transmitted with it may contain 
confidential information and is intended solely for use by 
the individual to whom it is addressed.  If you received
this e-mail in error, please notify the sender, do not 
disclose its contents to others and delete it from your 
system.

**








RE: Net::Telnet question

2002-08-12 Thread Akens, Anthony

Have you tried using 

$ion->login($username, $passwd); 

Instead of doing it the hard way?

Here's a sample script I use with net::telnet that works just fine...
Might want to try modifying it.  I use a hosts file, because my script
hits a number of machines.  The only bit that might be confusing is
the regex for the prompt.  The prompt is in the format 

[user@host directory]

on the clients I am connecting to.

use strict;
my ( $hostname, $passwd, $username);

$username = "username";
$passwd = "password";

open (HOSTS, "hosts.txt")
or die "Could not open hosts file.";

while (<HOSTS>)
{
chomp($hostname = $_);
$hostname =~ s/^\s+//;
$hostname =~ s/\s+$//;
#chomp ($hostname);
## Connect and login.
use Net::Telnet ();
my $host = new Net::Telnet (Timeout => 30,
 Prompt => '/\[.*?\]\#/');  
$host->open($hostname);
$host->login($username, $passwd);

$host->waitfor('/\[.*?\]\#/');

#do stuff here!
}
  
exit;
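The prompt pattern used in the Prompt option above can be sanity-checked offline against a sample prompt string (the sample user/host/directory are made up):

```perl
use strict;
use warnings;

# Same pattern as Prompt => '/\[.*?\]\#/': matches "[user@host dir]#".
my $sample = '[akens@egh-org /home/akens]# ';
print $sample =~ /\[.*?\]\#/ ? "prompt matched\n" : "no match\n";
```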


-Original Message-
From: Joe Mecklin [mailto:[EMAIL PROTECTED]]
Sent: Monday, August 12, 2002 11:13 AM
To: [EMAIL PROTECTED]
Subject: Net::Telnet question


Using Net::Telnet, I'm having trouble getting the server I'm connecting
to to recognize my input to the login prompt.  Below is the code so far
(all system/user details have been changed to protect the innocent):


#! /usr/bin/perl
#   tnp -   TelNet via Perl
#   trial program to replace Expect script

use Net::Telnet ();
$ion = new Net::Telnet
(
-host => "system.name.com",
-port => 1234,
-output_record_separator => "\r"
);

$ion->dump_log("ion_log");

$ion->binmode(1);
if ($ion->open())
{
print "Connected...\n";
}

if ($ion->waitfor(-string => "User ID:"))
{
print "Saw \"User ID:\" ...\n";
$ion->print("username");
}

if ($ion->waitfor(-string => "Password:"))
{
print "Saw \"Password:\" ...\n";
}

$ion->close();

$ion->dump_log;

# end of program

When I run this, after printing

Saw "Password:" ...

the connection times out every time.  Looking at ion_log, the last entry
is 
... various text and ANSI codes ...
username

indicating that it was successfully passed to the server.   I have tried
the ->print and ->put calls, with and without various combinations of \r
and \n, with and without binmode() on.  This same functionality
currently exists in a working Expect script (with \r appended to each
input) but I'd like to get it working in Perl as well.

Any ideas on what may be missing will be greatly appreciated.

Joe








Standalone Perl App

2002-07-26 Thread Akens, Anthony

I wrote the following quick'n'dirty script that converts
a plain text file (with a little custom markup)
into a very basic html file.  It works great, and now 
I'd like to make it into a standalone app to use
on my other machines that do not have activestate perl 
installed (or pass to friends who don't have perl).

Is there a way to do that outside of purchasing the pdk?
Again, this is on win32...

Here's the script - feel free to offer comments on it, 
as well.  Always good to hear constructive criticism.

It's called at a command line by using:

convert.pl original.txt > finished.html

The conventions I'm using in the text document are:
[title]my_title[/title]
[link]my_docname[name]my_linkname[/name]

--

print "<body>\n";
while (<>)
{
s/\n/<br>\n/;
s/\[title\]/<center>/;
s/\[\/title\]/<\/center>/;
s/\[link\]/<a href="/;
s/\[name\]/">/;
s/\[\/link\]/<\/a>/;
print; 
}
print "\n<\/body>";

--

-Tony





RE: PM installation problem

2002-07-24 Thread Akens, Anthony

If you're using activestate perl on your windows box, just use ppm.

Just type ppm to enter the package manager, then 

install package:name

and it does the work for you.
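A session might look like this (exact prompt and command syntax vary by ActivePerl version; `Imager` is the module from the question):

```
C:\> ppm
ppm> install Imager
ppm> quit
```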

-Tony

-Original Message-
From: Connie Chan [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, July 24, 2002 11:04 AM
To: [EMAIL PROTECTED]
Subject: PM installation problem


I 've just downloaded a "Imager" module, however, that's a .tar.gz file
and I am using Win Me. After I extract the file, there are quite sort of
files and dirs. So... how can I install it to my Perl lib or site/lib ?

Rgds, 
Connie








RE: threads in perl

2002-07-24 Thread Akens, Anthony

perldoc Thread

This turned up a library on how to thread processes in perl.

Seemed fairly straightforward.  (though it did say it was 
experimental, so there may be a better module out there?)

-Tony

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]]
Sent: Wednesday, July 24, 2002 1:57 AM
To: [EMAIL PROTECTED]
Subject: threads in perl


Hello All,

I want to execute same perl procedure with different parameters at 
the almost same time in a perl program indefendantly. (next process 
should be started without waiting for the end of previous one)  The 
procedure is located in .pm file. How can I use thread in perl to get 
my work done?

regards
Rohana.







RE: Newbie Stupid Question!

2002-07-15 Thread Akens, Anthony

Some comments from a fellow "newbie"...  (and my first stab at being
helpful)

File::Copy is a great module, and can handle what you're trying to do.
as others have already said)

On an unrelated note, something that is an immense help to me when using
"or die" is to put $! in the die statement, for example:

or die "Can't open documents: $!";

The $! inserts the "human readable" error returned by the system, such as
"permission denied" or "file not found" which can help a lot in figuring out
what's broke.

-Tony
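A sketch of the copy-instead-of-rename approach with File::Copy, so the original .doc survives (a throwaway temp directory stands in for the real paths):

```perl
use strict;
use warnings;
use File::Copy qw(copy);
use File::Temp qw(tempdir);

my $dir = tempdir( CLEANUP => 1 );     # throwaway directory for the demo

# create a sample .doc file to work on
open my $out, '>', "$dir/memo.doc" or die "Can't create memo.doc: $!";
print $out "sample text\n";
close $out;

# build the new name, then copy (not rename) so the original survives
( my $new = 'memo.doc' ) =~ s/\.doc$/.dat/i;
copy( "$dir/memo.doc", "$dir/$new" ) or die "Can't copy: $!";

print -e "$dir/memo.doc" && -e "$dir/memo.dat" ? "both exist\n" : "missing\n";
```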

-Original Message-
From: Simopoulos [mailto:[EMAIL PROTECTED]]
Sent: Monday, July 15, 2002 1:19 PM
To: [EMAIL PROTECTED]
Subject: Newbie Stupid Question!


Hi All,
I'm a newbie just starting out learning Perl.
My problem is I have a bunch of files that are (.doc) files, and I want to rename
the files (.data).
I also want to move them to another directory, but I don't really want to destroy or
change the old ones (.doc).
What I've done so far doesn't work the way I want it to.  It is:

#! /usr/bin/perl -w
opendir(DOCUMENTS,".") || die "Can't open directory documents!";
@filenames = readdir(DOCUMENTS);
closedir(DOCUMENTS);
foreach $filename(@filenames) {
   if ($filename =~ m/\.doc$/i) {
   rename($filename, "/home/marsie/data/$filename.dat") ||
  die "Can't move files";
   } else {
   print "Not a .doc file!\n";
   }
}

I would appreciate any help anyone can offer.
Peace,
Marsie






RE: Pattern Matching

2002-06-25 Thread Akens, Anthony

>This will work...

> /\[.*?\]/

Does exactly what I needed
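A quick check of that pattern against a sample line (the log text around the brackets is made up):

```perl
use strict;
use warnings;

# The lazy quantifier stops at the first closing bracket, so only the
# bracketed chunk is captured.
my $line = 'Jun 25 10:00 [akens@egh-org blah/blah/blah] logged in';
my ($bracketed) = $line =~ /(\[.*?\])/;
print "$bracketed\n";   # [akens@egh-org blah/blah/blah]
```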

>Probably more explanation than you wanted, but I felt like sharing :)

Explanations help me learn - maybe someday I'll be able to do these
on my own :)

>Rob 





Pattern Matching

2002-06-25 Thread Akens, Anthony

I'm trying to find a way to match anything between two brackets []  The stuff in
between will have alpha, numeric, and symbols (including /  -  @ and spaces)

For instance

[akens@egh-org blah/blah/blah]

I need to match that entire string, including the []'s

Here's the ugly thing I've gotten so far to do it.  But I know there's a way to 
simplify
it and just match anything between the []'s.  Thanks for any advice.

\[[A-Za-z'-@]* \S+[A-Za-z'/]\]


Tony Akens





RE: Killing Idle Users

2002-06-13 Thread Akens, Anthony

This solution seems to kill anything, here's a snippet of the log.  As you can see, 
it's hitting things with low idle times.  (I of course commented out the kill line)

mmorgan
killing process idPID
 80350
 on pts/0 because minutes equal 1
CLanko
killing process idPID
109034
 on pts/1 because minutes equal 0
lestas
killing process idPID
 17490
 23800
 51560
 on pts/2 because minutes equal 36
ruthw
killing process idPID
 24992
 26024
 30064
 on pts/3 because minutes equal 17
annap
killing process idPID
 28626
 34956
 51884
 on pts/4 because minutes greater than 60 1:35
annap
killing process idPID
 22890
 40942
115930
 on pts/5 because minutes equal 7

-Original Message-
From: John W. Krahn [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, June 12, 2002 9:18 PM
To: [EMAIL PROTECTED]
Subject: Re: Killing Idle Users


"John W. Krahn" wrote:
> 
> #!/usr/bin/perl -w
> use strict;
> 
> my $results = '/home/danb/killemresults';
> open RES, '>>', $results or die "Cannot open $results: $!";
> print "\n" . localtime() . "\nStarting\n";
> 
> for my $user ( map [ (split)[0,1,4] ], grep m|\bpts/|, `w -l` ) {
> # print RES "$user->[1]\n";
> my $pid = `ps -t $user->[1] -o pid`;
> print RES "$user->[0]\n";
> print RES "killing process id $pid on $user->[1] because minutes ";
> print RES $user->[2] =~ /:/ ? 'greater than 60' : 'equal', " $user->[2]\n";
> kill 9, $pid;
> }


Sorry, there could be multiple pids, so it should be:

for my $user ( map [ (split)[0,1,4] ], grep m|\bpts/|, `w -l` ) {
# print RES "$user->[1]\n";
my @pids = grep s/\D+//g&&/\d/, `ps -t $user->[1] -o pid`;
print RES "$user->[0]\n";
print RES "killing process id @pids on $user->[1] because minutes ";
print RES $user->[2] =~ /:/ ? 'greater than 60' : 'equal', " $user->[2]\n";
kill 9, @pids;
}


John
-- 
use Perl;
program
fulfillment








RE: Killing Idle Users

2002-06-12 Thread Akens, Anthony

Using export TMOUT kills the tty, but the application 
(lovely thing that it is) stays up in the background, 
unattached.  Thus creating an even more difficult beast to 
track down and, well, kill.  IBM's response?  (In summary) Tell 
the vendor their app is not responding correctly, and have em 
fix it.  Or write a shell script to logout the idle users.

Hence our original script, and my attempt to convert it to perl.

Tony Akens

-Original Message-
From: Jeff [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, June 11, 2002 6:45 PM
To: [EMAIL PROTECTED]
Subject: Re: Killing Idle Users


Why can't you just put

   export TMOUT=3600
   readonly TMOUT

in /etc/profile ?

If it doesn't work, contact the vendor (IBM for AIX) for a patch.



--- "Akens, Anthony" <[EMAIL PROTECTED]> wrote:
> Hello,
> I'm a sys-admin on an AIX (4.3) machine, and I'm trying to work 
> with a vendor program that doesn't behave very nicely.  Basically,
> if a user's connection to the server is inappropriately severed
> the application keeps right on chugging, leaving the user 
> logged in.  By inappropriately severed I mean if our link to the
> site the user is at goes down, which happens more often then 
> I'd like, but that's a different problem.
> 
> Anyway, back to the question...
> There are quite a few problems with these unconnected sessions 
> being stuck out there, not least of which is that they eat up a 
> user license, which are pretty limited in amount.  One good power 
> outage or link failure at can make it so we're out of licenses.
> In that event the vendor says "Kill them by hand, do NOT automate 
> the process" which is all fine and good except that we have 
> several hundred users.  That's a lot of time.
> 
> The first thought was to use a shell idle logout time, but the
> program doesn't respond to the request and remains logged in.
> 
> The previous administrator had made a shell script to check idle 
> times and automatically log out the users, however since I've 
> taken over the box the uptimes have been far greater (I don't 
> believe in the same "reboot often" premise that he did) and now 
> his script has shown a severe weakness: It's not written to 
> handle longer PIDs.
> 
> Rather then trying to fix his shell script, which is rather 
> convoluted, I thought it would be a perfect candidate for perl, 
> which I am just learning.  If I post the script, can I get some 
> pointers on re-writing it?  Thanks for your time
> 
> Tony Akens
> [EMAIL PROTECTED]
> 
> --
> To unsubscribe, e-mail: [EMAIL PROTECTED]
> For additional commands, e-mail: [EMAIL PROTECTED]
> 








RE: Killing Idle Users

2002-06-11 Thread Akens, Anthony

Found another problem...

The line 
 my @inputP = qx!ps -e |grep $port!;
is matching a bit overzealously...  For instance,
if $port = pts/6 it matches (and therefore would kill)
not only processes on port pts/6, but also pts/61, pts/62
etc.

Is there a better way to limit that?
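One option (a rough sketch, not tested on AIX) is to keep the `ps -e` call but do the filtering in Perl instead of grep, anchoring the pattern so the tty name must be followed by whitespace or end-of-line. `\Q...\E` keeps the slash in `pts/6` from being treated as regex syntax:

```perl
#!/usr/bin/perl -w
use strict;

my $port = 'pts/6';   # example value; in the real script this comes from `w -l`

# Filter the `ps -e` output in Perl. The (?:\s|$) anchor stops
# "pts/6" from also matching "pts/61", "pts/62", and so on.
my @procs = grep { /\Q$port\E(?:\s|$)/ } qx!ps -e!;

foreach my $entry (@procs) {
    # split ' ' (a literal space) discards leading whitespace,
    # so the PID is field 0 here rather than field 1.
    my ($pid) = (split ' ', $entry)[0];
    print "would kill process $pid on $port\n";
}
```

Depending on your ps(1), `ps -t $port` may also be able to select processes by terminal directly, which would avoid the pattern match altogether.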

Tony Akens
[EMAIL PROTECTED]

-Original Message-
From: Akens, Anthony 
Sent: Tuesday, June 11, 2002 1:49 PM
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: RE: Killing Idle Users


Thanks!
The only thing I saw to change was the line

if ( $idle =~ /^\d+$/ && $idle > 0 )

to

if ( $idle =~ /^\d+$/ && $idle > $USER_IDLE )

I'm going to do some testing to be safe, but will probably
put this in place soon - and the great part is that I
understand *most* of it!

Tony Akens
[EMAIL PROTECTED]


-Original Message-
From: Craig Moynes/Markham/IBM [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, June 11, 2002 1:00 PM
To: Akens, Anthony
Cc: [EMAIL PROTECTED]
Subject: RE: Killing Idle Users


Something like this perhaps ?:

#!/usr/bin/perl -w
use strict;

my $USER_IDLE = 40;

my @input = qx!w -l!;
shift @input;
shift @input;

foreach my $entry ( @input )
{
    my ($user, $port, undef, $idle) = split('\s+', $entry);

    if ( $idle =~ /^\d+$/ && $idle > 0 )
    {
        # Kill all the user's processes
        my @inputP = qx!ps -e |grep $port!;
        foreach my $entryP ( @inputP )
        {
            my ($pid) = (split('\s+', $entryP))[1];
            print "killing process $pid. ".
                  "user $user idle for $idle.\n";
            #qx!kill -9 $pid!;
        }
    }
    elsif ( $idle =~ /:/ )
    {
        # Kill all the user's processes
        my @inputP = qx!ps -e |grep $port!;
        foreach my $entryP ( @inputP )
        {
            my ($pid) = (split('\s+', $entryP))[1];
            print "killing process $pid ".
                  "idle for $idle.\n";
            #qx!kill -9 $pid!;
        }
    }
}


-
Craig Moynes
[EMAIL PROTECTED]



       

From: "Akens, Anthony" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Date: 06/11/02 01:13 PM
Subject: RE: Killing Idle Users
Reply-To: "Akens, Anthony"

Here's the shell script, for those interested.  Sorry for the
lack of comments, but it's not too hard to figure out with a
little patience...

---
PTS=`w -l | grep pts | cut -c10-15`
echo "" >> /home/danb/killemresults
RIGHTNOWDATE=`date`
echo $RIGHTNOWDATE >> /home/danb/killemresults
echo "Starting" >> /home/danb/killemresults
for i in $PTS
do
    #echo $i >> /home/danb/killemresults
    MINUTES=`w -l | grep -w $i | cut -c38-39`
    for j in $MINUTES
    do
        if (($j > 40))
        then
            PID=`ps -e | grep -w $i' ' | cut -c2-6`
            USERNAME=`w -l | grep -w $i`
            echo $USERNAME >> /home/danb/killemresults
            echo 'killing process id ' $PID ' on ' $i ' because minutes equal ' $MINUTES >> /home/danb/killemresults
            kill -9 $PID
        fi
    done
    MINUTES=`w -l | grep -w $i | cut -c37`
    for j in $MINUTES
    do
        if [ "$j" = ":" ]
        then
            PID=`ps -e | grep -w $i' ' | cut -c2-6`
            USERNAME=`w -l | grep -w 

RE: Killing Idle Users

2002-06-11 Thread Akens, Anthony

Thanks!
The only thing I saw to change was the line

if ( $idle =~ /^\d+$/ && $idle > 0 )

to

if ( $idle =~ /^\d+$/ && $idle > $USER_IDLE )

I'm going to do some testing to be safe, but will probably
put this in place soon - and the great part is that I
understand *most* of it!

Tony Akens
[EMAIL PROTECTED]


-Original Message-
From: Craig Moynes/Markham/IBM [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, June 11, 2002 1:00 PM
To: Akens, Anthony
Cc: [EMAIL PROTECTED]
Subject: RE: Killing Idle Users


Something like this perhaps ?:

#!/usr/bin/perl -w
use strict;

my $USER_IDLE = 40;

my @input = qx!w -l!;
shift @input;
shift @input;

foreach my $entry ( @input )
{
    my ($user, $port, undef, $idle) = split('\s+', $entry);

    if ( $idle =~ /^\d+$/ && $idle > 0 )
    {
        # Kill all the user's processes
        my @inputP = qx!ps -e |grep $port!;
        foreach my $entryP ( @inputP )
        {
            my ($pid) = (split('\s+', $entryP))[1];
            print "killing process $pid. ".
                  "user $user idle for $idle.\n";
            #qx!kill -9 $pid!;
        }
    }
    elsif ( $idle =~ /:/ )
    {
        # Kill all the user's processes
        my @inputP = qx!ps -e |grep $port!;
        foreach my $entryP ( @inputP )
        {
            my ($pid) = (split('\s+', $entryP))[1];
            print "killing process $pid ".
                  "idle for $idle.\n";
            #qx!kill -9 $pid!;
        }
    }
}


-
Craig Moynes
[EMAIL PROTECTED]



       

From: "Akens, Anthony" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Date: 06/11/02 01:13 PM
Subject: RE: Killing Idle Users
Reply-To: "Akens, Anthony"

Here's the shell script, for those interested.  Sorry for the
lack of comments, but it's not too hard to figure out with a
little patience...

---
PTS=`w -l | grep pts | cut -c10-15`
echo "" >> /home/danb/killemresults
RIGHTNOWDATE=`date`
echo $RIGHTNOWDATE >> /home/danb/killemresults
echo "Starting" >> /home/danb/killemresults
for i in $PTS
do
    #echo $i >> /home/danb/killemresults
    MINUTES=`w -l | grep -w $i | cut -c38-39`
    for j in $MINUTES
    do
        if (($j > 40))
        then
            PID=`ps -e | grep -w $i' ' | cut -c2-6`
            USERNAME=`w -l | grep -w $i`
            echo $USERNAME >> /home/danb/killemresults
            echo 'killing process id ' $PID ' on ' $i ' because minutes equal ' $MINUTES >> /home/danb/killemresults
            kill -9 $PID
        fi
    done
    MINUTES=`w -l | grep -w $i | cut -c37`
    for j in $MINUTES
    do
        if [ "$j" = ":" ]
        then
            PID=`ps -e | grep -w $i' ' | cut -c2-6`
            USERNAME=`w -l | grep -w $i`
            echo $USERNAME >> /home/danb/killemresults
            echo 'killing process id ' $PID ' on ' $i ' because minutes greater than 60 ' $MINUTES >> /home/danb/killemresults
            kill -9 $PID
        fi
    done
done

-

-Original Message-
From: [EMAIL PROTECTED]
Sent: Tuesday, June 11, 2002 12:15 PM
To: Akens, Anthony; [EM

RE: Killing Idle Users

2002-06-11 Thread Akens, Anthony

Here's the shell script, for those interested.  Sorry for the
lack of comments, but it's not too hard to figure out with a
little patience...

---
PTS=`w -l | grep pts | cut -c10-15`
echo "" >> /home/danb/killemresults
RIGHTNOWDATE=`date`
echo $RIGHTNOWDATE >> /home/danb/killemresults
echo "Starting" >> /home/danb/killemresults
for i in $PTS
do
    #echo $i >> /home/danb/killemresults
    MINUTES=`w -l | grep -w $i | cut -c38-39`
    for j in $MINUTES
    do
        if (($j > 40))
        then
            PID=`ps -e | grep -w $i' ' | cut -c2-6`
            USERNAME=`w -l | grep -w $i`
            echo $USERNAME >> /home/danb/killemresults
            echo 'killing process id ' $PID ' on ' $i ' because minutes equal ' $MINUTES >> /home/danb/killemresults
            kill -9 $PID
        fi
    done
    MINUTES=`w -l | grep -w $i | cut -c37`
    for j in $MINUTES
    do
        if [ "$j" = ":" ]
        then
            PID=`ps -e | grep -w $i' ' | cut -c2-6`
            USERNAME=`w -l | grep -w $i`
            echo $USERNAME >> /home/danb/killemresults
            echo 'killing process id ' $PID ' on ' $i ' because minutes greater than 60 ' $MINUTES >> /home/danb/killemresults
            kill -9 $PID
        fi
    done
done

-

-Original Message-
From: [EMAIL PROTECTED] 
Sent: Tuesday, June 11, 2002 12:15 PM
To: Akens, Anthony; [EMAIL PROTECTED]
Subject: RE: Killing Idle Users


I am not in a position to offer advice, but I would love to see it.




-Original Message-
From: Akens, Anthony [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, June 11, 2002 10:48 AM
To: [EMAIL PROTECTED]
Subject: Killing Idle Users


Hello,
I'm a sys-admin on an AIX (4.3) machine, and I'm trying to work 
with a vendor program that doesn't behave very nicely.  Basically,
if a user's connection to the server is inappropriately severed
the application keeps right on chugging, leaving the user 
logged in.  By inappropriately severed I mean if our link to the
site the user is at goes down, which happens more often than
I'd like, but that's a different problem.

Anyway, back to the question...
There are quite a few problems with these unconnected sessions 
being stuck out there, not least of which is that they eat up a 
user license, which are pretty limited in amount.  One good power 
outage or link failure can make it so we're out of licenses.
In that event the vendor says "Kill them by hand, do NOT automate 
the process" which is all fine and good except that we have 
several hundred users.  That's a lot of time.

The first thought was to use a shell idle logout time, but the
program doesn't respond to the request and remains logged in.

The previous administrator had made a shell script to check idle 
times and automatically log out the users, however since I've 
taken over the box the uptimes have been far greater (I don't 
believe in the same "reboot often" premise that he did) and now 
his script has shown a severe weakness: It's not written to 
handle longer PIDs.

Rather than trying to fix his shell script, which is rather
convoluted, I thought it would be a perfect candidate for perl, 
which I am just learning.  If I post the script, can I get some 
pointers on re-writing it?  Thanks for your time

Tony Akens
[EMAIL PROTECTED]





Killing Idle Users

2002-06-11 Thread Akens, Anthony

Hello,
I'm a sys-admin on an AIX (4.3) machine, and I'm trying to work 
with a vendor program that doesn't behave very nicely.  Basically,
if a user's connection to the server is inappropriately severed
the application keeps right on chugging, leaving the user 
logged in.  By inappropriately severed I mean if our link to the
site the user is at goes down, which happens more often than
I'd like, but that's a different problem.

Anyway, back to the question...
There are quite a few problems with these unconnected sessions 
being stuck out there, not least of which is that they eat up a 
user license, which are pretty limited in amount.  One good power 
outage or link failure can make it so we're out of licenses.
In that event the vendor says "Kill them by hand, do NOT automate 
the process" which is all fine and good except that we have 
several hundred users.  That's a lot of time.

The first thought was to use a shell idle logout time, but the
program doesn't respond to the request and remains logged in.

The previous administrator had made a shell script to check idle 
times and automatically log out the users, however since I've 
taken over the box the uptimes have been far greater (I don't 
believe in the same "reboot often" premise that he did) and now 
his script has shown a severe weakness: It's not written to 
handle longer PIDs.

Rather than trying to fix his shell script, which is rather
convoluted, I thought it would be a perfect candidate for perl, 
which I am just learning.  If I post the script, can I get some 
pointers on re-writing it?  Thanks for your time

Tony Akens
[EMAIL PROTECTED]





Outlook Contacts

2001-12-11 Thread Akens, Anthony

Is there a Module available that will let me read in a contacts list
from an outlook .pst file, or off of an exchange server, make changes to
it, and write it back?  The area code in my state is changing, and it
would be nice to be able to script this using a regex.  Thanks in
advance for any info.
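I don't know of a module that reads a .pst file directly, but on a Windows box with Outlook installed, the Win32::OLE module can drive Outlook itself through its object model. A rough, untested sketch (the area codes and the pattern are made-up placeholders; adjust to your local number format):

```perl
use strict;
use Win32::OLE;

# Attach to a running Outlook, or start one.
my $outlook = Win32::OLE->GetActiveObject('Outlook.Application')
           || Win32::OLE->new('Outlook.Application')
    or die "Can't start Outlook\n";

# 10 is the olFolderContacts constant in the Outlook object model.
my $contacts = $outlook->GetNamespace('MAPI')->GetDefaultFolder(10);

my $items = $contacts->Items;
for my $i (1 .. $items->Count) {
    my $c = $items->Item($i);
    for my $field (qw(BusinessTelephoneNumber HomeTelephoneNumber)) {
        my $num = $c->{$field};
        next unless defined $num;
        # Hypothetical change: old area code (555) becomes (444).
        if ($num =~ s/\(555\)/(444)/) {
            $c->{$field} = $num;
            $c->Save;
        }
    }
}
```

This only touches contacts in the default folder of the default profile; anything in a secondary .pst would need that store opened in Outlook first.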


Tony





New to Perl

2001-07-05 Thread Akens, Anthony

Hello,
I'm completely new to perl, but have a good deal of experience with c / c++.
I'm wondering if there are any good websites that will give me a basic
introduction to programming in perl (data structures, etc).  I've found
references to a few books, but would like to gather some basic info from the
net before spending the cash.  Thanks in advance for any info.

Tony Akens